This research aimed to develop a software framework for studying and optimizing mapping strategies for complex data presented on auditory displays with high spatial resolution, such as wave-field synthesis and higher-order Ambisonics systems. Our wave-field synthesis system, the Collaborative-Research Augmented Immersive Virtual Environment Laboratory (CRAIVE-Lab), has an array of 128 full-range loudspeakers along the circumference of the lab. We decided to use available software music synthesizers because they are built for excellent sound quality, and substantial knowledge exists on how to program analog synthesizers and other common synthesis methods to achieve a desired sound output. At the scale of 128 channels, feeding 128 synthesizers with complex data was not practical for our initial explorations because of the required computational resources and the complexity of the data-flow infrastructure. The proposed framework was programmed in Matlab, using weather data from the NOAA database for initial exploration. Data from 128 weather stations across the US are processed from east to west, spatially aligned with longitude. A MIDI script, in sequential order for all 128 channels, is compiled from converted weather parameters such as temperature, precipitation amount, humidity, and wind speed. The MIDI file is then imported into Reaper to render a single sound file, using software synthesizers driven by the control-data instructions in the MIDI file. The rendered file is automatically cut into the 128 channels in Matlab and reimported into Reaper for audio playback.
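The core of the pipeline is the data-to-MIDI mapping stage: each weather station is assigned to one of the 128 channels (ordered east to west), and its weather parameters are converted into MIDI note data. The original framework is implemented in Matlab; the following is only an illustrative Python sketch of this idea. The specific mappings (temperature to pitch, wind speed to velocity), the value ranges, and the station data are assumptions for the example, not taken from the framework itself.

```python
def linmap(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamped."""
    value = max(lo, min(hi, value))
    return out_lo + (value - lo) * (out_hi - out_lo) / (hi - lo)

def station_to_midi_event(channel, temperature_c, wind_speed_ms, tick):
    """Map one station's reading to a note-on event for its channel.

    Assumed (hypothetical) mappings:
      temperature_c: -20..40 C  -> MIDI pitch 36..96
      wind_speed_ms:   0..30 m/s -> MIDI velocity 20..127
    """
    pitch = round(linmap(temperature_c, -20.0, 40.0, 36, 96))
    velocity = round(linmap(wind_speed_ms, 0.0, 30.0, 20, 127))
    return {"tick": tick, "channel": channel, "pitch": pitch, "velocity": velocity}

# Toy station list: (longitude, temperature C, wind speed m/s).
# Sorting by descending longitude orders stations east to west, so the
# channel index follows the loudspeaker position along the circumference.
stations = [(-71.0, 12.0, 5.0), (-122.3, 15.0, 2.0), (-87.6, 8.0, 10.0)]
ordered = sorted(stations, key=lambda s: s[0], reverse=True)
events = [station_to_midi_event(ch, t, w, tick=ch * 480)
          for ch, (_lon, t, w) in enumerate(ordered)]
```

In the full framework, events like these would be written sequentially for all 128 channels into a standard MIDI file, which Reaper then renders through the software synthesizers.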