I have contributed to two papers presented at the Sound and Music Computing Conference, currently taking place in Copenhagen. Today I learned that both papers are among the ten nominated for the conference's best paper award.
SpatDIF: Principles, specification, and examples
by Nils Peters, Jan Schacher & Trond Lossius
SpatDIF, the Spatial Sound Description Interchange Format, is an ongoing collaborative effort offering a semantic and syntactic specification for storing and transmitting spatial audio scene descriptions. The SpatDIF core is a lightweight minimal solution providing the most essential set of descriptors for spatial sound scenes. Additional descriptors are introduced as extensions, expanding the namespace and scope with respect to authoring, scene description, rendering and reproduction of spatial audio. A general overview of the specification is provided, and two use cases are discussed, exemplifying SpatDIF’s potential for file-based pieces as well as real-time streaming of spatial audio information.
An Automated Testing Suite for Computer Music Environments
by Nils Peters, Trond Lossius & Tim Place
Software development benefits from systematic testing with respect to implementation, optimization, and maintenance. Automated testing makes it easy to execute a large number of tests efficiently on a regular basis, leading to faster development and more reliable software. Systematic testing is not widely adopted within the computer music community, where software patches tend to be continuously modified and optimized over the course of a project. Consequently, bugs are often discovered during rehearsal or performance, resulting in literal “show stoppers”. This paper presents a testing environment for computer music systems, first developed for the Jamoma framework and Max. The testing environment works with Max 5 and 6, is independent of any third-party objects, and can be used with non-Jamoma patches as well.
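To make the idea of automated, assertion-based testing concrete, here is a minimal sketch in Python. This only illustrates the general principle; it is not the Max/Jamoma testing environment the paper describes, and `linear_ramp` is a hypothetical function under test.

```python
# Minimal sketch of automated testing: a function under test and an
# assertion-based test that can be run automatically on every build.
# `linear_ramp` is a hypothetical example, not part of any real toolkit.

def linear_ramp(start, end, steps):
    """Return `steps` evenly spaced values from start to end."""
    if steps < 2:
        return [end]
    step = (end - start) / (steps - 1)
    return [start + i * step for i in range(steps)]

def test_linear_ramp():
    ramp = linear_ramp(0.0, 1.0, 5)
    assert ramp[0] == 0.0, "ramp must start at the start value"
    assert ramp[-1] == 1.0, "ramp must end at the target value"
    assert len(ramp) == 5, "ramp must have the requested number of steps"

# Running tests like this on a regular basis catches regressions early,
# instead of discovering them in rehearsal or performance.
test_linear_ramp()
```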
Here’s a sneak peek at a new feature that I am currently working on for Jamoma: Support for dataspaces.
In this example the parameter myPos is used to keep track of the position of a point in space. The parameter uses the position dataspace. By default the position is described using Cartesian coordinates (xyz), but spherical coordinates (aed) can be used as an alternative way of describing position.
When ramping, the ramp happens according to the dataspace used to describe the target value. This way the point can move along straight lines as well as perform, for example, spiraling movements.
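To show why the choice of dataspace changes the trajectory, here is a rough Python sketch (my own illustration, not Jamoma code): the same linear ramp is applied once to Cartesian coordinates and once to spherical azimuth/elevation/distance coordinates. The angle convention used here is an assumption for the example.

```python
import math

# Illustrative sketch, not Jamoma code: the same linear ramp applied in
# Cartesian (xyz) vs. spherical (azimuth/elevation/distance) coordinates.
# Angle convention (an assumption): azimuth 0 = front, in degrees.

def aed_to_xyz(azimuth, elevation, distance):
    a, e = math.radians(azimuth), math.radians(elevation)
    return (distance * math.cos(e) * math.sin(a),
            distance * math.cos(e) * math.cos(a),
            distance * math.sin(e))

def ramp(start, target, t):
    """Linear interpolation of a coordinate triple, t in [0, 1]."""
    return tuple(s + (d - s) * t for s, d in zip(start, target))

# Ramping in xyz moves the point along a straight line. Ramping in aed
# and converting afterwards keeps the distance constant, so the point
# sweeps an arc around the origin instead:
for i in range(5):
    t = i / 4
    print(aed_to_xyz(*ramp((0, 0, 1), (90, 0, 1), t)))
```

Combining a changing azimuth with a changing distance in the same ramp is what produces the spiraling movements mentioned above.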
I’m currently in the process of testing to ensure that I have everything working, and I expect this feature to be included with the next installer.
ml.* is a machine learning toolkit for Max 5+ by Benjamin Day Smith, and apparently the only implementation of Adaptive Resonance Theory, Self-Organizing Maps, and Spatial Encoding neural nets for Max. It is currently Mac OS X only, with Windows builds forthcoming. ml.* was presented at NIME 2012, and the paper should become available when the proceedings go online.
I was recently asked for suggestions on how to create slow interpolations between noisy matrices in Jitter. The problem when interpolating between noisy matrices is that the result tends to be grayed out, with less contrast and fewer extreme pixel values than the two matrices we start out with.
In this patch I apply some statistics to compensate for the lack of contrast in the interpolated matrices. All processing is done on matrices of @type float32. First I find the mean value of the matrix and shift it to 0.0. Next I calculate the standard deviation, which is used to normalize the matrix to a standard deviation of 0.28871. From some empirical measurements of a large noisy matrix, that seemed to be a common value for the standard deviation. Finally I raise the mean value of the matrix back to 0.5.
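The same compensation can be sketched outside Jitter in a few lines of plain Python (my own illustration; the function name and the flat list of pixel values standing in for a float32 matrix are assumptions):

```python
import random
from statistics import mean, pstdev

TARGET_STD = 0.28871  # empirical std dev of a large noisy matrix, as above

def normalize_contrast(pixels, target_std=TARGET_STD):
    """Shift the mean to 0.0, rescale to the target std dev, shift back to 0.5."""
    m = mean(pixels)
    centered = [p - m for p in pixels]
    std = pstdev(centered)
    scale = target_std / std if std > 0 else 1.0
    return [p * scale + 0.5 for p in centered]

# Interpolating two noise "frames" halfway grays the result out;
# re-normalizing restores the original amount of contrast:
a = [random.random() for _ in range(10000)]
b = [random.random() for _ in range(10000)]
mix = [(x + y) / 2 for x, y in zip(a, b)]
restored = normalize_contrast(mix)
```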
The patch can be downloaded here.
Cleaned-up versions of the Max/Jamoma patches used for the workshops at Art.On.Wires this week are now available for download.