Concept: Sound in Animation



MarkAH
03-22-2019, 11:33 AM
While on the surface it seems apparent that sound and animated images are complementary, the critical nature of the relationship is often overlooked.
Taking visual art into animation introduces the realm of time, without which, sound does not exist.
But with time the presence of sound is inevitable.
With the increasing emphasis placed on realism, the relationship becomes ever more intimate, and accurate correlation ever more important.
Sound effects and music are considered by some industry professionals to play a major, if not the most prominent, role in the success of animated productions.
Because powerful tools like Lightwave 3D are available to independent artists, the development of scenes, even full features, becomes viable for smaller working groups and individuals as well.
The roles classically divided among specialists can be more efficiently combined in these kinds of working groups.
Sound design tools have also become ever more accessible to individuals.
The combination of virtual analog and granular synthesis realizes a highly capable toolkit for the creation of sound effects.
And sampling synthesizers can be used to create beautifully realistic musical pieces.
MIDI composition makes it possible to perfectly synchronize a score with the action in an animation.
The resulting power of expression far surpasses the capability of even the most recent library-based 'fit to' apps.
The acquisition of skills with these tools by model designers and animators can be an investment that vastly increases the market value of the artist.

Bernie2Strokes
03-22-2019, 08:02 PM
I have asked here whether Lightwave's animations can be synchronized with sound files. I can't remember if someone mentioned a plugin that can animate a sound wave pattern. But I do support more development of sound features in Lightwave.

MarkAH
03-23-2019, 07:49 AM
The process works best with multiple tools.
Sound editing inside Lightwave is not likely to happen.
Planning is paramount.
Keeping track of the frames where key motion events occur is necessary.
For audio tracks in video editors, markers are used to position sound effect sections.
In MIDI scoring, the tempo setting is used to make the compositions for sections of animation synchronous with the rates of motion.
To correlate the composition with the animation, the animation can be designed to convey moods and emotions: lethargic, subdued, spirited, vivacious, passionate, agitated, brusque, furious, or frenetic.
Then the musical piece can be written to express the same things in synchrony.
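To make that tempo arithmetic concrete, here's a rough sketch (the 24 fps frame rate and 120 BPM tempo are just example values):

FPS = 24.0        # animation frame rate (example value)
BPM = 120.0       # tempo of the score (example value)

frames_per_beat = FPS * 60.0 / BPM      # 12 frames per beat here

def frame_for_beat(beat):
    """Frame number on which a given beat (0-based) falls."""
    return round(beat * frames_per_beat)

# e.g. a cymbal crash written on beat 16 of the score lands on frame 192
print(frame_for_beat(16))

Pick the tempo so the beats land on the frames where the key motion events occur, and the musical hits take care of themselves.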

In some cases it might be best for an animator to team with a model designer and sound designer.

MonroePoteet
03-23-2019, 10:14 AM
Yes, sound / music adds an IMMENSE amount of "character" to an animation, as with any type of movie, animated or not. I watch the "Special Features" on a variety of DVDs and Blu-ray discs, and if I'm lucky they include sound design, Foley work, and music / orchestration featurettes, which are fascinating and invaluable. As a very simple example, here's an ocean surf scene generated in LW2019 but with no sound:

144533 No Sound

It has a certain "CGI fakeness" to it, even disregarding the hard border of the square displaced geometry.

Now, add some sound:

144534 With surf sound

and the perceived realism goes up a notch or two, especially if such an animation is just used as a background plate.

Based upon my very amateur work in this realm, there are several basic approaches I've taken over the years:

- I write a piece of music on a MIDI synth and then try to build an animation to visualize it, using MIDI events to synchronize with the music
- Someone else composes music with no MIDI available and I try to visualize it, synchronizing with the music
- An animation already exists (either I created it or someone else did) and needs sound effects (e.g. Foley work), synchronizing with the visuals

Although there's an Audio Channel Modifier in the Graph Editor, it's really not very valuable, since it does no frequency analysis or other DSP (digital signal processing) of the audio. To make syncing with an audio file easier, or even automatic, when no MIDI stream is available, LW really needs at least a "band-pass filter" and a "limiter-expander" to allow specific frequencies or dB peaks (or a combination of both) in the audio to be separated out and attached as a modifier to specific LW channels.
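Until something like that exists in LW, the idea can at least be prototyped outside it. Here's a minimal sketch (assuming numpy and scipy are installed; the file name, 30 fps rate, and frequency band are placeholders): band-pass the audio, take its amplitude envelope, and boil it down to one driver value per animation frame that could then be keyed onto a channel.

import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

FPS = 30.0                      # animation frame rate (placeholder)
LOW_HZ, HIGH_HZ = 80.0, 250.0   # band to isolate, e.g. a low rumble

rate, data = wavfile.read("surf.wav")          # hypothetical file
if data.ndim > 1:                              # mix stereo down to mono
    data = data.mean(axis=1)
data = data.astype(np.float64)
data /= np.max(np.abs(data)) or 1.0            # normalize to -1..1

# 4th-order Butterworth band-pass, zero-phase filtered
sos = butter(4, [LOW_HZ, HIGH_HZ], btype="bandpass", fs=rate, output="sos")
band = sosfiltfilt(sos, data)

# crude envelope: rectify, then average over each frame's worth of samples
samples_per_frame = int(rate / FPS)
n_frames = len(band) // samples_per_frame
envelope = [np.abs(band[i * samples_per_frame:(i + 1) * samples_per_frame]).mean()
            for i in range(n_frames)]

for frame, value in enumerate(envelope):
    print(frame, value)        # one driver value per animation frame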

mTp

MarkAH
03-24-2019, 08:24 AM
Now there's some fun, music to animation!
Takes me way back, to the days when I built light shows driven by music.
Same principles you state, only with all-analog circuits.

It seems like there is no MIDI capability in LW.
I wrote a Python script that reads MIDI music files and animates a piano I modeled in Modeler.
The piano is fully rigged; everything that moves has bones: lids, pedals, all 88 keys.

You could take parts of this, using some as is, and adapt other parts to animate different things in a scene.
Just open the Python console to load the script and run it.
That is, it's not a plug-in.
But it could be extended to work with Genoma and other things.
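This isn't the Pianimate code itself, just a minimal sketch of the core idea, assuming the third-party 'mido' library: walk the MIDI file, collect the note-on events with their absolute times, and convert those times to frame numbers a script could keyframe.

import mido

FPS = 30.0                                  # scene frame rate (assumption)
mid = mido.MidiFile("song.mid")             # hypothetical file

events = []
elapsed = 0.0
for msg in mid:                             # iterating a MidiFile yields messages
    elapsed += msg.time                     # with delta times already in seconds
    if msg.type == "note_on" and msg.velocity > 0:
        frame = int(round(elapsed * FPS))
        events.append((frame, msg.note, msg.velocity))

for frame, note, velocity in events:
    print(frame, note, velocity)            # e.g. feed these to a keyframing routine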

MarkAH
03-24-2019, 06:36 PM
If you'd like to play with the Python as-is, here's the piano.
MIDI Pianimate_v2 really is a Python file. Just change the extension to .py

MonroePoteet
03-27-2019, 08:21 AM
Finally got a chance to download the Python script and the sample scene / object. Nice piano model! Unfortunately, the MIDI files I have use multiple Tracks and Channels, so the "default" MIDI channel used by the script shows the piano playing WAY DOWN in the bass, but it's still VERY cool!

Nice job!

I did a similar thing back in 2005, writing a C++ plug-in to parse the MIDI file, with a number of loadable DLLs to operate different types of instruments: a keyboard (in my case a pipe organ), hammer / mallet based instruments (drums, marimba), a simplistic guitar strum and chording model, and a clarinet fingering model (neither being real-world accurate!).

Here's an extract from a REALLY cheesy animation I did back then called "Band Dance" using that plug-in to automatically operate all the instruments (except world movement of the guitar and clarinet):

144588

My plug-in was very dependent on the loadable DLLs, so it couldn't be generalized. In my "spare time" I've been SLOWLY working on trying to transform it into a generic ChannelHandler and NodeHandler plug-in, but Real Life got in the way.

mTp

MarkAH
03-27-2019, 09:12 AM
LOL, great fun! I like it. Reminds me of the calliopes that were once on the boardwalks.
Multi-track, no doubt.
I've got a bunch of instruments modeled, pretty realistic. Next will be a soprano sax.
As accurate as I can get it. Major challenge. The works on wind instruments are very complex.

Yes, the Pianimate script wants a piano track. It's just a toy but was fun to make.

Gonna have to look into the NodeHandler. Don't know what that means exactly.
Nodes in LW?

I posted in the Python Forum section. Ideas in progress.
Fortunately, this stuff IS my real life.

MonroePoteet
03-27-2019, 10:26 AM
Excellent project modeling musical instruments realistically! Yes, the clarinet model I made for the animation was very exacting, and after a while I gave up trying to make it TOO accurate.

Even if the script you posted was "just a toy", it covers the fundamentals of understanding the structure of the MIDI file, data representation, etc. I'm trying to build a general-purpose plugin which handles not just the KEYON / KEYOFF events but pitch bend, aftertouch, other controller data, multiple MIDI channels, etc., and allows the user to specify the MIDI data they want to use in LW.
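For illustration only (this is a hedged Python sketch using the third-party 'mido' library, not my C++ plug-in), pulling one kind of event off one channel looks something like this:

import mido

TARGET_CHANNEL = 3          # MIDI channels are 0-15 in mido (user-chosen, an assumption)
TARGET_TYPE = "pitchwheel"  # the event type to extract

mid = mido.MidiFile("song.mid")             # hypothetical file
curve = []
elapsed = 0.0
for msg in mid:
    elapsed += msg.time                     # absolute time in seconds
    if msg.type == TARGET_TYPE and getattr(msg, "channel", None) == TARGET_CHANNEL:
        curve.append((elapsed, msg.pitch))  # 'pitch' is the bend amount, -8192..8191

print(curve[:10])                           # (time, value) pairs to map onto an LW channel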

In V9 of LW, a Surfaces node editor was added, and various others have been added since (e.g. Nodal Motion, Nodal Displacement, nodal Edges control, etc.). Node (and Channel) handlers in LW provide the user the ability to control LW item channels (e.g. X,Y,Z position, H,P,B rotation, X,Y,Z scale) or colors, etc. using scalars, vectors, etc. A node handler is a "Class" of plugins available for implementation in LW.

Plugin development is relatively complex, but if you're interested, here's the top-level reference for plugins developed in Python for LW2018:

http://static.lightwave3d.com/sdk/2018/python/anatomy.html

I program in C++, so it's a different set of SDK (Software Development Kit) documentation than that for Python, but the Python document recommends understanding the general plugin architecture. The LW SDK is available for download here:

https://www.lightwave3d.com/lightwave_sdk/


mTp

MarkAH
03-28-2019, 05:43 AM
OK, the LW Nodes. The Python SDK for nodes looks pretty raw.
I have all the Python SDK, and the C type (lwsdk.**) is also needed to complete the functionality.
The Python docs aren't really clear on that but I figured it out.
You can see some of that in the Pianimate script.
'import lwsdk' at the top, and the BuildAnim() class, where the lwsdk.LWEnvelopeFuncs() and lwsdk.LWChannelInfo() classes are instantiated.
Without those, the script could not set keyframes on the bones.

Whenever I get a good solid concept for a widely applicable plug-in I'll give it a go.
That stuff is posted in the Python thread.

The script for the piano is so specialized it wouldn't make a sensible plug-in.
Working on another right now for a second demonstration of playable instruments.

MarkAH
03-28-2019, 02:34 PM
Here is another playable instrument demo.
This odd folk instrument, called the Hurdy Gurdy, comes with an animation script and a MIDI file for the script to load.
An mp3 sound file of the MIDI song is included, to add to the Scene Editor.
It has drone strings and melody strings that are played by keys.
When melody notes are played (Cn3 to En5) Mr. Hurdy presses the keys.
When drone notes are played (Cn2, Dn2, Gn2 and An2) Mr. Gurdy turns the crank.
All other notes are ignored. Track and channel data are ignored.
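The note-routing rule boils down to something like this (the note numbers are an assumption, since octave numbering conventions differ; here C3 = 48):

MELODY_RANGE = range(48, 77)          # C3 .. E5 inclusive, assuming C3 = 48
DRONE_NOTES = {36, 38, 43, 45}        # C2, D2, G2, A2 under the same assumption

def route_note(note):
    """Return which part of the rig a MIDI note number should animate."""
    if note in DRONE_NOTES:
        return "crank"                # Mr. Gurdy turns the crank
    if note in MELODY_RANGE:
        return "key"                  # Mr. Hurdy presses a key
    return None                       # everything else is ignored

print(route_note(50), route_note(43), route_note(20))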
Modeling, Graphics, Scripting, Sound Design and Music by MarkAH

144614

TheLexx
03-28-2019, 02:55 PM
If I understand the thread correctly, I believe the LW plugin SoundWave (http://walen.se/soundwave.html) may be of interest (I haven't used it personally). It seems related to auto-lipsync software but with different frequency considerations. :)

MarkAH
03-30-2019, 10:01 AM
That was a pretty cool idea.
But not as useful now.
Layout supports only a single audio file in a scene, which isn't really usable as a 'track'.
Video editing is essentially nonexistent, so the production of a good-quality animation will, for practical purposes, require a video editor.
Good video editors can support still-image sequences in multiple individual tracks, and multiple audio tracks.
This is essential for building quality animations.
The best way to render an animated scene in Layout is by frame sequence.
This offers the maximum possible options in final formats.
Multiple sound effect clips could be required for each scene.
Narration and musical backgrounds might be added.
There are now many DSP plug-ins available for the video editors with parameters that can be key-framed to simulate motion, ambience, and masking effects.
For example, the motion-related Doppler effect can be created by key-framing a frequency-shift effect.
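As a rough sketch (the names and numbers are just illustrative), the per-frame distances from the sound source to the camera can be turned into a pitch-shift ratio to keyframe on such an effect:

SPEED_OF_SOUND = 343.0    # m/s in air
FPS = 24.0                # animation frame rate (example value)

def doppler_ratios(distances):
    """distances: source-to-listener distance in meters, one entry per frame."""
    ratios = []
    for i in range(1, len(distances)):
        radial_velocity = (distances[i] - distances[i - 1]) * FPS  # m/s, positive = receding
        # stationary listener, moving source: f_heard = f_source * c / (c + v)
        ratios.append(SPEED_OF_SOUND / (SPEED_OF_SOUND + radial_velocity))
    return ratios

# a source closing at about one meter per frame gets pitched up a little each frame
print(doppler_ratios([60.0, 59.0, 58.0, 57.0]))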
Creative animated effects for video and audio are also available during video editing.
And the really happy thing is that they aren't too costly.