Midi plugin for Lightwave 64 bit?



stevecullum
03-18-2015, 07:00 AM
I recently uploaded a track to SoundCloud (link below if you want to listen) and thought it could be cool to use Lightwave for some animations and effects to go with it. Ideally I'd want to use the MIDI files from the track to drive animation channels etc... Does anyone know of a 64-bit plugin that would do this?

TRACK LINK (https://soundcloud.com/steve_c_02/awakening?utm_source=soundcloud&utm_campaign=share&utm_medium=email)

Thanks!

ernpchan
03-18-2015, 01:51 PM
There's the Audio Channel Modifier in the Graph Editor. Not sure that's exactly what you're looking for.

stoecklem
03-18-2015, 03:50 PM
I've posted this feature request numerous times. MIDI works great for controlling 3D, but no luck so far.

stevecullum
03-18-2015, 06:11 PM
The audio plugin isn't going to give me what I'm after, unfortunately. Might have another look at the Python SDK. There is a MIDI Python library, so maybe I can use that to help construct something. I'm thinking the note velocity information could be used as some kind of multiplier in a nodal network.
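
Something like this rough sketch is what I have in mind, assuming the third-party mido library as the "MIDI Python library" (untested, and the file name is made up):

# Rough sketch: pull note-on velocities out of a .mid file and normalize them
# so they could feed a multiplier input in a nodal network.
# Assumes the third-party "mido" library (pip install mido); untested.
import mido

def note_velocities(path):
    """Return a list of (time_in_seconds, note, normalized_velocity) tuples."""
    events = []
    t = 0.0
    for msg in mido.MidiFile(path):
        t += msg.time  # mido yields delta time in seconds when iterating a MidiFile
        # A note_on with velocity 0 is really a note_off, so skip those.
        if msg.type == 'note_on' and msg.velocity > 0:
            events.append((t, msg.note, msg.velocity / 127.0))
    return events

if __name__ == '__main__':
    for t, note, vel in note_velocities('awakening.mid'):  # made-up file name
        print('%8.3fs  note %3d  multiplier %.3f' % (t, note, vel))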

MonroePoteet
03-18-2015, 09:03 PM
I wrote a MIDI plugin years ago, probably LW 5.5. I'll see if I can get it modified for the newer LW SDK and compiled 64-bit. I only have a Windows 64-bit compiler, though.

mTp

stevecullum
03-19-2015, 02:01 PM
Hey that would be great! 64bit is all I have installed, so works for me :)

MonroePoteet
03-19-2015, 03:50 PM
I pulled my old source code from backups to evaluate the project, and it'll be a while before it's ready. I'd forgotten what I did back then, and I had the MIDI driver functionality tightly integrated with the modeled instruments (simplistic pipe organ, marimba, guitar, drum set and clarinet), so I'll need to extract the basic MIDI code, modify it to use the newer SDK, clean up the documentation, etc. I have another project for the next week or two, so don't hold your breath! Hopefully, I'll get it done in a few weeks, but I can't promise.

mTp

stevecullum
03-19-2015, 04:22 PM
No worries - I'm still working on a new track anyhow and then I will have to think about visual design and rough out some concepts. Will be a while before I'm ready to use it for production ;)

Glad you can salvage your work in any case!

wesleycorgi
03-19-2015, 07:12 PM
This may be a dead end, but check this out: http://interialabs.de/lw/lscript/midi2lw.html

roboman
03-19-2015, 09:15 PM
http://www.sharecg.com/v/32424/browse/11/Poser/1890-Melodeon-church-organ

I took a quick look around and couldn't find the one that I did, but the above is what I used as a base for it.

stevecullum
03-21-2015, 04:50 AM
Thanks for the links guys, I'll check those out too

cyclopse
01-12-2017, 02:27 AM
*BUMP

I'm looking for a MIDI plugin to control an animation (not for a control surface... making a music video)

jwiede
01-12-2017, 06:33 PM
It shouldn't be that difficult to interface MIDI using LWSDK "HID handler" type of plugin, though I guess it would depend on precisely how you want MIDI events to drive things. Can you describe a bit better how you want specific events to translate into LW actions? As in, how should different devices' note on/offs (+ velocity), aftertouch, pitchwheel, etc., translate to anim channels?

Some mappings are pretty obvious, like pitch_wheel_change mapping to a (normalized) value axis, but not as much so for others. For example, you could interpret notes and their velocity as two distinct value axes (single 2D representation), or each note could select a channel with velocity as the channel value (many 1D representations). Making event mappings "broadly, dynamically reconfigurable" would require a lot of extra UI and dev work in the plugin, so for something quick it'd be good to limit reconfigurability of mappings as much as possible (yet remaining useful).
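
To make the two interpretations concrete (pure Python sketch; the NoteEvent stand-in is made up):

from collections import namedtuple

# Hypothetical stand-in for a parsed MIDI note event (note/velocity are 0-127).
NoteEvent = namedtuple('NoteEvent', 'note velocity')

def as_single_2d_value(event):
    """One 2D representation: note number and velocity become two value axes."""
    return (event.note / 127.0, event.velocity / 127.0)

def as_per_note_channels(event, channels):
    """Many 1D representations: the note selects a channel, velocity is its value."""
    channels[event.note] = event.velocity / 127.0
    return channels

print(as_single_2d_value(NoteEvent(60, 100)))        # (0.472..., 0.787...)
print(as_per_note_channels(NoteEvent(60, 100), {}))  # {60: 0.787...}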

MonroePoteet
01-12-2017, 07:08 PM
The implementation of a full, general purpose MIDI plug-in is problematic. I wrote a MIDI plug-in for LW many years ago (as a previous, embarrassingly old post in this thread indicates), but it was very carefully tuned to the animation that I was making. As I remember, it basically looked for Note On and Note Off events in the MIDI stream, and then I had an "Instrument" plug-in that did anticipation, draw-back, damping, etc. No effort was made to process aftertouch, pitch wheel, etc.

Can you be a little more specific about what you'd like a MIDI plug-in to do?

mTp

jwiede
01-12-2017, 07:43 PM
Monroe, that was before the HID stuff existed though, right? I'm guessing back then you had to convert events to channel changes directly?

It should be much easier now, because the HID stuff lets you expose different "fields" the LW user can then arbitrarily map to different animation channels. The only significant issue remaining is how to map events to different fields, per what I described before. LW's HID stuff automatically handles taking streams of field change events and converting them into animation channel changes (and possibly time change events as well; I hadn't dug that far).

See HID "clock" example in C/C++ LWSDK for LW2015.3.

MonroePoteet
01-13-2017, 01:09 PM
Yes, it was even pre-V9, so I'll need to convert it to the post-V9 SDK architecture (minor changes). Thanks for the info on the HID class of plugins. My brief perusal of the SDK documentation seems to indicate that class of plugin would be used to interface to a *live* MIDI device (configured through Device Manager), while my plugin reads MIDI files which have been previously saved to disk. I'm thinking it'd be a ChannelHandler.
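
Conceptually, the channel evaluation would boil down to something like this (Python-style sketch only, not the actual ChannelHandler code; the decay constant is arbitrary):

import math

# Conceptual sketch of what a "MIDI file -> channel" evaluation would do: given
# note-on events parsed from the .mid file, return a channel value at an
# arbitrary evaluation time. The decay constant is arbitrary.
def channel_value(events, t, decay=4.0):
    """events: list of (time_seconds, normalized_velocity); t: evaluation time."""
    value = 0.0
    for start, velocity in events:
        if start <= t:
            # Each note contributes its velocity, fading exponentially after the hit.
            value += velocity * math.exp(-decay * (t - start))
    return value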

mTp

Spinland
01-13-2017, 06:34 PM
Wow, this is fascinating stuff. As a hobbyist musician I work extensively with MIDI technology and a way to integrate my music into animation that way (as opposed to my current by-hand work flow) would be killer fun. Rock on!

jwiede
01-13-2017, 09:00 PM
My brief perusal of the SDK documentation seems to indicate that class of plugin would be used to interface to a *live* MIDI device (configured through Device Manager), while my plugin reads MIDI files which have been previously saved to disk.

Yep, I was expecting to respond to live MIDI events (as device/devices) based on cyclopse's original request, though reading it again I guess it could go either way.

MonroePoteet
01-14-2017, 11:38 AM
I always used saved .mid files, since I "compose" fairly haphazardly on the MIDI workstation and (sadly) hardly ever go back and learn to play the tracks live. As well, setting up the animation is usually a multi-pass situation, so it's nice to have the same playback each time.

mTp

jwiede
01-14-2017, 12:44 PM
I always used saved .mid files, since I "compose" fairly haphazardly on the MIDI workstation and (sadly) hardly ever go back and learn to play the tracks live. As well, setting up the animation is usually a multi-pass situation, so it's nice to have the same playback each time.

Sure, but if the computer is playing the MIDI file (as in sending events out to the bus), there's no reason that same computer/driver can't sit and listen for "live"/playback events as a device on the same MIDI bus (either as input or pass-thru) -- at least, that's the way I was thinking of it. That approach seems to work in both the "live" case AND the "playing events from file" case, without having to differentiate in the receive code between "live" received events and being handed a .mid file.
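
On the receiving side it would be something roughly like this (sketch using the third-party mido library; the port name is made up) -- the receiver can't tell whether a sequencer is playing back a file onto the bus or someone is playing live:

# Sketch: listen on a MIDI input port; the events look the same whether a
# sequencer is playing back a .mid file onto the bus or someone plays live.
# Assumes the third-party "mido" library; the port name is made up.
import mido

with mido.open_input('Some MIDI Port') as port:
    for msg in port:                       # blocks until the next event arrives
        if msg.type in ('note_on', 'note_off'):
            print(msg)                     # hand off to the mapping code here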

MonroePoteet
01-14-2017, 01:24 PM
That's a good idea, but I think it would only work for a system that has a MIDI bus installed that's accessible to HID. I tend to compartmentalize my systems, with my MIDI / acoustic system (with MIDI-to-USB converter) out by the piano / synthesizer / recording setup, while I do LW "in back" on systems which have no MIDI bus accessible to HID. Worth looking into, though.

mTp

jwiede
01-14-2017, 01:40 PM
*BUMP

I'm looking for a MIDI plugin to control an animation (not for a control surface... making a music video)

Can you please clarify whether you want LW to respond to "live" MIDI events, or solely drive animation from a .mid file as input?

jwiede
01-14-2017, 01:58 PM
That's a good idea, but I think it would only work for a system that has a MIDI bus installed that's accessible to HID. I tend to compartmentalize my systems, with my MIDI / acoustic system (with MIDI-to-USB converter) out by the piano / synthesizer / recording setup, while I do LW "in back" on systems which have no MIDI bus accessible to HID. Worth looking into, though.

How is driving Layout animation using a canned file of MIDI events more advantageous/efficient than driving Layout animation channels using a scene file, though? If there's no MIDI interface in the machine, it's not like the MIDI file playback can synchronize with or drive other devices.

BTW, I'm not saying that usage scenario isn't important, I'm saying I don't understand _why_ that usage scenario is important, and would like to understand better why it is.

MonroePoteet
01-14-2017, 04:45 PM
Once the composition is finalized, I record the audio output of the synthesizer to a .wav or .mp3 file, capture the .mid file, and move them both onto the LW system. So, no MIDI sound generators controlled by the .mid file are required on my LW system. My old plugin then used the events in the .mid file to automatically synchronize the motion of the LW objects to the audio soundtrack based on the event stream in the .mid file.
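
The heart of that sync step is just converting event times to frame numbers, something along these lines (rough sketch using the third-party mido library, not the old plugin's code; the fps value and file name are placeholders):

# Sketch: turn note-on times from a .mid file into LW frame numbers so keys can
# be placed against the rendered audio track. Assumes "mido"; fps is whatever
# the scene uses.
import mido

def note_frames(path, fps=30.0):
    keys = []
    t = 0.0
    for msg in mido.MidiFile(path):
        t += msg.time                      # delta seconds between events
        if msg.type == 'note_on' and msg.velocity > 0:
            keys.append((int(round(t * fps)), msg.note, msg.velocity / 127.0))
    return keys                            # (frame, note, normalized velocity)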

mTp

jwiede
01-14-2017, 06:49 PM
Once the composition is finalized, I record the audio output of the synthesizer to a .wav or .mp3 file, capture the .mid file, and move them both onto the LW system. So, no MIDI sound generators controlled by the .mid file are required on my LW system. My old plugin then used the events in the .mid file to automatically synchronize the motion of the LW objects to the audio soundtrack based on the event stream in the .mid file.

Okay, fair enough. In that case, are you using a distinct device/channel of MIDI to contain separate LW-intended anim events, or are you just triggering anims using the regular audio events?

cyclopse
01-16-2017, 11:33 PM
Can you please clarify whether you want LW to respond to "live" MIDI events, or solely drive animation from a .mid file as input?

I'd like recorded MIDI tracks. Live is cool, but live compiling isn't practical unless you're using Unity or Unreal (some videogame engine). I'd like it for a music video (as in having drums kick off on MIDI note on/off information).

lardbros
01-17-2017, 10:54 AM
Not sure if anyone posted this one?

Only 32-bit, but think he's released the source code if anyone wants to re-code and compile??

http://www.tmproductions.com/programs/midichannel/

jwiede
01-17-2017, 11:47 AM
Not sure if anyone posted this one?

Only 32-bit, but think he's released the source code if anyone wants to re-code and compile??

http://www.tmproductions.com/programs/midichannel/

Another good reference, thanks!

stoecklem
01-17-2017, 12:50 PM
It would be great to have a MIDI plugin, and it would be much appreciated. Loading a file would be nice, but I think real time is a must. Virtual Studio and MIDI were meant for each other, and I'm not sure why NewTek didn't just add it themselves.

It is pretty cool to have a virtual lighting controller, puppeteering, etc. XSI has had this forever, and obviously MotionBuilder. It may not be the most practical way to get things done, but it is fun, and there is a great variety of MIDI devices out there. The iPad app Lemur, with built-in physics, comes to mind as an interesting one to play with.

jwiede
01-17-2017, 01:42 PM
It would be great to have a MIDI plugin, and it would be much appreciated. Loading a file would be nice, but I think real time is a must. Virtual Studio and MIDI were meant for each other, and I'm not sure why NewTek didn't just add it themselves.

Agreed. They were already implementing other HID plugins, it would have been so easy for them to add live MIDI support as well.

tischbein3
01-17-2017, 02:36 PM
Only 32-bit, but think he's released the source code if anyone wants to re-code and compile??


The code depends on some outdated binary stream code. It might take a while; I'll have to take a deeper look into this, so I can't promise anything.

lardbros
01-18-2017, 06:13 AM
The code depends on some outdated binary stream code. It might take a while; I'll have to take a deeper look into this, so I can't promise anything.

Sounds horrific! Probably easier to code something from scratch, rather than try to figure out what someone else has done before?!

tischbein3
01-18-2017, 08:08 AM
Mhmm, yes/no:
As far as I can see, you have to replace all the bifstream read commands with their ifstream counterparts plus an endian conversion for each command. The other parts look more or less OK.
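
For what it's worth, the endian handling itself is simple; in Python terms each converted read is just a big-endian struct unpack. This is only an illustration of what the MIDI header reads have to do, not a patch for the C++ code (the file name is made up):

# MIDI files are big-endian throughout, so every multi-byte read needs a
# big-endian decode. Illustration only; Python here rather than the C++ in the repo.
import struct

with open('song.mid', 'rb') as f:
    chunk_id, length = struct.unpack('>4sI', f.read(8))      # b'MThd', 6
    fmt, ntracks, division = struct.unpack('>HHH', f.read(6))
    print(chunk_id, length, fmt, ntracks, division)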

jwiede
01-21-2017, 03:49 PM
Mhmm, yes/no:
As far as I can see, you have to replace all the bifstream read commands with their ifstream counterparts plus an endian conversion for each command. The other parts look more or less OK.

So which code are you attempting to build, and to solve which stated problem? Just trying to avoid duplication of effort.

erikals
01-21-2017, 10:26 PM
*BUMP
I'm looking for a MIDI plugin to control an animation (not for a control surface... making a music video)
you got Walen's USB Game Device
file - http://walen.se

info - http://gamedevice.walen.se
never tried it though...

tischbein3
01-22-2017, 01:57 AM
So which code are you attempting to build, and to solve which stated problem? Just trying to avoid duplication of effort.

Well, try it. I do have some other stuff which needs to be done first, so I might need a week before I can fully concentrate on this.

I'm trying to convert the tmpro plugin:
https://github.com/jangellx/TMProLWPlugIns/tree/master/Workspaces/Lightwave/MIDIChannel

which relies on this lib:
https://github.com/jangellx/TMProLWPlugIns/tree/master/Projects/Portable/bfstream

Which does have big-endian conversion built-in (needed for MIDI reading), but it relies on an outdated fstream.h.
(I do recommend you download the whole tmpro plugin repository, since there are some other dependencies, most notably his own About window.)

So you either have to find a newer bfstream library / update it, or rewrite all the read operations in the plugin with an additional endian conversion.
As for the latter, I do think it's the fastest way to do it.

hope this helps
chris

MonroePoteet
01-22-2017, 03:55 PM
FYI, here's the webpage on M$'s website describing the changes to the iostream classes which are at issue here:


https://msdn.microsoft.com/en-us/library/aa984818(v=vs.71).aspx

It's truly frustrating how little commitment M$ has to "backward compatibility". Seems like every time I turn around, they're not just *deprecating* functionality (i.e. stop supporting it, but leave it available), but actually *removing* old functionality. It'd be easy if you could just set a flag in Visual Studio saying "use old iostream libraries", but as far as I can tell, that isn't an option. Gosh, thanks.

There's also the article:


https://msdn.microsoft.com/en-us/library/aa984818(v=vs.71).aspx

but it appears that the required .h files are no longer available in 64-bit Visual Studio 2013, so it's not an option. *MAYBE* this technique would work by copying the old <ifstream.h>, <ofstream.h> etc. files from a prior Developer Studio installation, linking to the specified .LIB files, crossing your fingers, and hoping it works!

Also just FYI (no promises), I've abandoned trying to "modernize" my old MIDI plugin (same issues and a few more), have extracted the key code (parsing the MIDI file), and am making reasonably good progress (when I have time) on a fresh implementation of a ChannelHandler.
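
For reference, the fiddliest bit of that parsing is the variable-length delta times; the decode boils down to this (Python pseudo-code of the standard MIDI variable-length quantity, not the plugin code itself):

# Standard MIDI variable-length quantity: 7 bits per byte, high bit set on all
# bytes except the last. Sketch only.
def read_vlq(data, pos):
    """Return (value, new_pos) for the VLQ starting at data[pos]."""
    value = 0
    while True:
        byte = data[pos]
        pos += 1
        value = (value << 7) | (byte & 0x7F)
        if not (byte & 0x80):              # high bit clear means last byte
            return value, pos

print(read_vlq(bytes([0x81, 0x48]), 0))    # (200, 2)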

mTp

jwiede
01-22-2017, 04:57 PM
Well, try it. I do have some other stuff which needs to be done first, so I might need a week before I can fully concentrate on this.

Actually, the reason I asked is to make sure we're not working on the same thing, and we're not. I'm not trying to update the "MIDI file -> channel handler" plugin approach, y'all are welcome to pursue that target (it appears you and Monroe are, or at least were, both targeting that result). I'm actually looking into producing a MIDI-based HID input plugin that allows live events to drive LW via the "virtual studio" functionality, as in what stoecklem described. I believe that's ultimately a flexible approach, and also more interesting to me because there aren't any of those, while it appears the "MIDI file -> channel handler" approach has multiple existing instances (albeit in need of updating).

If/when my MIDI HID plugin is working, adding the ability to also feed "canned" event streams from MIDI files to HID (as opposed to live event streams) would be, at _best_, a stretch goal for my "v1 target". I'm still investigating feasibility (and my time availability) to see whether I can even take this project on, so I'm nowhere near the point of making any commitments yet (other than that if I do pursue it, my goal will be a HID plugin, not a channel handler). If I do this, the end result will be cross-platform, but that doesn't automatically mean I'll do the Windows version first, either.

Hope that helps make my intent clearer.

MonroePoteet
01-22-2017, 05:42 PM
OK, thanks for the clarification of your goals / requirements. Have fun!

FYI, one of the basic problems I ran into implementing my old plugin is that physical instruments and phenomena in The Real World "surround" the MIDI stream. For example, in my old plugin, I implemented classes of objects I called "strikers", "sounders", and "dampers". A real-world analogy is a piano: the "striker" is the hammer striking the string, the "sounder" is the string itself, and the "damper" is the felt dampers that damp the sounder.

Because the MIDI event stream only contains the raw events, such as NoteOn, NoteOff, etc., I found it really hard to make any sort of realistic animation representing the *Real World* attack, decay, sustain and release behavior of the physical "devices" purportedly producing the MIDI stream. The "striker" had to anticipate the NoteOn event, and the "damper" had to do the same. The "sounder" had no real information regarding the MIDI voice's ADSR envelope, so it couldn't react very realistically (animation-wise) to the sounds being produced by the VST (or other) sound generator.
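
For example, the "striker" anticipation amounted to back-timing keys from each NoteOn, roughly this kind of thing (simplified Python-style sketch, not the actual plugin code; the lead time is arbitrary):

# Simplified sketch of "striker" anticipation: back-time the wind-up key from
# each NoteOn so the hammer arrives exactly on the event. Lead time is arbitrary.
def striker_keys(note_on_times, lead=0.15):
    keys = []
    for t in note_on_times:
        keys.append((t - lead, 'wind_up'))   # start pulling back before the hit
        keys.append((t, 'strike'))           # contact exactly on the NoteOn
    return keys

print(striker_keys([1.0, 2.5]))              # [(0.85, 'wind_up'), (1.0, 'strike'), ...]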

Anyway: blah blah blah. Best of luck!

mTp

jwiede
01-22-2017, 06:45 PM
OK, thanks for the clarification of your goals / requirements. Have fun!

FYI, one of the basic problems I ran into implementing my old plugin is that physical instruments and phenomena in The Real World "surround" the MIDI stream. For example, in my old plugin, I implemented classes of objects I called "strikers", "sounders", and "dampers". A real-world analogy is a piano: the "striker" is the hammer striking the string, the "sounder" is the string itself, and the "damper" is the felt dampers that damp the sounder.

Because the MIDI event stream only contains the raw events, such as NoteOn, NoteOff, etc., I found it really hard to make any sort of realistic animation representing the *Real World* attack, decay, sustain and release behavior of the physical "devices" purportedly producing the MIDI stream. The "striker" had to anticipate the NoteOn event, and the "damper" had to do the same. The "sounder" had no real information regarding the MIDI voice's ADSR envelope, so it couldn't react very realistically (animation-wise) to the sounds being produced by the VST (or other) sound generator.


That's actually why I prefer the HID approach: Done properly, it lets the user configure which events and sequences they wish to interpret as triggers, or as relative and absolute field value inputs (via Virtual Studio tools), rather than having to rely on a hard-coded translation baked into the plugin. All I really need to provide is a few basic translation mappings between MIDI event types and the allowed HID input stream event types, and from there the virtual studio functionality allows the user to plug those input events into different animation channels, etc. however they desire (in a many-to-many, "plugboard" kind of config).
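
As a sketch of how small that plugin-side translation table really is (Python pseudo-code, mido-style messages assumed; the field names are made up, not actual HID field names):

# Sketch of the plugin-side translation: MIDI event types to named input
# "fields"; everything downstream (field -> animation channel) would be the
# user's Virtual Studio configuration. Field names are made up.
import mido

def midi_to_fields(msg):
    if msg.type == 'note_on':
        return {'note': msg.note, 'velocity': msg.velocity / 127.0, 'gate': 1.0}
    if msg.type == 'note_off':
        return {'note': msg.note, 'gate': 0.0}
    if msg.type == 'control_change':
        return {'cc%d' % msg.control: msg.value / 127.0}
    if msg.type == 'pitchwheel':
        return {'pitch': msg.pitch / 8192.0}   # roughly -1..1
    return {}

print(midi_to_fields(mido.Message('note_on', note=60, velocity=100)))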

Such an approach capitalizes on all the existing GUI/UX code in Virtual Studio w.r.t. mapping input events to animation -- by comparison, the MIDI event -> HID event mapping GUI/UX is a _lot_ less complex. Virtual Studio is native LW, so I only have to worry about the HID plugin API contracts (which have been quite stable). Virtual Studio code handles all mapping/interfacing with the LW animation channel APIs (which change more often), and LW3DG is responsible for keeping all that code up to date.

That approach also avoids potentially limiting the user by making presumptions about how they might want to translate HID input events into animation triggers / channel events /etc. They get to do that setup themselves (that's the whole point of the virtual studio infrastructure, after all). I can even envision how to incorporate MIDI file input later on -- instead of taking "live" event input from MIDI, just realtime-feed the "canned" MIDI events in the file as if they're live, without requiring substantial changes to the existing HID plugin code.

I'm not suggesting there's anything wrong with your approach, because there isn't. I just believe now that we have the Virtual Studio infrastructure and APIs available, it makes more sense to "genericize" MIDI as a HID input type just like all the other HID input sources. IMO, that approach offers some long-term LWSDK API contract stability and GUI/UX simplicity benefits over directly converting MIDI events as an animation channel handler plugin. Again, though, that's just my opinion. Having a selection of solutions is ultimately better for users in the long run!

Aside, if it hadn't already occurred to you, that same approach I described for handling both "canned" and "live" MIDI events is as applicable for a channel handler plugin. In your case, you just need an API "seam" at the point where you input MIDI events into your mapping system, and instead of a file reader playing them from file at realtime tempo, you have an alternate input event source which in realtime grabs all "live" MIDI events seen at the hw interface and stuffs them into your mapping engine instead. Of course, that presumes you handle event playback timing prior to mapping in your code -- if that's not the way your code currently works, you'd need to resolve that before adding an abstraction for "canned" and "live" MIDI events.
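
In sketch form, that seam is just two interchangeable event sources feeding the same consumer (mido-flavored Python, names made up):

# Conceptual sketch of the "seam": two event sources, one mapping engine.
# Assumes the third-party "mido" library; names are made up.
import time
import mido

def live_events(port_name):
    with mido.open_input(port_name) as port:
        for msg in port:
            yield msg                      # already arriving in real time

def canned_events(path):
    for msg in mido.MidiFile(path):
        time.sleep(msg.time)               # replay the file at real-time tempo
        yield msg

def run(source, mapping_engine):
    for msg in source:
        mapping_engine(msg)                # same downstream code either way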

If you have any questions about the changes I'm describing here, please ask (PM is fine if you'd rather not ask here).

jwiede
01-22-2017, 06:56 PM
Hmm, the more I think about it, in terms of providing MIDI-controlled animation presentation for live performances, etc., it might make more sense to use "canned" or "live" MIDI input to drive a game engine like Unity or Unreal (they probably even already support it; if not, they should). OTOH, generating complex CA-type animation in game engines involves a lot more effort/complexity, which might offset the overall efficiency.

As I said, I'm still evaluating whether this overall HID plugin approach makes sense.

MonroePoteet
01-23-2017, 02:13 AM
Hmm, the more I think about it, in terms of providing MIDI-controlled animation presentation for live performances, etc., it might make more sense to use "canned" or "live" MIDI input to drive a game engine like Unity or Unreal (they probably even already support it; if not, they should). OTOH, generating complex CA-type animation in game engines involves a lot more effort/complexity, which might offset the overall efficiency.

As I said, I'm still evaluating whether this overall HID plugin approach makes sense.

Funny, because thinking about it tonight, I think the HID approach might be really valuable, not for simulating musical instruments, but for using a readily available "universal" MIDI control surface to control objects and parameters in Virtual Studio in realtime.

For example, here's a page showing a variety of "universal" MIDI control surfaces:

http://www.ebay.com/bhp/midi-control-surface

The knobs, sliders and buttons are programmable to send MIDI events, usually via USB. I've never used Virtual Studio, so I'm kind of winging it here, but it seems that using Virtual Studio, the HID MIDI interface you're proposing could convert the physical actions of the Scene's director / cameraman / gaffer on the MIDI control surface into LW object / light / camera control. For example, perhaps four of the sliders control the intensity of a bank of colored Lights, one controls the Camera focal length, a knob controls Camera Depth of Field, another adjusts Camera "jitteryness" for hand-held-like motions, etc.

Perusing the Virtual Studio documentation, using these types of MIDI control surfaces seems like a clear analogy to a real control booth in a real studio. So, the MIDI HID plugin seems an obviously valuable extension to Virtual Studio, not for creating effects or motions from the NoteOn and NoteOff events performed by the musician, but for directing the scene and operating the camera, lights and other objects as the scene is playing out, perhaps with even a separate canned MIDI sound track (or not).
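
In code terms that kind of mapping is little more than a CC-number-to-parameter table, e.g. (Python sketch; the parameter names and CC numbers are invented for illustration):

# Sketch: a programmable control surface sends Control Change events; each CC
# number is bound to a scene parameter, with 0-127 scaled to that parameter's
# range. CC numbers and parameter names are invented for illustration.
BINDINGS = {
    20: ('Light_Red.Intensity',   0.0, 100.0),
    21: ('Light_Blue.Intensity',  0.0, 100.0),
    22: ('Camera.FocalLength',   14.0, 200.0),
    23: ('Camera.FStop',          1.4,  22.0),
}

def apply_cc(cc_number, cc_value, set_parameter):
    if cc_number in BINDINGS:
        name, lo, hi = BINDINGS[cc_number]
        set_parameter(name, lo + (cc_value / 127.0) * (hi - lo))

apply_cc(22, 64, lambda name, v: print(name, round(v, 1)))   # Camera.FocalLength 107.7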

mTp

cyclopse
01-23-2017, 02:47 AM
Funny, because thinking about it tonight, I think the HID approach might be really valuable, not for simulating musical instruments, but for using a readily available "universal" MIDI control surface to control objects and parameters in Virtual Studio in realtime.

For example, here's a page showing a variety of "universal" MIDI control surfaces:

http://www.ebay.com/bhp/midi-control-surface

The knobs, sliders and buttons are programmable to send MIDI events, usually via USB. I've never used Virtual Studio, so I'm kind of winging it here, but it seems that using Virtual Studio, the HID MIDI interface you're proposing could convert the physical actions of the Scene's director / cameraman / gaffer on the MIDI control surface into LW object / light / camera control. For example, perhaps four of the sliders control the intensity of a bank of colored Lights, one controls the Camera focal length, a knob controls Camera Depth of Field, another adjusts Camera "jitteryness" for hand-held-like motions, etc.

Perusing the Virtual Studio documentation, using these types of MIDI control surfaces seems like a clear analogy to a real control booth in a real studio. So, the MIDI HID plugin seems an obviously valuable extension to Virtual Studio, not for creating effects or motions from the NoteOn and NoteOff events performed by the musician, but for directing the scene and operating the camera, lights and other objects as the scene is playing out, perhaps with even a separate canned MIDI sound track (or not).

mTp

Just my point of view: my only interest is in taking a MIDI file's on/off information and translating that into action. Again... a file... not a device. As far as Virtual Studio goes... I'd rather just render every camera angle and do a multi-camera edit in Avid. Then I can easily adjust it with the client and not have to worry about a re-render from LW.

cyclopse
01-23-2017, 02:50 AM
Hmm, the more I think about it, in terms of providing MIDI-controlled animation presentation for live performances, etc., it might make more sense to use "canned" or "live" MIDI input to drive a game engine like Unity or Unreal (they probably even already support it; if not, they should). OTOH, generating complex CA-type animation in game engines involves a lot more effort/complexity, which might offset the overall efficiency.

As I said, I'm still evaluating whether this overall HID plugin approach makes sense.

Agreed on the live 100%. I'm currently trying to make a music video, and let's just say that the only practical use for audio analysis is drums. I'm desperate for something that can parse note on/off as well as what note is being hit.

jwiede
01-23-2017, 11:23 AM
Funny, because thinking about it tonight, I think the HID approach might be really valuable, not for simulating musical instruments, but for using a readily available "universal" MIDI control surface to control objects and parameters in Virtual Studio in realtime.

I agree, a UCS is definitely another benefit. I set aside the Unity/Unreal notion quickly; I realized there was still more than enough reason to do this for LW regardless.

I'm currently figuring out the internal architecture and abstractions, to get a handle on how much work overall will be required. After that, I should be able to give a more solid answer on where I am with this.

cyclopse
02-10-2017, 09:00 AM
Well, obviously there isn't a current one. But... since we can export Nulls from After Effects to Lightwave, I found this neat little tutorial that shows how to use MIDI in After Effects with a free AEX plugin script:

http://www.schoolofmotion.com/midi-controlled-animation-part-1/#