
Twitter find: The future of video editing?

Super LoiLoScope is a unique NLE interface

By Scott Simmons | June 19, 2009

Twitter user renatoghio posted a tweet earlier today with a link. He asked this simple question: "The future of video editing?"

When I first watched this video of Super LoiLoScope in action I thought ... whoa! That's crazy. The creators of this unique editing product look to be taking something that has an established paradigm today, video editing and its timeline-based way of working, and turning it on its ear. Nearly all of the editors we use today rely on the same model: clips in a bin, a preview and edit monitor, and a timeline. That approach was established long ago and has served the video editing industry well for many years now, but it's nice to see someone thinking way outside the box when creating a new editor.

Just one look at the LoiLoScope website and you know it's Japanese. The company info says it was established in 2007 by an engineer and an art student and is "dedicated in creating simple and fun to use software making everyone's creativity possible. We are good at processing super high-speed movie processing and develop intuitive user interface."

From the company's press room:

Super LoiLoScope also pioneers a brand new user interface that improves on other existing video editing software. With its unique ‘infinite desktop' users can freely drag & drop video clips onto their workspace. By cutting and copying movie clips with a simple touch and arranging them on a timeline, even first timers can easily master the software. With Super LoiLoScope, users can also have more than one infinite desktop running concurrently which allows for many different projects to be open at the same time. Other new features include HD resolution support, a new text feature, and more device export variations.


An intuitive user interface is what really struck me when watching the video. All I could think about was Tom Cruise and the gesture-based computing he was doing in Minority Report ... only Super LoiLoScope is doing this with a mouse. That's the one thing I think would totally hinder an application like this in a professional editing environment: its reliance on a mouse to do so much of the editing, trimming and effects work. In fact, from the video it looks like the majority of the features are very mouse-dependent. But if you think about using something like this with a graphics tablet and pen, it becomes a bit more exciting. I've often watched Flame artists use a graphics tablet with envy. But when I try to use one for everyday Avid and Final Cut Pro editing tasks it really slows me down, since those interfaces aren't built for one. Just looking at how the user zooms, scrolls and tosses clips around in Super LoiLoScope, though, it seems like an application that could really benefit from a graphics tablet and take on an almost tactile feel.

One thing that I really like that I would love to see implemented in pro-NLE software is this idea of zooming in and out:



Can you imagine having your thumbnail view selected in an Avid or Final Cut Pro bin and being able to have that kind of control over the thumbs? Moving, scaling and playing the clips with that kind of ease would be a great addition indeed.

There's more cool-looking stuff where that came from. Check out the description page for more on this unique editing application. I just wish it were available for Macintosh!

Comments

Chris Hocking | June 19, 2009

Obviously it’s not the same, but you can always hold down the “Control” key on a Mac and use the scroll wheel to zoom in on thumbnails in your NLE.

I think this kind of technology is really cool - however, imagine how much more amazing it would be on a top-of-the-line Mac Pro tower running Snow Leopard. With all that additional graphics power built into the infrastructure and underlying core system, you could do some really amazing things!

I think eventually iMovie will move into this territory.

Zak Ray | June 19, 2009

Agreed, this is probably the future of iMovie if anything. While I do think NLEs are in for a major shift in the future, I think it’ll be based on metadata and ScriptSync-like features, along the lines of what Adobe is doing. It has to prove more EFFICIENT than the old ways, not just more cool.

Personally, I just want a system that lets me be creative and flowing without dealing with all the technical stuff unless I need to. Simple, minimalist—something where my hands almost never leave the keyboard, and my eyes almost never leave the canvas.

Scott Simmons | June 19, 2009

Great comment, Zak. I always want more efficient editing. While the iMovie thumbnail scrolling thing is flashy and cool, now that I’ve played with the new iMovie I see how totally inefficient it is, since it requires so much mouse interaction. I want someone to think this way in a real pro NLE app!

Jared Scheib | June 20, 2009

While I’m a 5-years-and-running convert to Mac and FCP myself, Super LoiLoScope combined with Microsoft Surface (http://www.microsoft.com/surface/) or a future derivative certainly seems rife with opportunity and efficiency. Forget mouse, you say? Forget keyboard, too. At least, in the conventional sense. Very cool post - thanks!

Jared Scheib | June 20, 2009

I spoke too soon: http://www.youtube.com/watch?v=oMXkfLZmJgA

AndrewK | June 22, 2009

For basic, consumer-level editing I think this will be the future in some form, but for professional-level editing, mouse or touch (i.e. Minority Report) interfaces are just too limiting, IMO. It’s kinda like motion controls for video games. The Wii controls are great for relatively simple games like tennis or golf, but try playing a game that requires more precision and is more complex, like a fighting game, and you’ll want to punt the Wii out a window in short order.

Jared Scheib | June 22, 2009

Though as with any nascent technology, such as the Wii or other motion control, don’t cook your chickens before they’ve hatched (did I just make up that expression?). Where those technologies are now versus where they’ll be when they’ve got 10 years under their belts, like FCP does now, as opposed to only 1 or 2 years, will be a world of difference. I think we’ll be seeing some very amazing changes in the way we edit on the professional level, so I hope LoiLoScope continues to evolve. Once it (or something similar to it) really gets off the ground, in addition to complementary and commensurate motion control, it should prove to be pretty unreal.

Zak Ray | June 22, 2009

I agree the potential is certainly greater than the current possibilities, but there are some things I just don’t see changing. Say I want to modify the speed—in FCP, I hit command-J and I’m there in a split second. There are over 400 shortcuts in FCP. Can you think of 400 hand gestures to signify them all?

scott erickson | June 22, 2009

I remember a similar desktop layout feature inside of Pinnacle Liquid, in the pre-Avid days. You could grab clips and lay them out on a desktop-like space and play them right from there, as a sort of quick storyboard-like environment. Not sure what happened to that feature, as we switched to FCP around that time anyway, but I’m curious if it ever made its way into an Avid product. Anyway, neat idea this, neat idea.

AndrewK | June 22, 2009

Maybe I’m thinking too inside the box, but like Zak I look at the hundreds of keyboard shortcuts I can execute quickly and wonder how that will translate into a primarily motion/gesture-based system. I think gestures will augment the KB/M for basic tasks, but I don’t see gesture becoming the sole or primary interface method. Simple isn’t usually better when you need to do a variety of complex tasks.

If you haven’t heard of Microsoft’s Natal device you should definitely Google it.  It’s a future peripheral for the Xbox 360 that houses a 2D camera, a 3D camera, and a mic.  It’s basically MS’s attempt to one up the Wii controls.  It has voice recognition and complete face & body recognition (it tracks 48 points on your body) so basically your body is the controller.  They only had a prototype at E3 but it looks very cool.  With that being said, even if it works perfectly it will only be good for tasks that use relatively simple controls.  It will be great for casual games, but for games w/more complex controls like fighting games or first person shooters it won’t hold up, IMO.

For example, in a typical first person shooter, you’ll need to be able to walk forward/back, strafe left/right, look all around, jump, duck, fire your weapon, reload your weapon, switch between weapons, pick up new weapons, switch between firing modes/types on your weapon, zoom in/out on weapons that have scopes, toss grenades, switch between types of grenades, perform melee attacks, and interact w/the environment in misc ways (picking up items, opening doors, healing teammates, etc.). That’s a lot of stuff to accomplish w/just motion controls. Even if you can do it all w/motion controls, will it be better, faster, and less fatiguing than doing it w/a regular controller (can you tell I work w/games at my day job)?

I think I might have been the only person to see Minority Report and think, “That looks cool, but what a horribly inefficient way to work.”

Oh, one last plug: I’m reading “Droidmaker: George Lucas and the Digital Revolution” and I highly recommend it to anyone who wants to learn a pretty detailed history of how computers made it into the world of production and post.

Scott Simmons | June 22, 2009

Scott, you can play back a clip when viewing it as a thumbnail in an Avid bin. It’s a nice way to audition clips, that’s for sure.

Jared Scheib | June 22, 2009

I agree with both of y’all about hand/motion gestures not nearly providing for the hundreds of keyboard shortcuts, but in my original post, for instance, I didn’t rule out using a keyboard, like on the Microsoft Surface, where I believe it’s a virtual touch keyboard.

Though I don’t know what the tech will look like in the future, I imagine some things are going to be more organic and more efficient through motion control, others will be more efficient through keyboard shortcuts and the like, and yet other things will be more efficient through voice control or other technologies. So, a hybrid of all of these things will likely ultimately provide the most versatile and viable interface, I think. I don’t use the mouse much at all in FCP anymore, but I still feel frustrated about certain functions via keyboard (though for the most part it does what I need very well) and wish I could just physically grab the timeline and do it by hand, as in motion gestures. I agree with you AndrewK - a combination/augmentation would be best, though I don’t know what the appropriate proportions are. Maybe one day the individual user will be able to configure/decide for him/herself.

Zak Ray | June 28, 2009

http://news.bbc.co.uk/2/hi/science/nature/8112873.stm

I wonder if this includes the keyboard….
