At Adobe MAX 2011, Adobe showed off some really interesting stuff that upcoming versions of their software will be able to do, including video meshes, an entirely new way to edit videos, and synchronized crowdsourced video.

Adobe Video Meshes

This first video is about video meshes and shows what can be done, including the ability to create 3D fly-throughs of 2D videos and change focus and depth of field. I thought it was pretty cool and could really bring some interesting things to video. Basically, the camera can be stationary, but the software can create a sort of 3D fly-through, which is rad!

In this video, Sylvain Paris will show you a sneak peek of a potential feature for editing videos, including the ability to create 3D fly-throughs of 2D videos and change focus and depth of field.

In just a few minutes of demo, he did some stuff in post-processing that normally needs to be done at the time of shooting and could require multiple takes just to get right. The on-the-fly ability to change the focus point, and to extrapolate a sort of 3D from the flat 2D image, is quite cool as well.
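Adobe hasn't disclosed how the demo works under the hood, but changing the focus point after the fact typically relies on a per-pixel depth estimate: pixels far from the chosen focus depth get blurred more. Here's a toy 1-D sketch of that idea (the function name `refocus` and all parameters are my own, not Adobe's):

```python
import numpy as np

def refocus(row, depth, focus, strength=2.0):
    """Synthetically refocus a 1-D image row: each pixel is replaced
    by a box average whose radius grows with |depth - focus|, so
    pixels at the focus depth stay sharp and others blur."""
    out = np.empty_like(row, dtype=float)
    for i in range(len(row)):
        r = int(strength * abs(depth[i] - focus))  # blur radius
        lo, hi = max(0, i - r), min(len(row), i + r + 1)
        out[i] = row[lo:hi].mean()
    return out

# A row with two bright pixels: one on a near surface (depth 1),
# one on a far surface (depth 5).
row = np.array([0., 0., 10., 0., 0., 0., 10., 0., 0.])
depth = np.array([1., 1., 1., 1., 1., 5., 5., 5., 5.])

near = refocus(row, depth, focus=1.0)  # near pixel stays sharp
far = refocus(row, depth, focus=5.0)   # far pixel stays sharp
```

With a real depth map per frame, sliding `focus` is exactly the kind of after-the-fact rack focus the demo shows.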

Crowdsourced Video Syncing

Another awesome thing they showed off was crowdsourced video syncing and editing in Premiere Pro.

In this video, Nicholas Bryan shows you a sneak peek of a potential new feature for video editing software that synchronizes video clips taken with different cameras and different vantage points into a single, immersive video.

What you get is a multi-camera view of a single event, which could certainly make for some very creative new videos. Imagine you're the photographer at a wedding where everyone has a video camera: as part of the package for the client, you could offer a video collage or montage of the whole thing as well. What about music videos? You could align all the videos taken by fans and mix them into an entirely new live music video to help get the word out about the band. News reporting could be phenomenal, too: with all of these new tools you could gather up crowdsourced footage of real events in somewhat real time and get a grassroots look at how an event went down.
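Adobe hasn't said exactly how the feature lines the clips up, but a common way to sync footage from different cameras is to cross-correlate their audio tracks: the lag where the correlation peaks tells you how far apart the recordings started. A minimal NumPy sketch of that idea (the function `estimate_offset` is my own illustration, not Adobe's API):

```python
import numpy as np

def estimate_offset(audio_a, audio_b):
    """Return the number of samples by which audio_b lags audio_a,
    taken from the peak of the full cross-correlation."""
    corr = np.correlate(audio_a, audio_b, mode="full")
    lag = int(np.argmax(corr)) - (len(audio_b) - 1)
    return -lag  # positive means b's event happens later in its track

# Simulated soundtracks from two cameras at the same event: both
# capture the same sharp transient (say, a clap), but camera B
# started rolling earlier, so the clap lands 100 samples later
# in its track.
track_a = np.zeros(1000)
track_a[200] = 1.0
track_b = np.zeros(1000)
track_b[300] = 1.0

print(estimate_offset(track_a, track_b))  # → 100
```

Shift each clip by its estimated offset and every camera angle lands on a shared timeline, ready for multi-cam editing.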

There were some other sneaks that weren't really video-related but are still cool, so here's the full list (sans the two above, of course).

  • Image Deblurring – removing blurriness from digital photos caused by camera shake while the pictures were being taken;
  • Local Layer Ordering – a new way for graphic designers to create layered compositions that better reflect the way real world objects act;
  • InDesign Liquid Layout – using InDesign to create high quality magazines that automatically adapt layouts across devices and screen orientation;
  • Near Field Communications in Adobe AIR – using Adobe AIR to create applications that communicate with the physical world;
  • Reverse Debugging in Flash Builder – the ability to step backwards in time while debugging a Flash application to better find the root cause of bugs;
  • RubbaDub – automatically replacing the dialog of a video clip with separately recorded audio with perfect synchronization;
  • Pixel Nuggets – searching through a large library of images by identifying images that contain the same people, backgrounds, landmarks, etc.;
  • Monocle – a new visual tool to help developers find and fix performance problems in Flash applications;
  • GPU Parallelism – using a device’s graphic processing unit (GPU) to accelerate performance of general purpose computing.