Matt Jones Tech
  • Using the Cell Fracture Addon to Destroy Statues

    Getting Started

A few weeks back, I tried my hand at creating desert dunes out of a plane. As part of that same project, this week I’m using the Cell Fracture addon to destroy some statues. I started off with a couple of characters created in MakeHuman. A couple of people on blender.chat mentioned that MakeHuman hadn’t been updated in a while, or wasn’t being actively developed. Regardless, the version I used made it super easy to generate a couple of characters for statue destruction.

    Exporting to Blender

Once you’ve got everything how you want it with your character, it’s time to export. Going from MakeHuman to Blender used to be difficult, requiring a special plugin and weird file extensions. Now you can just kick out a simple .DAE file and drop it straight into Blender.
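If you’d rather script the import, it’s one operator call. A minimal sketch (the file path is a placeholder for your own export):

```python
import bpy

# Import a MakeHuman character exported as Collada (.dae).
# The path below is a placeholder; point it at your own file.
bpy.ops.wm.collada_import(filepath="/path/to/character.dae")
```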

    Using the Cell Fracture Addon

Once in Blender, I posed the characters exactly how I wanted them. When I was happy with the pose, I applied the armature. In edit mode, I separated the parts of the mesh I wanted to fracture; I didn’t want the whole thing, just bits like the hand and shoulders. Next, I fractured using a small number of pieces, around 50. I had to mess with the level of subdivisions and the number of pieces to avoid weirdness in complex areas like hands and fingers.
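You can drive the fracture from a script, too. A minimal sketch, assuming the Cell Fracture add-on is enabled and the mesh to shatter is the active object; the parameter names are how I remember the add-on’s operator, so double-check them against your version:

```python
import bpy

# Requires the bundled Cell Fracture add-on to be enabled
# (Preferences > Add-ons > "Object: Cell Fracture").
# Fracture the active object into roughly 50 pieces.
bpy.ops.object.add_fracture_cell_objects(
    source_limit=50,   # target number of pieces; keep it low for complex areas
    source_noise=0.5,  # randomness of the cell seed points
)
```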

    Animating vs Simulating

Typically, I’d simulate the pieces after they’re generated. But for this effect, I wanted a surreal, hyper slow-mo look, so I just hand-animated the pieces I wanted to break away.
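Hand animating here just means keying each piece by hand. Something like this, with a hypothetical object name:

```python
import bpy

# Key one fractured piece slowly drifting away for the hyper slow-mo look.
piece = bpy.data.objects["statue_cell.001"]  # hypothetical name
piece.keyframe_insert(data_path="location", frame=1)
piece.location.z += 0.2  # a subtle drift over the whole shot
piece.keyframe_insert(data_path="location", frame=250)
```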

    matt

    July 30, 2019
    3D Animation, 3D Modeling, Blender
    animation, blender 3d, cell fracture, physics, simulation
  • Make It Yours

This project was, I think, the first fully 3D animation produced as a bump for a sermon series. It was easy enough to source the assets, but the biggest challenge of this project was completing it in a week or two on mid-2010 Mac Pros.

This project was more of a makeshift exercise in render management. Instead of worrying about how the animations looked (every shot was just a simple camera move), it was more about commandeering every computer in the office that wasn’t in use in order to hit the two-week delivery deadline. Anyway, this was the project that prompted a pretty good computer upgrade!

    matt

    July 14, 2019
    3D Animation, Video Portfolio
  • Blender Smoke Simulation: Creating Windblown Dunes

Shoutout to @loranozor for requesting this walkthrough! I don’t build a Blender smoke simulation every day, but one of the biggest takeaways from learning my way through this project was the difference between the resolution divisions of the smoke domain and the resolution divisions under the “high resolution” checkbox.

    Smoke Domain Resolution

Basically, as I understand it, the resolution of the smoke domain defines how many voxels are used in the simulation. The higher the voxel count, the more accurate the main body of smoke, so use the domain resolution to shape the main look of your smoke sim. If I’m not mistaken, the little cube in the corner of your smoke domain helps you visualize the size of a single voxel, so you can get a rough idea of your simulation scale before you even bake it.
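If you like to script your settings, the domain resolution lives on the smoke domain settings. A minimal sketch, assuming a default setup where the smoke modifier is named “Smoke” (2.7x/2.8-era property names):

```python
import bpy

# With the smoke domain object active, raise the base voxel resolution.
settings = bpy.context.object.modifiers["Smoke"].domain_settings
settings.resolution_max = 64  # voxel divisions along the domain's largest side
```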

    “High Resolution” Divisions

Once you’ve got the main shape and behavior of your simulation looking the way you want it, it’s time to enable the “high resolution” checkbox. This is essentially like applying the Subsurf modifier to your smoke: it keeps its main general shape and behavior, but the high-resolution divisions add that extra little bit of wispiness for added realism and resolution.
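The high-resolution toggle and its divisions live on the same domain settings. Another minimal sketch, with the same naming assumptions as above:

```python
import bpy

# Same domain as before: now enable the high-resolution divisions.
settings = bpy.context.object.modifiers["Smoke"].domain_settings
settings.use_high_resolution = True
settings.amplify = 1  # extra divisions layered on top of the base sim
```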

If you’re interested in learning more about Blender smoke simulation, check out Mantaflow. It’s a great branch of Blender pushing the boundaries of smoke and fluid sims!

    matt

    July 13, 2019
    3D Animation, 3D Modeling, Blender
    b3d, CGI, dunes, fluid sim, open source, simulation, smoke, smoke simulation
  • Subsurf vs Multires: What’s the Difference?

My name is Matt and I’ve been using Blender for over 10 years. Today I came to understand the difference between Subsurf and Multires. I’d like to share that information with you now.

    Subsurf?

Subdivision Surface is a modifier that adds virtual geometry to your mesh, giving it a smoother appearance. The extra geometry isn’t actually part of the mesh until you apply the modifier, and it’s added evenly across the entire mesh.

    Multires?

    The multiresolution modifier adds editable virtual geometry to your mesh. The extra geometry is editable in sculpt mode, allowing you to add finer detail to parts of your mesh, leaving other parts untouched. You can step up and down the different levels of resolution, retaining selective detail.
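To make the difference concrete, here’s a minimal sketch of adding each modifier from Python, on whatever object is active (in practice you’d pick one or the other):

```python
import bpy

obj = bpy.context.object

# Subsurf: uniform virtual smoothing across the whole mesh.
sub = obj.modifiers.new(name="Subsurf", type='SUBSURF')
sub.levels = 2         # viewport subdivisions
sub.render_levels = 3  # render-time subdivisions

# Multires: subdivision levels you can sculpt on selectively.
obj.modifiers.new(name="Multires", type='MULTIRES')
bpy.ops.object.multires_subdivide(modifier="Multires")  # add one sculptable level
```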

    Best Use Case? Which One Do I Pick?

Most of the time, I use Subsurf. It’s just a general, quick way to add extra geometry and smooth out your model. Multires is best for, and almost exclusively used for, sculpting. Once you get that extra detail in there, you can take that high-poly Multires model and bake out a normal map to toss into your material (there’s a quick bake sketch below the TL;DR). TL;DR:

    Subsurf: general smoothing.

    Multires: specific to sculpting high details and baking later.
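As for that bake step: kicking off the bake itself from Python is one line, but the setup around it (UVs, a target image selected in the material, and, depending on your Blender version, a high-poly/low-poly selected-to-active pair) is the real work, and I’m not covering it here. A rough sketch, assuming Cycles:

```python
import bpy

# Kick off a normal-map bake in Cycles (UVs and target image set up beforehand).
bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='NORMAL')
```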

    Subsurf vs Multires

    More Blender tips and tricks.

    matt

    June 25, 2019
    3D Animation, 3D Modeling, Blender
    3d modeling, 3d sculpting, b3d, blender3d, workflow
  • VFX Workflow: Start to Finish

Last week I ran into some problems with a project at work that required just a basic knowledge of rigging. After burning over an hour watching and rewatching tutorials while under a tight deadline, I got frustrated and eventually just edited the mesh directly to get a basic pose.

    Beginnings

Now that I’d hit the deadline and delivered the project, I spent the following weekend fixing my problem once and for all.

I started with a basic mesh made up of 7 different primitives.

That’s fine for arriving at a character shape fairly quickly, but it’s trash when it actually comes time to rig and animate several primitives as if they were one. So once I had my turtle shape, I joined the primitives two at a time, merging the meshes as best I could, pair by pair, until I had a turtle mesh and a shell mesh. In hindsight, I would have just joined up the shell and the turtle to get one mesh, but that’s what this whole project was for: learning. Once I had my meshes joined, I marked my seams and UV unwrapped (the join step, scripted, is sketched below). Then it was time for texture painting!
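Here’s that join step as a script, as promised. A minimal sketch using the 2.8-style Python API; the object names are hypothetical:

```python
import bpy

# Join two primitives into one mesh (everything merges into the active object).
bpy.ops.object.select_all(action='DESELECT')
for name in ("shell_top", "shell_bottom"):  # hypothetical names
    bpy.data.objects[name].select_set(True)
bpy.context.view_layer.objects.active = bpy.data.objects["shell_top"]
bpy.ops.object.join()
```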

    Texture Painting

    Not the best UV unwrap, but it works for this project.

    Rigging

Once my turtle was painted, it was time to rig. Now that the pressure was off, I had an opportunity to actually learn how to properly build a rig, set up IK constraints, and orient joints using pole targets. Woo!
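For reference, here’s roughly what that setup looks like in Python: an IK constraint with a pole target. The bone names are hypothetical; swap in your own:

```python
import bpy

# On the armature's pose bone, add an IK constraint with a pole target.
arm = bpy.context.object  # the armature object
ik = arm.pose.bones["shin.L"].constraints.new('IK')  # hypothetical bone name
ik.target = arm
ik.subtarget = "foot_ik.L"         # the IK control bone
ik.pole_target = arm
ik.pole_subtarget = "knee_pole.L"  # keeps the knee aimed the right way
ik.chain_count = 2                 # solve over shin + thigh only
```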

    Motion Tracking

    Now that I had my rig all finished, I was ready to animate. And my IK constraints made it WAY easier to set a few keyframes and get a halfway decent animation. Next, motion tracking.

This one actually took me a few hours because I had to relearn Blender’s requirements for reconstruction: 1) Blender requires 8 continuous tracking points from the first frame to the last to even have enough data to reconstruct the scene, and 2) the average solve error needs to be around 0.3 pixels or lower to get an accurate track. My first try resulted in a solve error of 35.6 pixels. Eventually, after learning the requirements, trying some addons, and manually helping it along, I whittled the solve error down to 1.4. Close… Technically usable, but still not the best.
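Side note: you can read the solve error straight out of Blender’s Python console while you work. A quick sketch, assuming the clip you’re tracking is the first one loaded:

```python
import bpy

# Print the average solve error of the first loaded movie clip.
clip = bpy.data.movieclips[0]
print(f"average solve error: {clip.tracking.reconstruction.average_error:.2f} px")
# Rule of thumb from above: aim for roughly 0.3 px or lower.
```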

In the end, I learned a TON from building this little guy from scratch. So without further ado, I present to you, Shelly. Hopefully this helped someone. Thanks for reading!

    matt

    May 7, 2019
    3D Animation, 3D Modeling, Blender, Compositing
    3d animation, b3d, Blender, compositing, motion tracking, pipeline, rigging, vfx, workflow
  • Low Poly Nature Animation

Earlier this month I was asked to create a video to complement an Easter graphics package designed around the theme of “Hills and Valleys”. Graphically, the idea was fairly minimal, with a simple blue/purple gradient behind white text. Below are the original mockups.

    Handout Card Design
    Printed Notes Design

    With the white silhouette of a mountain, and a blue/purple gradient, I got started on the accompanying animation. The concept was essentially to imagine what the graphic would look and feel like if you could open it and dive inside. What would the rest of this minimal, silhouette world look like? So over the next 5 days, I streamed over 10 hours and 30 minutes of content on Twitch, documenting my process. Below is the condensed version, sped up 20 times and set to music.

It took a lot of testing and experimentation to get the final look of the fly-through animation. Here are a handful of different render tests produced along the way.

    Technical stuff:

    • I did not create the print graphics.
    • I did not create the music (in the timelapse or the final animation).
• Everything else was created in Blender 2.8 alpha.
• I compiled Blender from scratch twice during the project.
• I upgraded GPUs twice during the project (starting with a basic AMD Radeon and ending with dual Nvidia RTX 2080 Tis).
    • Streams were handled with OBS.

    matt

    April 19, 2019
    3D Animation, Video Portfolio
• How To Build Blender With Mantaflow

    UPDATE:

    Mantaflow was pushed to Blender’s master branch on December 16, 2019. Now you don’t have to do all this stuff. It’s already in Blender by default! Hooray!

    Today Mantaflow landed in Blender's master branch! #fluids #b3d #simulation pic.twitter.com/CmVduFgVSN

    — Sebastián Barschkis 🌊 (@sebbas) December 16, 2019

    How To Build Blender With Mantaflow

Here’s what I know about how to build Blender with Mantaflow. Some software engineers may laugh, but I’ve legit been trying to build Blender with Mantaflow for about a solid month (very off and on, mind you, but still). If you don’t know what Mantaflow is, or why it’s worth building a new copy of Blender for… check out http://mantaflow.com/

The capabilities of this new (in the Blender world, at least) method of calculating fluid and smoke simulations are insane. Unfortunately, it’s not available as a simple plugin, but you can grab it for FREE, if you’re not afraid of a terminal. (Also, I should note, there’s a new way to get FLIP fluids in Blender that IS a plugin, literally called FLIP Fluids; it’s currently only stable for Windows and will set you back around $76.) Regardless, FLIP fluids are highly efficient, and just all-around better than Blender’s default fluid/smoke simulator.

So how do I do this? Step one: open a terminal. Don’t worry, it doesn’t bite.

A screenshot of htop running in a Linux terminal

    What You’ll Need

First, you’re gonna need some tools before you can start building Blender with Mantaflow. If you’re on a Mac, there’s quite a bit of installing before you can actually start anything; for more info, check out this build guide for macOS. Once you have CMake and the Xcode development tools installed, you’re good to go. On Linux, you’re basically already good to go. Create a new folder in your home folder called blender-git.

    creating a folder named blender-git in the home directory

Once you’re inside, clone the Blender source repository.

    cloning the blender code from the source repository

Next is the tricky part. Instead of continuing to follow the directions on wiki.blender.org, you’re gonna want to switch branches by running “git checkout fluid-mantaflow”. You should get a confirmation like “successfully switched branches, you’re now on the fluid-mantaflow branch”.

    NOTE: The Blender source code is always changing, so double check developer.blender.org to keep an eye on the latest branch development.

    checking out the mantaflow branch

    Install Dependencies

Now you need to install/update the dependencies to build Blender with Mantaflow. You can do so by following step 2 over on wiki.blender.org. All of the tedious installs that you had to do manually, one by one, on the Mac can be done with one simple line in the Linux terminal (I’m not biased).

    updating packages, installing build packages

    Then there’s a special shell script that takes care of all your dependencies just by executing it. Simply run:

running 'install_deps.sh'

Then you’re nearly ready! One last step! Make sure you can compile the code with CMake by running:

    installing cmake

Then just make sure you’re inside the blender folder that was created when you cloned the source, and run ‘make’.

    building blender with cmake

If that doesn’t work, or if the build fails, try using the tag that sebbas suggests for compiling on Linux:

    building blender with custom build options

And that’s it! Once the build succeeds, navigate to your build directory, cd into the bin folder, and run ./blender

    Tada! Enjoy the awesome power of FLIP fluids for free!

    matt

    April 9, 2019
    3D Animation, 3D Modeling, Graphic Design, Linux, Motion Graphics
  • Hunted

This is a VFX shot and edit sequence created for a short video in late March 2019. We couldn’t find the ideal location (an open field surrounding a lone tree) for this shot, and I needed too many specifics to match any stock footage. Short on time, I turned to Blender for the solution. All things considered, I’m pretty pleased with the shot. The entire composite took about 8 hours to design and create, plus an overnight render.

    Timelapse of the compositing process
    Capture 1
    Capture 2

    The project was filmed in less than 4 hours on an overcast morning. I tried to get as little of the sky as I possibly could, as I knew I’d be grading day-for-night. Below is the final result:

    matt

    April 7, 2019
    3D Animation, 3D Modeling, Compositing, Video Editing, Video Portfolio
  • What Is The Golem Network?


    Simple explanation: Golem is a bunch of connected computers that team up to become a giant rendering Megazord!

This is a really cool project. Recently I have found myself lacking computational power in a professional environment: the client loves this Blender animation, but I’ve got 1/13th of the power I need to render what they want by the time they need it. Traditional render farms are out of the question, because on-demand render pricing would be enough to just buy a render farm outright. Possible solution? The Golem network.

Disclaimer: I have not used Golem in a professional setting. However, that doesn’t mean I’m not very interested in the project. The idea being: you can download a simple client for accessing the network, set up a few ports for your router to forward, and essentially “go online”. There were anywhere between 250 and 350 computers, or “nodes”, on the network at any given time (at the time of this writing). You can check this number now at stats.golem.network. It even gives you the collective number of CPU cores, amount of RAM, and disk space available at any given time. Pretty cool!

    There are two sides to Golem. First is the side where you can essentially put your computing resources ‘up for rent’. This allows others on the Golem network to use your computer to render projects. The other side is renting computing resources from the network. Got a huge project to render? You can pay to rent resources to finish your project way faster. A huge advantage is that this is much less expensive than using a traditional render farm. Here’s a promotional video that explains it quite well:

    My Experience

I spent about 2 weeks on the Golem network just renting out my unused compute power, not sure what to expect. The Golem network is built on the Ethereum blockchain, and providers are paid in GNT, or Golem Network Tokens. If you are buying compute power, you’ll be paying in ETH and also covering any transaction fees. As of now, estimating how much you’ll need to render your job is kinda complex: you need to define a specific ‘timeout’ time for your job, so if a weaker node gets your job and takes longer than your timeout, you basically lose your money. In my experience, I rented out my AMD FX 8350 Black Edition and earned about $0.07 worth of GNT. I think that’s because the network is still so new; even the client for connecting to the network is still in beta, and not a lot of people are using it yet.

Regardless, the Golem Network is an incredibly cool project to keep an eye on. Who knows, it could potentially be the only way we render our complex projects! It is also worth mentioning that Golem is compatible with Blender projects. I have yet to test out the capabilities of the network and discover what is and isn’t possible when rendering Blender projects with certain versions of Blender, different addons, plugins, etc. Will keep you posted! Thanks for reading.

    matt

    April 2, 2019
    3D Animation, Blender, General Computing, Linux
    b3d, Blender, computing, decentralization, networking, open source, render, render farm, rendering
  • Using 3DConnexion Spacemouse with Linux

    Uh… 3DConnexion Spacemouse Wireless?

    If you’re like me and spend any amount of time in the 3D world, whether it’s for game design, game development, motion graphics, 3D animation, or CAD, you’ve probably felt a bit limited and slow when it comes to navigating the 3D viewport. In some programs, you even have to grab different tools or hold hotkey combos to get the movement you want (zoom, pan, fly, rotate, etc). A company called 3DConnexion has made a fantastic effort to fix all that. For this driver install, I’m using a 3DConnexion Spacemouse Wireless.

    If you’re on a Mac or a Windows machine, it’s as easy as heading over to their site and installing one of their official drivers. It even comes with a little training program to help you get the hang of the basics.

Unfortunately, 3Dconnexion dropped Linux support some time ago. They technically have a Linux driver available on their official site, but it doesn’t work. However, I found the solution:

    Setup

There is an excellent 3rd-party driver available online called SpaceNav, and it’s the best thing that’s ever happened.

Installation is quick and easy. Just download the zipped file with the extension of your choice, extract it wherever you want, execute the file named ‘configure’, then run ‘make’, and then ‘make install’. If you want the changes to be permanent, so the driver starts every time you boot, just run ./setup-init. All these instructions are in a handy file named README! After a reboot, Blender should be up and running with your 3DConnexion Spacemouse.

    Inside Blender

Once you’re in Blender, you can hit one of the shoulder buttons (the long, skinny buttons on the side) and it will bring up a settings menu for your 3D mouse. From there, you can tweak everything to your liking, including navigation speed, axis inversion, and an on-screen navigation guide for when you fly around.

    That’s it! All done. Enjoy flying around the 3D world with your fancy spacemouse!

    Side Note:

    This is currently not working with the daily build of Blender 2.8 as of March 25, 2019. It works fine on 2.79, though.

    matt

    March 26, 2019
    3D Animation, 3D Modeling, Linux, Ubuntu
    3d animation, 3d modeling, 3dconnexion, b3d, Blender, blender3d, CAD, space mouse
