Tuesday, March 18, 2014

From Paper to Plastic

Edit: This project has been featured on 3DPrinterWorld.com!  If you're interested in more information, you can contact me at dylanhrush at gmail.

Some time ago I had an idea for a tablet application in which users could draw two-dimensional representations of objects and have them come to life in 3D.  At the basic level, users would interact with a virtual ball of clay, or slab of marble, and apply various transformations: cutting sections away, drilling holes, or making indentations.  This was to be a personal, experimental project, just to see what I could create.  What I made ended up looking like a rudimentary version of Autodesk Meshmixer.  I started out writing it with polygons to represent the virtual matter, but that proved far too difficult.  Triangulating the polygons for drawing and intersection testing turned out to be surprisingly expensive and fiddly, and splitting solid polygons across planes was no easier.  I ended up avoiding polygons altogether; now everything is based on octrees.  I named my app Marble, since using it feels more like carving marble than molding clay.
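To give a flavor of what the octree approach looks like, here's a minimal sketch I put together for this post (my own simplification, not Marble's actual code): each node covers a cube of "material" and can be carved against an arbitrary keep-this-point test, subdividing only where needed.

```python
# Minimal octree sketch (illustrative only -- not Marble's actual implementation).
# Each node covers an axis-aligned cube of material and is FULL, EMPTY, or MIXED.

FULL, EMPTY, MIXED = 1, 0, 2

class OctreeNode:
    def __init__(self, center, half_size, state=FULL):
        self.center = center          # (x, y, z) of the cube's center
        self.half_size = half_size    # half the cube's edge length
        self.state = state            # FULL, EMPTY, or MIXED
        self.children = None          # list of 8 children when state == MIXED

    def carve(self, keep, max_depth):
        """Remove material where keep(x, y, z) is False, refining toward the boundary."""
        if self.state == EMPTY:
            return
        if max_depth == 0:
            # At the resolution limit, decide the whole cube by its center point.
            self.state = FULL if keep(*self.center) else EMPTY
            return
        if self.children is None:
            # Subdivide; children start out with the parent's material.
            # (A real implementation would skip subdividing cubes that are
            # entirely inside or outside the cut.)
            cx, cy, cz = self.center
            q = self.half_size / 2.0
            self.children = [
                OctreeNode((cx + dx * q, cy + dy * q, cz + dz * q), q, self.state)
                for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)
            ]
        for child in self.children:
            child.carve(keep, max_depth - 1)
        # Merge back up when all children agree, to keep the tree small.
        states = {c.state for c in self.children}
        if states == {EMPTY}:
            self.state, self.children = EMPTY, None
        elif states == {FULL}:
            self.state, self.children = FULL, None
        else:
            self.state = MIXED
```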

Anyway, I'm not going to divulge much more about Marble in this post.  Instead I'd like to share an interesting related project.  I realized that tablet input is not the only way that users can draw on the 2D plane.  What if you could draw an outline of some object on paper, and have it come to life right before your eyes?  I wanted to build a machine where you feed in your sketches on paper, and the device creates those objects for you.  That's exactly what I did.

The current process is very hands-on.  My vision is to automate it to the point where you put some paper on the scanner bed, press a button, and your object is generated right in front of you.  I don't believe this has any realistic industrial applications.  I just think it's a cool project.

For the maiden voyage I drew a very simple sketch of the Eiffel Tower.  The system takes at least three images as input: a front, a side, and a top view:


The hardware: a Printrbot Simple, a Canoscan 4200F (the cheapest scanner I could find), and my laptop.  Installing drivers for the scanner was arguably the most difficult part of this project.


I made monochrome images of the scans and imported them into the Marble application. (I said this was a tablet app, but it runs on desktops as well.)  Marble cuts the slab along the outlines of the drawings and discards whatever is not filled in.  It does this three times, rotating the slab between cuts.  I chose just three cuts for simplicity, but the software can cut the slab from any number of angles.
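For a rough sketch of how that cutting step can work (again illustrative, not the app's real code), each monochrome view acts as a mask: project a 3D point onto that view's drawing plane and keep the point only if the pixel it lands on is filled in. Chaining one such test per view, and carving the slab with each, is the same idea as rotating the slab between cuts. This sketch reuses the OctreeNode class from above and leans on Pillow just for loading the scans; the file names, axis pairings, and extents are placeholders.

```python
from PIL import Image  # used only in this sketch for reading the scanned images

def load_mask(path, threshold=128):
    """Load a monochrome scan as a 2D boolean mask (True = inked/filled in)."""
    img = Image.open(path).convert("L")
    px = img.load()
    w, h = img.size
    return [[px[x, y] < threshold for x in range(w)] for y in range(h)], w, h

def silhouette_keep(mask, w, h, axes, extent):
    """Build a keep(x, y, z) test that checks a point's projection against one view.

    `axes` picks which two world coordinates map to the image's (u, v):
    e.g. (0, 2) for the front view, (1, 2) for the side, (0, 1) for the top.
    `extent` is the half-width of world space that the image spans.
    """
    def keep(x, y, z):
        p = (x, y, z)
        u = int((p[axes[0]] / extent + 1) * 0.5 * (w - 1))
        v = int((1 - (p[axes[1]] / extent + 1) * 0.5) * (h - 1))  # image v grows downward
        return 0 <= u < w and 0 <= v < h and mask[v][u]
    return keep

# Carve the slab once per view, just like rotating it between cuts.
root = OctreeNode((0.0, 0.0, 0.0), 1.0)
for path, axes in [("front.png", (0, 2)), ("side.png", (1, 2)), ("top.png", (0, 1))]:
    mask, w, h = load_mask(path)
    root.carve(silhouette_keep(mask, w, h, axes, extent=1.0), max_depth=6)
```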

You might notice that the model is a bit jagged.  My old laptop could not handle a higher resolution without slowing down significantly.



I imported the Marble model into Blender.  Since I wrote the app myself, I had to create my own file format for Marble models.  I wrote a Blender Python script to read this file format and generate a corresponding mesh.
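The file format itself is specific to Marble, so the script below is only a skeleton: it assumes a hypothetical plain-text layout (vertex lines followed by face lines) just to show the Blender-side plumbing, which is the part that carries over. It targets the Blender 2.7x-era Python API that was current when I wrote this.

```python
import bpy

def import_marble(filepath):
    """Read a (hypothetical) text-based Marble export and build a Blender mesh.

    Assumed format for this sketch: lines starting with 'v' hold a vertex (x y z),
    lines starting with 'f' hold a face as whitespace-separated vertex indices.
    """
    verts, faces = [], []
    with open(filepath) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                verts.append(tuple(float(c) for c in parts[1:4]))
            elif parts[0] == "f":
                faces.append(tuple(int(i) for i in parts[1:]))

    mesh = bpy.data.meshes.new("MarbleMesh")
    mesh.from_pydata(verts, [], faces)   # vertices, edges (none), faces
    mesh.update()

    obj = bpy.data.objects.new("Marble", mesh)
    bpy.context.scene.objects.link(obj)  # Blender 2.7x; use bpy.context.collection.objects.link in 2.8+
    return obj
```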

Once the mesh is in Blender, it's simple to generate an STL and get printing!
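The export can be scripted too: Blender ships with an STL exporter (the bundled io_mesh_stl add-on), so with the imported object selected it comes down to one operator call. The file name here is just an example.

```python
import bpy

# Export the selected object(s) to STL using Blender's bundled exporter.
bpy.ops.export_mesh.stl(filepath="eiffel_tower.stl", use_selection=True)
```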

The final product: