Next Generation Product Development Tools, Part 6

In the previous installment I focused on tools that help designers create virtual models. In this entry I want to look at other ways to generate virtual models (in all their varying glory).

The word “sampling” is probably most often associated with music, but it’s not at all limited to that application. Physical models are sometimes sculpted and their shape digitally sampled, or a previously existing reference might be digitized and used as a scaffold for building a new virtual model. Or something entirely unrelated can be sampled and turned into a virtual 3D model {Note: had to add this excellent example: “spam architecture” – Link}. Once digitized, there’s not much that can’t be done with digitally sampled information.

The Digital Component of Meatspace – Physical Sampling

For a while, a relatively common way for manufacturing or entertainment developers to digitize a physical object was to go through the painstaking process of literally touching it with the business end of a scanning arm, as in this old Microscribe promo video:

Today an increasingly common method of acquiring three-dimensional information is to scan an object with reflected laser beams. A few of you might remember my mentions of the low-cost NextEngine 3D scanner. There are a few videos out (like this one – Link), but here’s a brand new video from Mahalo Daily:

(As an aside for the Second Life people: if anyone knows who’s supposedly using a 3D scanner in conjunction with SL, let us know.)

CT scans might not seem relevant to the broader discussion, but consider this video showing some of the latest medical imaging technology in action:

Now compare the previous medical scans with the following architectural preservation application:

Or with some images (Link) and videos from geographic information systems (GIS), such as this one:

In many ways they don’t seem very different. More importantly for this discussion: whatever the source, once it’s 3D, the digital data is more similar than not.
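
To make that concrete, here’s a minimal sketch in Python (the coordinates are invented purely for illustration, not taken from any of the scans above): whether the points came from a CT stack, a laser scan of a facade, or a terrain survey, the raw result is just a collection of (x, y, z) samples, and the same code can chew on any of them:

```python
# Made-up point clouds standing in for scans from three very different sources.
ct_scan_points      = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.2), (0.1, 0.1, 0.3)]
laser_scan_points   = [(2.0, 1.0, 0.5), (2.1, 1.0, 0.6), (2.2, 1.1, 0.4)]
terrain_scan_points = [(500.0, 740.0, 12.3), (501.0, 740.0, 12.8)]

def bounding_box(points):
    """Axis-aligned bounding box; works the same on any of the clouds above."""
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

for cloud in (ct_scan_points, laser_scan_points, terrain_scan_points):
    print(bounding_box(cloud))
```

Formats, units, and density differ, of course, but structurally it’s all point data.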

Finally, most people probably remember an earlier entry I posted that included an unofficially released video by FrontDesign (since removed). Here’s the official version:

Be aware that all we’re really watching in the above video is a different kind of digital input: the person’s hand movement is being sampled.

In the FrontDesign example, if I’m not mistaken, a person is physically holding a marker whose position is tracked over time, so that its positional history can be turned into a 3D path. That path is then used to create a “sweep”: a 2D cross-sectional shape (circle, square, rectangle, whatever) is traced along the path, producing a 3D volume (video example here: Link).
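
For anyone curious what that looks like computationally, below is a rough sketch of a sweep; it’s my own toy example (using Python and numpy), not anything from FrontDesign’s actual setup. A short helix stands in for the sampled marker path, and a circle of made-up radius is placed at each sample, perpendicular to the local direction of travel:

```python
import numpy as np

def sweep_circle_along_path(path, radius=0.1, segments=16):
    """Return an array of shape (len(path), segments, 3): one ring of
    cross-section vertices per sampled path point."""
    path = np.asarray(path, dtype=float)
    rings = []
    for i, p in enumerate(path):
        # Tangent: direction toward the next sample (or from the previous one at the end).
        if i < len(path) - 1:
            tangent = path[i + 1] - p
        else:
            tangent = p - path[i - 1]
        tangent /= np.linalg.norm(tangent)
        # Build an orthonormal frame (normal, binormal) perpendicular to the tangent.
        helper = np.array([0.0, 0.0, 1.0])
        if abs(np.dot(tangent, helper)) > 0.9:  # tangent nearly vertical: pick another helper
            helper = np.array([1.0, 0.0, 0.0])
        normal = np.cross(tangent, helper)
        normal /= np.linalg.norm(normal)
        binormal = np.cross(tangent, normal)
        # Place the circular cross-section in the plane spanned by normal and binormal.
        angles = np.linspace(0.0, 2.0 * np.pi, segments, endpoint=False)
        ring = p + radius * (np.outer(np.cos(angles), normal) +
                             np.outer(np.sin(angles), binormal))
        rings.append(ring)
    return np.array(rings)

# Stand-in for a sampled marker trajectory: a short helix.
t = np.linspace(0.0, 4.0 * np.pi, 100)
sampled_path = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])
tube = sweep_circle_along_path(sampled_path)
print(tube.shape)  # (100, 16, 3): 100 rings of 16 vertices each, ready to stitch into a mesh
```

A real system would smooth the noisy hand-sampled path and might vary the cross-section along the way, but the underlying idea is the same: sampled positions in, swept geometry out.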

Imagine now that a person were honking a horn held in their other hand at the same time they were using the marker to “sketch” the shape in the air, perhaps honking at key moments. Sound waves propagating from the horn at those times could also be sampled. That raises some interesting possibilities. Hold that thought for later.
