ABSTRACT
Creating 3D shapes from 2D drawings is an important problem with applications in content creation for computer animation and virtual reality. We introduce a new sketch-based system, CreatureShop, that enables amateurs to create high-quality textured 3D character models from 2D drawings with ease and efficiency. CreatureShop takes an input bitmap drawing of a character (such as an animal or other creature), depicted in an arbitrary descriptive pose and from an arbitrary viewpoint, and creates a 3D shape with plausible geometric details and textures from a small number of user annotations on the 2D drawing. Our key contributions are a novel oblique view modeling method, a set of systematic approaches for producing plausible textures on the parts of the 3D character that are invisible or occluded from the viewpoint of the input drawing, and a user-friendly interactive system. We validate our system and methods by creating numerous 3D characters from a variety of drawings, and we compare our results with those of related work to show the advantages of our method. We perform a user study to evaluate the usability of our system; the results demonstrate that our system is a practical and efficient way for novice users to create fully textured 3D character models.
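The abstract does not detail the oblique view modeling algorithm itself, so the snippet below is only a minimal, hypothetical sketch of the classic silhouette-inflation baseline that single-view character modeling systems are commonly compared against: it lifts a binary character silhouette into a smooth height field via a distance transform. The sqrt profile and all names are assumptions for illustration, not CreatureShop's method.

```python
# Minimal sketch of a silhouette-inflation baseline for single-view character
# modeling. NOT CreatureShop's oblique-view method; purely illustrative.
import numpy as np
from scipy.ndimage import distance_transform_edt

def inflate_silhouette(mask: np.ndarray) -> np.ndarray:
    """Lift a binary silhouette (H x W, True inside the character) into a
    smooth height field with a rounded, balloon-like cross-section."""
    dist = distance_transform_edt(mask)   # pixel distance to the outline
    return np.sqrt(dist) * mask           # sqrt profile; zero outside the shape

if __name__ == "__main__":
    # Toy example: an elliptical "body" silhouette.
    yy, xx = np.mgrid[0:256, 0:256]
    mask = ((xx - 128) / 100.0) ** 2 + ((yy - 128) / 60.0) ** 2 < 1.0
    print("max inflation height (pixels):", inflate_silhouette(mask).max())
```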
ABSTRACT
This paper presents a novel algorithm for generating micrography QR codes, a machine-readable graphic produced by embedding a QR code within a micrography image. The unique structure of micrography makes it incompatible with existing methods for combining QR codes with natural or halftone images. We exploit the high-frequency nature of micrography in the design of a novel deformation model that warps individual letters and adjusts font weights so that a QR code can be embedded within a micrography image. The entire process is supervised by a set of visual quality metrics tailored specifically for micrography, in conjunction with a novel QR code quality measure aimed at striking a balance between visual fidelity and decoding robustness. The proposed QR code quality measure is based on probabilistic models learned from decoding experiments with popular decoders on synthetic QR codes, capturing the various forms of distortion that result from image embedding. Experimental results demonstrate the efficacy of the proposed method in generating high-quality micrography QR codes from a wide variety of inputs. The ability to embed QR codes at multiple scales makes it possible to produce a wide range of diverse designs. Experiments and user studies evaluate the proposed method from both qualitative and quantitative perspectives.
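As a rough illustration of the kind of module-level decodability estimate the abstract alludes to, the sketch below assumes a hypothetical logistic error model on each module's contrast margin and aggregates it against an error-correction budget. The paper instead learns its probability model from decoding experiments with real decoders; the logistic form, the function names, and the 30% budget (roughly level-H Reed-Solomon capacity) are assumptions made here for illustration only.

```python
# Hedged sketch of a module-level decodability proxy for a stylized QR code.
# The logistic error model and the aggregate threshold are assumptions; real
# Reed-Solomon correction operates on codewords, not individual modules.
import numpy as np

def module_error_prob(rendered: np.ndarray, target: np.ndarray,
                      k: float = 10.0) -> np.ndarray:
    """rendered: mean grayscale per module in [0, 1]; target: 0 (black) or 1 (white).
    Returns the assumed probability that a binarizing decoder misreads each module."""
    # Signed margin: positive when the module leans toward its target value.
    margin = np.where(target == 1, rendered - 0.5, 0.5 - rendered)
    return 1.0 / (1.0 + np.exp(k * margin))   # small margin -> high error risk

def expected_error_fraction(rendered: np.ndarray, target: np.ndarray) -> float:
    """Expected fraction of misread modules under the toy model."""
    return float(module_error_prob(rendered, target).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.integers(0, 2, size=(25, 25)).astype(float)   # toy module pattern
    rendered = np.clip(0.25 + 0.5 * target
                       + 0.1 * rng.standard_normal(target.shape), 0.0, 1.0)
    frac = expected_error_fraction(rendered, target)
    print(f"expected misread-module fraction: {frac:.3f} "
          f"(level-H correction budget is roughly 0.30)")
```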
ABSTRACT
Our homes and workspaces are filled with collections of dozens of artifacts laid out on surfaces such as shelves, counters, and mantles. The content and layout of these arrangements reflect both context, e.g., kitchen or living room, and style, e.g., neat or messy. Manually assembling such arrangements in virtual scenes is highly time-consuming, especially when one needs to generate multiple diverse arrangements for numerous support surfaces and living spaces. We present a data-driven method, designed specifically for artifact arrangement, that automatically populates empty surfaces with diverse, believable arrangements of artifacts in a given style. The input to our method is an annotated photograph or a 3D model of an exemplar arrangement that reflects the desired context and style. Our method leverages this exemplar to generate diverse arrangements in the exemplar's style for arbitrary furniture setups and layout dimensions. To simultaneously achieve scalability, diversity, and style preservation, we define a valid solution space of arrangements that reflect the input style. We obtain solutions within this space using barrier functions and stochastic optimization.
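Only as a toy illustration of the two ingredients named above (barrier functions and stochastic optimization), the sketch below arranges a handful of objects along a 1D shelf: a log-barrier keeps the layout collision-free and inside the shelf, a simple "style" term prefers an exemplar gap, and simulated annealing searches the space. The objective, the parameters, and the 1D setting are hypothetical simplifications, not the paper's formulation.

```python
# Barrier-function constrained stochastic optimization on a toy 1D shelf.
# Illustrative only; objective and constants are assumptions, not the paper's.
import numpy as np

SHELF_LEN = 1.0
RADII = np.array([0.06, 0.10, 0.08, 0.05])   # object half-widths (assumed)
TARGET_GAP = 0.07                            # "style": preferred gap from an exemplar

def clearances(x: np.ndarray) -> np.ndarray:
    """Gaps between adjacent objects and between end objects and the shelf ends."""
    order = np.argsort(x)
    pos, rad = x[order], RADII[order]
    return np.concatenate([
        [pos[0] - rad[0]],                             # left end clearance
        (pos[1:] - rad[1:]) - (pos[:-1] + rad[:-1]),   # pairwise clearances
        [SHELF_LEN - (pos[-1] + rad[-1])],             # right end clearance
    ])

def energy(x: np.ndarray) -> float:
    """Style term (match the exemplar gap) plus a weighted log-barrier for validity."""
    gaps = clearances(x)
    if np.any(gaps <= 0):
        return np.inf                                  # barrier: invalid layout
    style = np.sum((gaps[1:-1] - TARGET_GAP) ** 2)
    return style - 1e-3 * np.sum(np.log(gaps))

def anneal(steps: int = 20000, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    x = np.linspace(0.15, 0.85, len(RADII))            # feasible initial layout
    e = energy(x)
    for t in range(steps):
        temp = 0.05 * (1.0 - t / steps) + 1e-4
        cand = x + rng.normal(scale=0.02, size=x.shape)
        e_c = energy(cand)
        # Accept downhill moves always, uphill moves with Metropolis probability.
        if e_c < e or rng.random() < np.exp((e - e_c) / temp):
            x, e = cand, e_c
    return np.sort(x)

if __name__ == "__main__":
    print("optimized object centers:", np.round(anneal(), 3))
```

The log-barrier diverges as any clearance approaches zero, so every accepted state stays strictly inside the valid solution space, while the random proposals and temperature schedule provide the diversity across runs that the abstract emphasizes.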