Projects:Sketchy recognition

Nicolas Malevé, Michael Murtaugh


Sketchy recognition

Bread, Nose, Kangaroo or Teddy Bear?

A photograph from the collection of the Musical Instruments Museum is processed by a contour detection algorithm. The algorithm draws the lines it finds in the image one by one. While the contours are being traced, another algorithm, a sketch recognizer, tries to guess what is being drawn. Is it bread? A kangaroo? It is a teddy bear.
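
In rough terms, the loop might look like the following Python sketch, assuming OpenCV for the contour step; classify_sketch() and the file name mim_photo.jpg are hypothetical stand-ins, not the project's actual recognizer or image.

import cv2
import numpy as np

def classify_sketch(canvas):
    # Hypothetical stand-in for the sketch recognizer; in the installation the
    # guesses came from the Eitz, Hays and Alexa model via Alayrac's code.
    return "teddy bear"

# Extract contours from a collection photograph (file name is a placeholder).
image = cv2.imread("mim_photo.jpg", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(image, 100, 200)
contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature

# Trace the contours one by one onto a blank canvas; after each stroke,
# ask the recognizer what it thinks is being drawn.
canvas = np.full_like(image, 255)
for i, contour in enumerate(contours):
    cv2.drawContours(canvas, [contour], -1, color=0, thickness=2)
    print(f"stroke {i + 1}: is it a {classify_sketch(canvas)}?")

In the installation the guesses were produced by the model of Eitz, Hays and Alexa through Jean-Baptiste Alayrac's code, listed under (Re)sources below.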

Sketchy Recognition (working title) is an attempt to provoke a dialogue with, and between, algorithms, visitors and museum collections.

Cast:

  • Musical instruments: MIM collection, Brussels.
  • Line detector: the Hough algorithm in the OpenCV toolbox, originally developed to analyse bubble chamber photographs (a minimal usage sketch follows this list).
  • Sketch recognizer: an algorithm based on the research of Eitz, Hays and Alexa (2012), and the code and models by Jean-Baptiste Alayrac.
  • Data: from the hands of the many volunteers who contributed to Google's Quick, Draw! Dataset.
  • Special sauce, bugs and fixes: Michael and Nicolas
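
For the line detector named in the cast, a minimal OpenCV call could look like this; the file name and the threshold and length parameters are illustrative guesses, not the values used in the installation.

import cv2
import numpy as np

image = cv2.imread("mim_photo.jpg", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(image, 100, 200)

# Probabilistic Hough transform: returns line segments as (x1, y1, x2, y2).
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=30, maxLineGap=10)

# Redraw the detected segments on a white canvas, as the piece does on screen.
canvas = np.full_like(image, 255)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(canvas, (x1, y1), (x2, y2), color=0, thickness=2)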

(Re)sources

Humans have used sketching to depict our visual world since prehistoric times. Even today, sketching is possibly the only rendering technique readily available to all humans. This paper is the first large scale exploration of human sketches. We analyze the distribution of non-expert sketches of everyday objects such as 'teapot' or 'car'. We ask humans to sketch objects of a given category and gather 20,000 unique sketches evenly distributed over 250 object categories. With this dataset we perform a perceptual study and find that humans can correctly identify the object category of a sketch 73% of the time. We compare human performance against computational recognition methods. We develop a bag-of-features sketch representation and use multi-class support vector machines, trained on our sketch dataset, to classify sketches. The resulting recognition method is able to identify unknown sketches with 56% accuracy (chance is 0.4%). Based on the computational model, we demonstrate an interactive sketch recognition system. We release the complete crowd-sourced dataset of sketches to the community.[1]
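
The pipeline that abstract describes (local descriptors, a learned visual vocabulary, histogram encoding, a multi-class SVM) could be roughed out with scikit-learn as below; the random data, vocabulary size and kernel choice are placeholders, not the paper's actual features or settings.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: each "sketch" is a bag of local descriptors (in the paper,
# gradient-based patches sampled along the strokes); labels are categories.
sketches = [rng.normal(size=(60, 32)) for _ in range(100)]
labels = rng.integers(0, 5, size=100)

# 1. Learn a visual vocabulary by clustering all descriptors.
vocabulary = KMeans(n_clusters=50, n_init=10, random_state=0)
vocabulary.fit(np.vstack(sketches))

# 2. Encode each sketch as a normalised histogram of visual-word counts.
def bag_of_features(descriptors):
    words = vocabulary.predict(descriptors)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / hist.sum()

X = np.array([bag_of_features(s) for s in sketches])

# 3. Train a multi-class SVM on the histograms and classify an unseen sketch.
classifier = SVC(kernel="rbf", gamma="scale")
classifier.fit(X, labels)
print(classifier.predict([bag_of_features(rng.normal(size=(60, 32)))]))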

  • You were asked to draw an angel, working notes from the Scandinavian Institute for Computational Vandalism (April 2017): http://sicv.activearchives.org/logbook/you-were-asked-to-draw-an-angel/
  • Assisted drawing, working notes from the Scandinavian Institute for Computational Vandalism (January 2016): http://sicv.activearchives.org/logbook/assisted-drawing/ + Assisted drawing: Exploring Augmented Creativity, original blogpost by Samim (December 2015): https://medium.com/@samim/assisted-drawing-7b26c81daf2d#.2d1ju3lnr
  • How Do Humans Sketch Objects?, Mathias Eitz, James Hays and Marc Alexa (2012): http://cybertron.cg.tu-berlin.de/eitz/projects/classifysketch/ + C/C++ implementation: https://github.com/GTmac/Classify-Human-Sketches
  • Python/Jupyter implementation: https://github.com/ajwadjaved/Sketch-Recognizer
  • Jean-Baptiste Alayrac's working Python code (what we ended up using): https://github.com/jalayrac/sketch-recognizer

Collection: Musical Instruments Museum (MIM)

Reconnaissance esquissé

[translation FR]

Schetsmatige herkenning

[translation NL]

Working sketches + notes (not in publication v1)

References / References / Referenties

1. Mathias Eitz, James Hays and Marc Alexa, "How Do Humans Sketch Objects?" (2012), http://cybertron.cg.tu-berlin.de/eitz/projects/classifysketch/