{"id":226,"date":"2013-04-16T11:25:51","date_gmt":"2013-04-16T15:25:51","guid":{"rendered":"https:\/\/peterabeles.com\/blog\/?p=226"},"modified":"2013-04-16T11:25:51","modified_gmt":"2013-04-16T15:25:51","slug":"kinect-in-java","status":"publish","type":"post","link":"https:\/\/peterabeles.com\/blog\/?p=226","title":{"rendered":"Kinect in Java"},"content":{"rendered":"<p>The folks at <a href=\"http:\/\/openkinect.org\">OpenKinect<\/a> did a good job of providing JNI wrappers for their Kinect driver.\u00a0 What they didn&#8217;t do is provide a nice example showing how to process the byte stream data from the RGB and depth cameras.\u00a0 As far as I can tell, even the comments in their C code don&#8217;t fully describe the data format.\u00a0 The OpenKinect wiki provides a bit more information but tends to be out of date in places and still doesn&#8217;t fully describe the data format.\u00a0 So off to Google to find a working example in Java.\u00a0 All I could find was a bunch of stuff telling people to use the <a href=\"http:\/\/www.shiffman.net\/p5\/kinect\/\">Processing OpenKinect<\/a> code.\u00a0 That&#8217;s of little use to me, but I did find browsing their source code useful.<\/p>\n<p>So for those of you who just want a straightforward example demonstrating how to use OpenKinect in Java, you&#8217;ve come to the right place.\u00a0 The code below does use BoofCV for some of the image processing and display, but that&#8217;s not essential and could be easily modified to not use BoofCV.\u00a0<\/p>\n<p><em>OpenKinect Version:<\/em> 0.1.2<br \/>\n<em>BoofCV Version:<\/em> 0.14<\/p>\n<pre class=\"brush: java; title: ; notranslate\" title=\"\">\/**\r\n * Example demonstrating how to process and display data from the Kinect.\r\n *\r\n * @author Peter Abeles\r\n *\/\r\npublic class OpenKinectStreamingTest {\r\n\r\n\t{\r\n\t\t\/\/ Modify this path to point to where your shared library is stored\r\n\t\tNativeLibrary.addSearchPath(&quot;freenect&quot;, 
&quot;\/home\/pja\/libfreenect\/build\/lib&quot;);\r\n\t}\r\n\r\n\tMultiSpectral&lt;ImageUInt8&gt; rgb = new MultiSpectral&lt;ImageUInt8&gt;(ImageUInt8.class,1,1,3);\r\n\tImageUInt16 depth = new ImageUInt16(1,1);\r\n\r\n\tBufferedImage outRgb;\r\n\tImagePanel guiRgb;\r\n\r\n\tBufferedImage outDepth;\r\n\tImagePanel guiDepth;\r\n\r\n\tpublic void process() {\r\n\t\tContext kinect = Freenect.createContext();\r\n\r\n\t\t\/\/ numDevices() returns a count, so check for zero devices, not a negative value\r\n\t\tif( kinect.numDevices() &lt;= 0 )\r\n\t\t\tthrow new RuntimeException(&quot;No kinect found!&quot;);\r\n\r\n\t\tDevice device = kinect.openDevice(0);\r\n\r\n\t\tdevice.setDepthFormat(DepthFormat.REGISTERED);\r\n\r\n\t\tdevice.setVideoFormat(VideoFormat.RGB);\r\n\r\n\t\tdevice.startDepth(new DepthHandler() {\r\n\t\t\t@Override\r\n\t\t\tpublic void onFrameReceived(FrameMode mode, ByteBuffer frame, int timestamp) {\r\n\t\t\t\tprocessDepth(mode,frame,timestamp);\r\n\t\t\t}\r\n\t\t});\r\n\t\tdevice.startVideo(new VideoHandler() {\r\n\t\t\t@Override\r\n\t\t\tpublic void onFrameReceived(FrameMode mode, ByteBuffer frame, int timestamp) {\r\n\t\t\t\tprocessRgb(mode,frame,timestamp);\r\n\t\t\t}\r\n\t\t});\r\n\r\n\t\tlong startTime = System.currentTimeMillis();\r\n\t\twhile( startTime+100000 &gt; System.currentTimeMillis() ) {}\r\n\t\tSystem.out.println(&quot;100 Seconds elapsed&quot;);\r\n\r\n\t\tdevice.stopDepth();\r\n\t\tdevice.stopVideo();\r\n\t\tdevice.close();\r\n\r\n\t}\r\n\r\n\tprotected void processDepth( FrameMode mode, ByteBuffer frame, int timestamp ) {\r\n\t\tSystem.out.println(&quot;Got depth! 
&quot;+timestamp);\r\n\r\n\t\tif( outDepth == null ) {\r\n\t\t\tdepth.reshape(mode.getWidth(),mode.getHeight());\r\n\t\t\toutDepth = new BufferedImage(depth.width,depth.height,BufferedImage.TYPE_INT_BGR);\r\n\t\t\tguiDepth = ShowImages.showWindow(outDepth,&quot;Depth Image&quot;);\r\n\t\t}\r\n\r\n\t\t\/\/ Depth is encoded as 16-bit little endian values, two bytes per pixel\r\n\t\tint indexIn = 0;\r\n\t\tfor( int y = 0; y &lt; depth.height; y++ ) {\r\n\t\t\tint indexOut = depth.startIndex + y*depth.stride;\r\n\t\t\tfor( int x = 0; x &lt; depth.width; x++ , indexOut++ ) {\r\n\t\t\t\tdepth.data&#x5B;indexOut] = (short)((frame.get(indexIn++) &amp; 0xFF) | ((frame.get(indexIn++) &amp; 0xFF) &lt;&lt; 8 ));\r\n\t\t\t}\r\n\t\t}\r\n\r\n\t\tVisualizeImageData.grayUnsigned(depth,outDepth,1000);\r\n\t\tguiDepth.repaint();\r\n\t}\r\n\r\n\tprotected void processRgb( FrameMode mode, ByteBuffer frame, int timestamp ) {\r\n\t\tif( mode.getVideoFormat() != VideoFormat.RGB ) {\r\n\t\t\tSystem.out.println(&quot;Bad rgb format!&quot;);\r\n\t\t}\r\n\r\n\t\tSystem.out.println(&quot;Got rgb! &quot;+timestamp);\r\n\r\n\t\tif( outRgb == null ) {\r\n\t\t\trgb.reshape(mode.getWidth(),mode.getHeight());\r\n\t\t\toutRgb = new BufferedImage(rgb.width,rgb.height,BufferedImage.TYPE_INT_BGR);\r\n\t\t\tguiRgb = ShowImages.showWindow(outRgb,&quot;RGB Image&quot;);\r\n\t\t}\r\n\r\n\t\tImageUInt8 band0 = rgb.getBand(0);\r\n\t\tImageUInt8 band1 = rgb.getBand(1);\r\n\t\tImageUInt8 band2 = rgb.getBand(2);\r\n\r\n\t\tint indexIn = 0;\r\n\t\tfor( int y = 0; y &lt; rgb.height; y++ ) {\r\n\t\t\tint indexOut = rgb.startIndex + y*rgb.stride;\r\n\t\t\tfor( int x = 0; x &lt; rgb.width; x++ , indexOut++ ) {\r\n\t\t\t\tband2.data&#x5B;indexOut] = frame.get(indexIn++);\r\n\t\t\t\tband1.data&#x5B;indexOut] = frame.get(indexIn++);\r\n\t\t\t\tband0.data&#x5B;indexOut] = frame.get(indexIn++);\r\n\t\t\t}\r\n\t\t}\r\n\r\n\t\tConvertBufferedImage.convertTo_U8(rgb,outRgb);\r\n\t\tguiRgb.repaint();\r\n\t}\r\n\r\n\tpublic static void main( String args&#x5B;] ) {\r\n\t\tOpenKinectStreamingTest app = new 
OpenKinectStreamingTest();\r\n\r\n\t\tapp.process();\r\n\t}\r\n}\r\n<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>The folks at OpenKinect did a good job of providing JNI wrappers for their Kinect driver.\u00a0 What they didn&#8217;t do is provide a nice example showing how to process the byte stream data from the RGB and depth cameras.\u00a0 As far as I can tell, even the comments in their C code don&#8217;t fully describe the [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[12,18],"tags":[],"class_list":["post-226","post","type-post","status-publish","format-standard","hentry","category-computer-vision","category-kinect"],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=\/wp\/v2\/posts\/226","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=226"}],"version-history":[{"count":7,"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=\/wp\/v2\/posts\/226\/revisions"}],"predecessor-version":[{"id":243,"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=\/wp\/v2\/posts\/226\/revisions\/243"}],"wp:attachment":[{"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=226"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/peter
abeles.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=226"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/peterabeles.com\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=226"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}