Machine Vision Conference 2019 Speakers

How to Control your Universal Robot using a Smart Vision Sensor

Amit Chohan

Baumer

 


This presentation outlines the ease of using vision-guided robotics with Universal Robots and a smart vision sensor solution, covering the straightforward setup and a unique calibration tool that delivers the fastest setup time.

Vision in Robotics

Simon Banks

Acrovision

 


A growing requirement in logistics and manufacturing markets is automatic Pick and Place using cobots/robots and vision. Saving time and improving accuracy and efficiency are the key drivers. However, this is not a one-size-fits-all situation.

Acrovision will show where different vision technologies are used for different Pick and Place requirements.

Acrovision Sales Director Simon Banks presents this session, which includes a live demonstration.

Robot Guidance Simplified

Neil Sandhu

SICK UK

 


A look at the various machine vision technologies and how they can be easily adopted by robotic solutions to provide the guidance and vision needed to carry out their tasks with reliability and accuracy. The session will look at how the differing technologies can use their specific advantages to give the most robust solutions.

Plug & Play Vision Solutions for Collaborative Robots (Cobots)

Andrew Mason

RAR UK

 


For years, machine vision systems have provided the technology necessary for robots to see. Today, our plug and play technology is enhancing cobots – collaborative robots that work alongside humans without safety fencing – with pick-and-place, machine tending, assembly, and even complex bin-picking operations. With the choice of 2D and 3D vision to guide your cobot, the vision systems I’m going to talk about can complement and boost almost any picking task.

Merging ToF Depth and 2D Color Data for 3D Robot Perception

Sanna Leinius

Basler AG

 


Computer vision can make robots “smarter” and helps to expand their fields of application. Time-of-Flight cameras capture precise 3D depth data in real time and offer compact and robust 3D vision solutions. For some robotic applications, it is useful to merge the depth data with the RGB data from a 2D color camera. The result is a point cloud in the object’s true colors. This compensates for missing depth information, assists in classifications based on object color, or enables neural networks pre-trained on 2D color data. Listen in to learn more about 3D and 2D vision-guided robotics.
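By way of illustration only (not part of the presentation), the sketch below shows the fusion step described above in Python: back-projecting an aligned depth map through a pinhole camera model and attaching the corresponding color values to form a colored point cloud. It assumes the ToF depth map has already been registered to the color camera's viewpoint; the intrinsics fx, fy, cx, cy and the synthetic data are placeholder values.

import numpy as np

def colored_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project an aligned depth map and merge it with RGB pixels.

    depth : (H, W) array of metric depth values (metres), 0 = no return
    rgb   : (H, W, 3) array of color values aligned to the depth map
    fx, fy, cx, cy : pinhole intrinsics of the shared viewpoint (assumed values)
    Returns an (N, 6) array of [X, Y, Z, R, G, B] points.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0                        # drop pixels with no ToF return

    z = depth[valid]
    x = (u[valid] - cx) * z / fx             # pinhole back-projection
    y = (v[valid] - cy) * z / fy
    colors = rgb[valid]

    return np.column_stack([x, y, z, colors])

# Illustrative usage with synthetic data and placeholder intrinsics
depth = np.full((480, 640), 1.5)             # flat surface 1.5 m from the camera
rgb = np.random.randint(0, 256, (480, 640, 3))
cloud = colored_point_cloud(depth, rgb, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(cloud.shape)                           # (307200, 6)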

Disclaimer

The Machine Vision Conference 2021 presentations housed on this web page remain the copyright of the originator. PPMA Limited makes no claims, promises or guarantees about the accuracy, completeness or adequacy of each presentation and expressly disclaims liability for any errors, inaccuracies or omissions in their contents. Machine Vision Conference 2021 is a trading style of PPMA Limited, incorporated in England and Wales with company no. 02116954.

Contact UKIVA: T: +44 (0) 20 8773 8111 mvcbookings@ppma.co.uk

Organised by UKIVA, part of PPMA.
