Microsoft robotics program

Project Bonsai is a low-code AI development platform, now available in preview. Engineers can use it to design intelligent controllers for autonomous systems that sense and adapt to changing environments, tackle complex processes, and combine human and machine intelligence. A network of experienced system integration partners and simulation solutions is available to help you build intelligent autonomy into your industrial processes.

Start driving digital transformation and innovation with new skills and ideas. Engage with community forums, find consulting resources, and explore quickstart guides to start using the Bonsai platform.

Find resources, insights, and in-depth case studies from industry leaders who share their digital transformation experiences. Learn about the latest AI innovations with projects, demonstrations, code samples, and experiments at AI Lab.

Autonomous systems with Microsoft AI
Get started with autonomous systems and learn how to drive impact with our free autonomous systems starter kit. Start the free ebook series.

- Ebook series: Moving from automated to autonomous. Learn how autonomous systems are driving real-world innovation from concept to reality.
- Ebook: How to partner your people with AI. Autonomous systems can help human operators by providing guidance to workers as they navigate new and complex challenges.
- Ebook: Autonomous systems on the factory floor. Autonomous solutions are the next step toward safer, more productive manufacturing.
- Ebook: Use case selection guide for manufacturers. Autonomous systems present manufacturers with an opportunity to improve quality, increase productivity, and drive performance.

Going beyond automation
Learn how you can evolve from automation to human-trained autonomous systems with best practices and practical guidance to get started.

The first simulation in this folder, Basic Simulation Environment, renders a scene similar to Figure 1. You might think that the basic simulation includes only two entities: a world globe and a box. In fact, the simulation scene also includes entities that represent the main camera, the ground, the sky, and the sun.

Edit mode, shown in Figure 2, includes a left-hand pane where you can modify properties associated with each entity.

These properties control everything from the name of the entity to its position in the simulation environment, and they also let you precisely control how the entity is rendered and thus how it appears in the simulation.

If you return to Run mode, you can move around the simulation by using the mouse or arrow keys. This changes the point of view for the main camera, which is your view into the simulation. VSE also allows you to render the scene in different modes. Visual mode is the default and provides a realistic view of the simulation scene. The No Rendering option is included because rendering is just one aspect of a simulation.

What is most valuable about running a simulation is the interaction among the various entities. Since rendering entities within a simulation scene is expensive in terms of resources, the No Rendering option can be useful when a large number of entities are involved. An entity type allows you to define a new instance of a particular type of entity. For example, the world globe included in the basic simulation environment is a single shape entity type. The entity type acts as a template for the new entity in that it specifies the properties associated with a particular type of entity.

The values for these properties can be changed once the entity is created, but the entity type defines which properties are included. VSE requires an entity type in order to add an entity to a simulation. This brings up the New Entity dialog box (see Figure 3).
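The "entity type as template" idea can be sketched in a few lines of code. MSRS itself is programmed in C#, so this Python analogue is purely illustrative; the class and property names here are assumptions, not the real MSRS API. The type declares which properties exist, and instances may change property values but not the property set:

```python
class SingleShapeEntityType:
    """Illustrative stand-in for an entity type: it defines WHICH
    properties an entity of this type has, acting as a template."""
    properties = ("name", "position", "mass", "mesh")

    def create(self, **values):
        # Only properties declared by the type may be set on a new entity.
        unknown = set(values) - set(self.properties)
        if unknown:
            raise ValueError(f"unknown properties: {unknown}")
        # Undeclared values default to None; the property set is fixed.
        return {p: values.get(p) for p in self.properties}

globe_type = SingleShapeEntityType()
globe = globe_type.create(name="Globe", position=(0, 1, 0), mass=10)
globe["mass"] = 12  # values can change after creation
```

The point of the sketch is the asymmetry the article describes: values are mutable after creation, but the entity type alone decides which properties an entity carries.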

This means you can add a new Create robot to your simulation by simply using the New Entity dialog box. MSRS offers more than one way to create and work with simulations. Creating a new DSS service using the template will result in the creation of two class files. The implementation class, which by default has the same name as the project, is where you will add code to create a new entity.

Simulations created programmatically will require access to assemblies not included with the Simple DSS Service template. Therefore, you will need to add references to the assemblies listed in Figure 5. After adding the references, you will need to add the corresponding namespace declarations to the implementation class file.

To understand what code is needed for your simulation, you should first examine the simulation tutorials provided with MSRS. For example, the basic simulation environment is the same as the SimulationTutorial1 project. If you open the SimulationTutorial1 project in Visual Studio, you can view the code used to create the basic simulation environment. The first thing to notice is the Start method, which is called automatically when the service is started.

The Start method is where you add code to define your simulation environment. In addition to the main camera, the basic simulation environment contains entities used to represent the sky, ground, box, and world globe. The code to insert the world globe, or textured sphere, is listed in Figure 6. This type represents an entity with a single geometric shape, such as a sphere, and it is useful when you need to add an entity with a very simple physical geometry.
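The role of the Start method can be sketched as follows. Again, MSRS services are written in C#; this Python sketch is an illustrative assumption (the SimulationScene class and entity names are hypothetical), showing only the pattern of a startup hook that populates the scene with the entities the article lists:

```python
class SimulationScene:
    """Hypothetical container standing in for the simulation environment."""
    def __init__(self):
        self.entities = []

    def insert(self, entity):
        # In MSRS, entities are inserted into the simulation engine;
        # here we simply collect them in a list.
        self.entities.append(entity)

def start(scene):
    """Analogue of a service's Start method: build the basic environment
    by inserting the main camera, sky, ground, box, and world globe."""
    for name in ("MainCamera", "Sky", "Ground", "Box", "Globe"):
        scene.insert({"name": name})

scene = SimulationScene()
start(scene)
```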

In this case, I am creating an entity with a mass of 10 kilograms, or approximately 22 pounds. The mesh assigned to this entity is an object file, which has an .obj extension. The object file is created with a 3D graphical editing tool and exported in the Alias object format.

MSRS requires that the mesh object file be in this format. The last thing you do in the AddTexturedSphere method is insert the sphere entity into the simulation environment. Now let's create a new robot entity to represent the Boe-Bot by Parallax. The Boe-Bot is a small wheeled robot that supports a two-wheel differential drive system (for a photo, see Figure 7). For more information about the Boe-Bot, visit the Parallax Web site at parallax.com.

This means the MSRS installation includes the basic services used to operate the Boe-Bot's drive system and built-in contact sensors. By deriving from the DifferentialDriveEntity class, I'm able to reuse code that defines how the Boe-Bot should behave when it is moving through a simulation.

The code used to create the BoeBot entity type is shown in Figure 8. The constructor for the BoeBot class is used to set values for several variables defined in the DifferentialDriveEntity class. For example, the Mass is set with a value of 0. Additionally, the Boe-Bot chassis is defined in terms of the width, length, and height. These measurements were obtained by weighing the actual robot and measuring it using a metric tape measure. The position of the Boe-Bot is defined through a set of coordinates that are passed in when the entity is created.

These coordinates represent points on the X, Y, and Z axes. The MSRS simulation engine uses a right-handed coordinate system, which affects the direction toward which the Z axis points.
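The handedness point can be made concrete with a cross product. In a right-handed coordinate system, crossing the X unit vector with the Y unit vector yields +Z; in a left-handed system the Z axis points the opposite way. A minimal sketch (Python here, though MSRS itself is C#):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

x_axis = (1, 0, 0)
y_axis = (0, 1, 0)
# In a right-handed system, X cross Y points along +Z.
print(cross(x_axis, y_axis))  # → (0, 0, 1)
```

This is why code written against one handedness convention places or orients entities incorrectly when moved to an engine that uses the other.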

The BoeBot constructor also defines the position of the chassis and the wheels within the entity. The DifferentialDriveSystem class makes the assumption that your robot will have two main wheels and a small rear wheel that is mostly used for balancing.

Power is assigned to the left and right motors, which control the main wheels. The difference between the power levels assigned to each wheel determines whether the robot moves forward, backward, left, or right. This is the same method used to drive the physical robot. What makes the simulation so attractive is that, in theory, it does not matter whether your robot is virtual or physical; the code used to power the robot and receive data from the sensors will be the same.
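The differential-drive behavior described above can be sketched briefly. This is not the MSRS DifferentialDriveEntity API (which is C#); it is an illustrative Python function capturing the rule that equal power drives straight while a power difference turns the robot:

```python
def drive_state(left_power, right_power):
    """Classify the motion of a two-wheel differential drive
    from the power applied to the left and right motors."""
    if left_power == right_power == 0:
        return "stopped"
    if left_power == right_power:
        # Equal power on both wheels drives straight.
        return "forward" if left_power > 0 else "backward"
    # More power on the left wheel swings the robot to the right,
    # and vice versa.
    return "turning right" if left_power > right_power else "turning left"

print(drive_state(0.5, 0.5))  # → forward
print(drive_state(0.5, 0.2))  # → turning right
```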

Some of the code used in the simulation project can be reused when working with the actual robot. Now, you may have noticed that I said "in theory."

The simulation cannot account for noise—the stuff you do not expect, such as obstacles being in the wrong location. What the simulation can do is give you a fairly realistic opportunity to experiment with a new robot design or simulate the interaction of multiple robots. This can be very useful in academic environments where resources are limited and the number of students is high.

Each entity can be associated with a mesh, which is what makes the entity appear realistic. For example, in the case of the globe, the mesh is what makes the globe entity appear like the planet Earth. Strictly speaking, it is not necessary to associate an entity with a mesh, but in the case of complex entities, such as robots, a mesh object is preferable.

Just about any 3D graphical editing tool can be used to create the mesh.

Automate your desktop with robotic process automation
Connect old and new systems and reduce repetitive tasks using UI-based automation with desktop flows, the robotic process automation (RPA) capability in Power Automate. Use variables and conditionals, launch apps, copy and paste, and more. Save time by capturing your mouse and keyboard actions and importing them directly into the visual designer.

Initiate desktop flows or respond to prompts in attended mode, or choose unattended mode to run your desktop flows autonomously in the background. Run desktop flows from cloud flows, and use AI Builder to take advantage of hyperautomation.

Automate personal desktop flows on your machine for free using the app, available in the Start menu. Experience everything Power Automate offers with access to all flows, add-ons, and apps.
