ROS LinuxCNC Link
08 May 2021 08:14 - 08 May 2021 08:16 #208286
by Aciera
Replied by Aciera on topic ROS LinuxCNC Link
To be honest with you, I'm still somewhat fuzzy about your teach-in approach. The way I understand it is that it basically "records" the jogging movements and that recording can then be played back (maybe at different speeds?). Now don't get me wrong here, this is very cool, but if you consider that usually one will have to jog the six axes with individual buttons, the jogging motion is not going to be smooth at all, and thus I won't want the played-back movement of the robot to look like the jogging. IF I had an input device that was identical to the robot's kinematics and I could easily record the joint angles as I moved the tool in the correct orientation along my parts (e.g. for welding), then that would be VERY useful. But I can't quite picture it using the standard teaching pendant. Say I pressed the wrong jog button on the pendant, would I need to start everything from the beginning?
The usual "teach-in" process is that we jog the robot to the correct position and orientation and then record that pose in the required coordinates. What the movement looks like is then defined in the program. See section 4 in the PDF I linked above.
And yes we certainly need to be able to wait for external inputs and set external outputs. In the PDF above you will also find components for all kinds of pallet geometries if the robot needs to handle parts placed in pallets and trays.
I guess what I cannot get my head around is what PROGRAMMING is actually going to look like in your application and how the "teach-in" function can be useful TOGETHER with the programming.
Last edit: 08 May 2021 08:16 by Aciera.
The following user(s) said Thank You: Grotius
08 May 2021 16:37 - 08 May 2021 17:05 #208312
by Grotius
Replied by Grotius on topic ROS LinuxCNC Link
Hi Aciera,
I think I missed the PDF.
The teach-in mode is as you pointed out. Forget about the record feature for now.
Normally:
You jog to a position, then the program point can be recorded. Then you jog to the next position and record the next program point.
This is just as you pointed out.
When storing a program point I want to define values like:
1. What is the type of movement: is it a rapid, a 3D line, a 3-point arc, a 3-point(+) spline? Or is it teaching in a workframe?
2. Is it a startpoint, waypoint or endpoint of the above geometry, or is it a workframe point like a workframe origin?
3. What is the speed and velocity of this move, and if it is an endpoint, what must the end velocity be?
4. Must an output be set or an input waited for?
So far as I can see, this is the usual way of teaching a robot program.
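For illustration only, here is a rough C sketch of what one stored program point could carry, based on the four items above (the type and field names are hypothetical, not taken from the actual project):

    /* Hypothetical sketch of a stored teach-in point; names are illustrative only. */
    enum move_type  { MOVE_RAPID, MOVE_LINE_3D, MOVE_ARC_3PT, MOVE_SPLINE_3PT, TEACH_WORKFRAME };
    enum point_role { POINT_START, POINT_WAY, POINT_END, WORKFRAME_ORIGIN };

    struct teach_point {
        enum move_type  type;      /* 1. kind of movement, or a workframe teach        */
        enum point_role role;      /* 2. start-, way-, endpoint or workframe origin    */
        double joints[6];          /*    joint angles recorded at teach time           */
        double vel;                /* 3. velocity of the move                          */
        double vel_end;            /*    end velocity, used when this is an endpoint   */
        int    set_output;         /* 4. digital output to set, -1 for none            */
        int    wait_input;         /*    digital input to wait for, -1 for none        */
    };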
Every stored point can be edited by scrolling to it.
And you don't have to start all over again. You can simply
start at a certain program line (taught point) and move the robot to it to edit the point.
I guess what I cannot get my head around is what PROGRAMMING is actually going to look like in your application and how the "teach-in" function can be useful TOGETHER with the programming.
This is something to think about. To prevent myself from writing every possible procedure or function, I am thinking about
something more universal.
For example: if you want to make a palletizing program in "workframe_pallet",
the procedure looks like:
1. Teach in the pallet workframe. This new workframe will be added to the combobox automatically.
2. Move the robot to the pallet and teach in the first product with the gripper open, then teach in gripper closed with a wait-for-gripper-closed signal, then teach in a relative Z offset of 200 mm.
(The above taught points are applied to the pallet workframe during teaching; this is selectable in the combobox.)
3. Make an (int) counter for x and an (int) counter for y that advance each time you visit the pallet workframe, and apply x/y offsets.
Now how would we code that into the GUI?
1. We can teach in a workframe; that is done by selecting a few comboboxes.
2. We can teach in points, set grippers etc., no problem. We do this by selecting a few comboboxes.
3. Create x/y offsets with a counter and apply these offsets to a certain previous program point (line), or say coordinate.
To prevent myself from coding various templates, I thought: what if a user can input some C code into a teach-in program line?
Say you have a pallet.comp (a C file that is going to be compiled automatically by the GUI):
input value counter int : x_counter, y_counter
input value offset float : x_offset, y_offset
output value position float : x_pos, y_pos
Function routine : if x_counter==5, y_counter++.
With the above code you can imagine a working pallet program.
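As a stand-alone illustration of that pallet logic (plain C, not an actual HAL component): a minimal sketch, assuming a row of 5 products and that the function runs once per pallet visit; only the pin names come from the spec above, the rest is assumption:

    /* Illustrative sketch of the pallet counter logic described above.
     * Assumes a 5-wide row and that pallet_step() runs once per pallet visit. */
    struct pallet {
        int    x_counter, y_counter;   /* input value counter int      */
        double x_offset,  y_offset;    /* input value offset float     */
        double x_pos,     y_pos;       /* output value position float  */
    };

    void pallet_step(struct pallet *p)
    {
        p->x_pos = p->x_counter * p->x_offset;   /* offsets relative to the pallet workframe */
        p->y_pos = p->y_counter * p->y_offset;

        p->x_counter++;
        if (p->x_counter == 5) {                 /* row full: back to x = 0, next y row */
            p->x_counter = 0;
            p->y_counter++;
        }
    }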
At some point you have to stitch the above code into the program.
1. You or the GUI can load the compiled pallet.comp => pallet.so (a minimal loading sketch follows after this list).
2. When doing a teach-in you can select a combobox named "Function". Inside that you pick the value "pallet.x_counter".
Then in a second combobox you can set the "Increment value", for example "1".
3. During teach-in, when you want to use positions produced by the component, like the value pallet.x_pos, you select this
position from a combobox input.
4. Maybe think about a tree widget program structure? I can imagine you want to insert a complete subprogram at once.
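As a rough idea of step 1 in the list above, loading a compiled pallet.so at runtime could look something like this; the symbol name and signature are placeholders, not the project's actual API:

    /* Minimal dlopen sketch for loading a compiled component; names are hypothetical. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *handle = dlopen("./pallet.so", RTLD_NOW);
        if (!handle) { fprintf(stderr, "%s\n", dlerror()); return 1; }

        /* look up the component's step function by name and call it once
         * (placeholder symbol name and signature) */
        void (*step)(void) = (void (*)(void))dlsym(handle, "pallet_step");
        if (step) step();

        dlclose(handle);
        return 0;
    }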
Last edit: 08 May 2021 17:05 by Grotius.
08 May 2021 18:16 - 08 May 2021 18:36 #208322
by Aciera
Replied by Aciera on topic ROS LinuxCNC Link
I'm referring to this PDF: www.electrobit.ee/web/file_bank/Manuals/...ons-detailed_ENG.pdf
I really think you should look at it as it gives an idea about the program structure for robots. There are of course others, as every major manufacturer seems to have a programming language of their own. So I don't know if this is a particularly good example, but it's the only one I happen to know anything about.
In any case, after reading your last post I get the impression that you are aiming for a "conversational" programming GUI rather than just a jogging screen with a recording function.
If you look at the PDF document you get an idea of what kind of elements are used in that programming language. There are, among others:
timers (delays)
conditional loops
subroutine calls
and of course the different moves in different coordinate systems.
Now the language used in the above PDF is of course NOT GCODE. My line of thinking has always been that we would need to use GCODE to program our robots since LinuxCNC uses GCODE.
This is the reason I keep asking about how I would be able to command moves in your application using all the different coordinate systems.
Take the palletizing module, for example: my idea would have been to use a remap of some G- or M-code and create a subroutine using Python, basically adapting the LinuxCNC (GCODE) framework to the requirements of a robot application.
So say G0.1 would be a rapid linear move in JOINT-MODE
G0.2 would be a rapid linear move in XYZ-MODE
G0.3 would be a rapid linear move in TOOL-MODE
And so forth.
The idea being that you would not have to program, say, the palletizing modules now, but that those could be created by the able user later as remaps or subroutines.
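Just to illustrate that line of thinking (an untested sketch, not a working config): in LinuxCNC such remaps would be declared in the INI and backed by an O-word subroutine or a Python handler. The code numbers, argspecs and names below are placeholders, and which codes are actually free to remap has to be checked against the LinuxCNC remap documentation.

    [RS274NGC]
    # Hypothetical example only; code numbers, argspecs and subroutine names
    # are placeholders -- check the LinuxCNC remap docs for remappable codes.
    # Rapid move in JOINT mode:
    REMAP=G88.1 modalgroup=1 argspec=ABCUVW ngc=rapid_joint
    # Rapid move in XYZ (cartesian) mode:
    REMAP=G88.2 modalgroup=1 argspec=XYZABC ngc=rapid_xyz
    # Rapid move in TOOL mode:
    REMAP=G88.3 modalgroup=1 argspec=XYZABC ngc=rapid_tool

    ; rapid_joint.ngc -- skeleton O-word subroutine backing G88.1, body left open
    o<rapid_joint> sub
      (the remapped words arrive here; the actual joint-mode motion would go in this body)
    o<rapid_joint> endsub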
The problem of doing this inside of LinuxCNC turned out to be that the kinematics (genserkins) that come with it are not powerful enough. Hence the idea of integrating ROS.
The most fundamental question I still have about your application is what the programming SYNTAX is going to look like.
[edit]
The syntax used will also be important to anybody who wants to write a post-processor for third-party software (e.g. RoboDK) to create programs for their robots. That would be very useful as such software can usually detect collisions by using STL models of the robot and its environment.
[edit2]
The most important thing is that there should be a reasonably straightforward way of extending and creating modules by way of, say, Python or GCODE, and NOT having to rebuild the whole application every time inside the Qt environment. If using the application requires the user to master the Qt workflow with all the dependency and makefile business, then I don't see much of a future for it.
Last edit: 08 May 2021 18:36 by Aciera.
08 May 2021 19:31 - 08 May 2021 22:11 #208333
by Grotius
Replied by Grotius on topic ROS LinuxCNC Link
Hi Aciera,
The most fundamental question I still have about your application is what the programming SYNTAX is going to look like.
It would be C style. But you only need it when you need to write complex programs. Compiling, loading etc. is done by the GUI framework.
So you don't have to install Qt Creator.
A syntax like Python is a no-go for me.
A syntax like G-code can be applied in the future. I already have a postprocessor to convert G-code into primitives like arcs, lines, etc., which can be converted to something the robot can work with later on.
That would be very useful as such software can usually detect collisions by using STL models of the robot and its environment.
Inside OpenCascade we have this option already. It can be implemented in the future.
The syntax used will also be important to anybody who wants to write a post-processor for third-party software (e.g. RoboDK) to create programs for their robots.
Programs like RoboDK are not compatible: they cannot read in a function call to a realtime module and show what is going on. OK, I plan to set up a short example of what the program syntax will look like.
Last edit: 08 May 2021 22:11 by Grotius.
08 May 2021 22:02 - 08 May 2021 22:04 #208340
by Grotius
Replied by Grotius on topic ROS LinuxCNC Link
Attachments:
Last edit: 08 May 2021 22:04 by Grotius.
The following user(s) said Thank You: tommylight, Aciera
09 May 2021 02:29 - 09 May 2021 04:56 #208352
by Grotius
Replied by Grotius on topic ROS LinuxCNC Link
Hi,
Creating and using C code in the robot program has been tested OK.
It was a tiny puzzle to solve.
My implementation proposal:
What the implementation produces and compiles behind the scenes looks like:
What is done here is more or less an administrative task. It looks like HAL module refining.
Four lines of code is not bad to load pure C code.
At last, a few one-liners for fun:
Now building up the tree widgets in the touch screen; I will start with the most difficult one, adding C code:
Last edit: 09 May 2021 04:56 by Grotius.
The following user(s) said Thank You: Aciera
09 May 2021 07:48 #208357
by Aciera
Replied by Aciera on topic ROS LinuxCNC Link
Thanks for the examples.
To come back to my favourite question: how would we choose the mode of movement (i.e. JOINT, XYZ or TOOL mode)?
Is it determined by the parameters given?
So if I wanted to execute a joint move (i.e. without using the kinematics/inverse kinematics), would I write:
line1 {type{rapid} point{endpoint} joint{90,90,0,148,62,-95} velstart{20} velend{20}}
If I then wanted to execute a move in XYZ cartesian mode (i.e. using the kinematics/inverse kinematics):
line2 {type{line} point{endpoint} cart{100,200,10} eul{0,45,-45} velstart{20} velend{20}}
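Purely as an illustration of how machine-readable such a line is, here is a small C sketch that pulls the key{value} pairs out of it; this is not the project's parser, just a sketch:

    /* Illustrative only: extract key{value} pairs from a proposed program line. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        const char *line =
            "line1 {type{rapid} point{endpoint} joint{90,90,0,148,62,-95} velstart{20} velend{20}}";
        char key[32], val[128];
        const char *p = strchr(line, '{');          /* skip the label, enter the outer block */
        while (p && sscanf(p + 1, " %31[^{]{%127[^}]}", key, val) == 2) {
            printf("%s = %s\n", key, val);          /* e.g. "joint = 90,90,0,148,62,-95"     */
            p = strchr(p + 1, '}');                 /* hop past the value just read          */
        }
        return 0;
    }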
Is it possible to simulate the program before running it on the physical robot using a graphical STL model?
The following user(s) said Thank You: Grotius
09 May 2021 08:51 - 09 May 2021 09:07 #208359
by Grotius
Replied by Grotius on topic ROS LinuxCNC Link
Hi Aciera,
Yes, you got it. I see your rapid move example has an initial velocity and an end velocity of 20.
Looking at your example, I understand the syntax at first sight. The software knows what to do with these lines.
No need to choose a mode.
Do you have a working robot HAL file for Mesa already?
Is it possible to simulate the program before running it on the physical robot using a graphical STL model?
Yes, it has the OpenCascade simulator in the touch screen tab. Later on I will finish that implementation.
It will be super easy to set up your robot model through the touch screen mode.
For info:
The final programming lines, which will be stored in the 1 ms files, are auto-completed during preprocessing.
I am really excited to do a first run on real hardware. It will take some time to work through the project now.
Last edit: 09 May 2021 09:07 by Grotius.
The following user(s) said Thank You: tommylight
09 May 2021 11:42 - 09 May 2021 11:43 #208368
by Aciera
Replied by Aciera on topic ROS LinuxCNC Link
I do have a working HAL file but it is not for a Mesa board. Since I use Mitsubishi drives with an optical SSCNET bus, I use dmitry's nyx branch to be able to use his PCI interface card: github.com/yur7aev/linuxcnc/tree/nyx_master
So really since I'll need LinuxCNC with the SSCNET driver to run my hardware I don't know if I'll ever be able to get my robot to run on your application.
But here is my config anyway:
[edit]
I have removed the rather large STL model files.
Last edit: 09 May 2021 11:43 by Aciera.
The following user(s) said Thank You: tommylight, Grotius
09 May 2021 13:56 #208372
by Grotius
Replied by Grotius on topic ROS LinuxCNC Link
Attachments: