
This paper presents an improved neural computation scheme for kinematic control of redundant manipulators based on infinity-norm joint velocity minimization. Compared with a previous neural network approach to minimum infinity-norm kinematic control, the present approach is less complex in terms of architectural cost. The recurrent neural network…
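The minimum infinity-norm problem the abstract refers to can be posed as a small linear program: minimize the largest joint speed `t` subject to `-t <= qdot_i <= t` and the kinematic constraint `J @ qdot = xdot`. A minimal sketch of that reformulation follows (this is the standard LP trick, not the paper's neural-network architecture; the function name `min_infnorm_velocity` is illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def min_infnorm_velocity(J, xdot):
    """Minimum infinity-norm joint velocities satisfying J @ qdot = xdot.

    LP reformulation: minimize t subject to
      -t <= qdot_i <= t   (so t bounds the largest joint speed)
       J @ qdot = xdot    (end-effector velocity constraint)
    """
    m, n = J.shape
    # decision vector: [qdot (n entries), t]
    c = np.zeros(n + 1)
    c[-1] = 1.0  # objective: minimize t
    # inequalities:  qdot_i - t <= 0  and  -qdot_i - t <= 0
    A_ub = np.block([[np.eye(n), -np.ones((n, 1))],
                     [-np.eye(n), -np.ones((n, 1))]])
    b_ub = np.zeros(2 * n)
    # equality constraint: J @ qdot = xdot (t has zero coefficient)
    A_eq = np.hstack([J, np.zeros((m, 1))])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=xdot,
                  bounds=[(None, None)] * n + [(0, None)])
    return res.x[:n]

# Toy redundant case: two joints, one task dimension, qdot1 + qdot2 = 1.
# The infinity-norm minimizer spreads the load equally: [0.5, 0.5].
qdot = min_infnorm_velocity(np.array([[1.0, 1.0]]), np.array([1.0]))
```

The recurrent networks in these papers solve this same optimization in continuous time with parallel analog dynamics, which is what makes them attractive for real-time control.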

This paper presents two neural network approaches to real-time joint torque optimization for kinematically redundant manipulators. Two recurrent neural networks are proposed for determining the minimum driving joint torques of redundant manipulators for the cases without and with taking the joint torque limits into consideration, respectively. The first…

In this paper, a recurrent neural network called the Lagrangian network is applied for obstacle avoidance in kinematically redundant manipulators. Conventional numerical methods implemented on digital computers can compute the obstacle-avoidance redundancy resolution only on a millisecond timescale, whereas a neural network realized in hardware…

- Wai Sum Tang
- ICRA
- 2001

A discrete-time recurrent neural network which is called the discrete-time Lagrangian network is proposed in this letter for solving convex quadratic programs. It is developed based on the classical Lagrange optimization method and solves quadratic programs without using any penalty parameter. The condition for the neural network to globally converge to the…

- Wai Sum Tang, Jun Wang, Yangsheng Xu
- 1999
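The Lagrange-method idea behind the discrete-time network can be illustrated with a generic primal-dual iteration for an equality-constrained QP: gradient descent on the primal variables, gradient ascent on the multipliers, with no penalty parameter anywhere. This is a sketch of the general principle under simplifying assumptions (fixed step size, equality constraints only), not the paper's exact update rule:

```python
import numpy as np

def lagrangian_qp(Q, c, A, b, eta=0.1, iters=2000):
    """Primal-dual iteration for  min 0.5*x'Qx + c'x  s.t.  Ax = b.

    The Lagrangian L(x, lam) = 0.5*x'Qx + c'x + lam'(Ax - b) is
    minimized over x and maximized over lam; no penalty term is used.
    """
    n, m = Q.shape[0], A.shape[0]
    x, lam = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        x = x - eta * (Q @ x + c + A.T @ lam)   # descend grad_x L
        lam = lam + eta * (A @ x - b)           # ascend  grad_lam L
    return x, lam

# min 0.5*(x1^2 + x2^2)  s.t.  x1 + x2 = 1   ->  x = [0.5, 0.5]
x, lam = lagrangian_qp(np.eye(2), np.zeros(2),
                       np.array([[1.0, 1.0]]), np.array([1.0]))
```

At the fixed point, the stationarity condition `Q x + c + A' lam = 0` and feasibility `A x = b` hold simultaneously, i.e. the iteration converges to a KKT point of the QP when `Q` is positive definite and the step size is small enough.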

A recurrent neural network is applied for minimizing the infinity-norm of joint torques in redundant manipulators. The recurrent neural network explicitly minimizes the maximum component of joint torques in magnitude while keeping the relation between the joint torque and the end-effector acceleration satisfied. The end-effector accelerations are given to…
