Search results for: Hu Jiaxin
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2

2 Optimum Parameter of a Viscous Damper for Seismic and Wind Vibration

Authors: Soltani Amir, Hu Jiaxin

Abstract:

Determination of the optimal parameters of a passive control device is the primary objective of this study. The expanding use of control devices for wind and earthquake hazard reduction has led to the development of various control systems. This study explains the advantage of non-linearity in a passive control device and an optimal control method based on the LQR algorithm. Finally, the paper introduces a simple approach to determining the optimum parameters of a nonlinear viscous damper for vibration control of structures. A MATLAB program produces the dynamic response of the structure using the stiffness matrix of the SDOF frame and the non-linear damping effect. The study concludes that the proposed variable-damping system controls the system response better than a linear damping system. The energy dissipation graph also shows that the total energy loss is greater in the non-linear damping system than in the other systems.

 

Keywords: Passive Control System, Damping Devices, Viscous Dampers, Control Algorithm.
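The comparison the abstract describes — a nonlinear (variable) viscous damper versus a linear one on an SDOF frame under ground excitation — can be sketched numerically. The minimal simulation below assumes a damper force of the common form f_d = c·|v|^α·sign(v) (α = 1 linear, α < 1 nonlinear) and uses illustrative parameter values and a toy ground motion; it is not the authors' actual MATLAB model or their LQR tuning procedure.

```python
import numpy as np

def sdof_response(alpha, c=0.8, m=1.0, k=39.48, dt=0.005, t_end=10.0):
    """Simulate an SDOF frame with a (possibly nonlinear) viscous damper.

    Damper force: f_d = c * |v|**alpha * sign(v); alpha = 1 gives a
    linear damper, alpha < 1 a nonlinear (variable-damping) one.
    All parameter values are illustrative assumptions, not taken
    from the paper.
    """
    n = int(t_end / dt)
    t = np.arange(n) * dt
    # toy decaying-sinusoid ground acceleration, stand-in for a record
    ag = 0.3 * 9.81 * np.sin(2.0 * np.pi * 1.5 * t) * np.exp(-0.3 * t)
    x, v = 0.0, 0.0
    xs = np.empty(n)
    e_diss = 0.0
    for i in range(n):
        fd = c * abs(v) ** alpha * np.sign(v)   # damper force
        a = (-m * ag[i] - fd - k * x) / m       # equation of motion
        v += a * dt                             # semi-implicit Euler step
        x += v * dt
        e_diss += fd * v * dt                   # energy dissipated by damper
        xs[i] = x
    return np.max(np.abs(xs)), e_diss

peak_lin, e_lin = sdof_response(alpha=1.0)   # linear damping
peak_nl, e_nl = sdof_response(alpha=0.5)     # nonlinear damping
```

Comparing the peak displacements and dissipated energies for the two α values reproduces the kind of response-control and energy-dissipation comparison the abstract reports.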

1 Towards Growing Self-Organizing Neural Networks with Fixed Dimensionality

Authors: Guojian Cheng, Tianshi Liu, Jiaxin Han, Zheng Wang

Abstract:

Competitive learning is an adaptive process in which the neurons in a neural network gradually become sensitive to different input pattern clusters. It is the basic idea behind Kohonen's Self-Organizing Feature Maps (SOFM). SOFM can generate mappings from high-dimensional signal spaces to lower-dimensional topological structures. The main features of such mappings are topology preservation, feature mapping, and approximation of the probability distribution of the input patterns. To overcome some limitations of SOFM, e.g., a fixed number of neural units and a topology of fixed dimensionality, Growing Self-Organizing Neural Networks (GSONN) can be used. A GSONN can change its topological structure during learning: it grows by learning and shrinks by forgetting. To speed up training and convergence, a new GSONN variant, twin growing cell structures (TGCS), is presented here. This paper first introduces competitive learning, SOFM, and its variants. We then discuss GSONN with fixed dimensionality, including growing cell structures, their variants, and the authors' model, TGCS. The paper ends with a comparison of testing results and conclusions.

Keywords: Artificial neural networks, Competitive learning, Growing cell structures, Self-organizing feature maps.
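The competitive-learning process underlying SOFM and its growing variants can be illustrated with a minimal winner-take-all sketch: each input pulls only its nearest (winning) unit toward it, so units gradually specialize to different input clusters. The growing/shrinking mechanics of GCS and TGCS (unit insertion and deletion) are omitted here; the function name and all parameter values are illustrative assumptions, not the authors' TGCS algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def competitive_learning(data, n_units=2, lr=0.1, epochs=20):
    """Minimal winner-take-all competitive learning sketch.

    For each input, only the closest unit (the competition winner)
    is adapted toward that input. Growing variants such as GCS/TGCS
    would additionally insert or remove units; that is omitted here.
    """
    # initialize units on samples spread across the data set
    idx = np.linspace(0, len(data) - 1, n_units).astype(int)
    w = data[idx].copy()
    for _ in range(epochs):
        for x in rng.permutation(data):
            winner = np.argmin(np.linalg.norm(w - x, axis=1))  # competition
            w[winner] += lr * (x - w[winner])                  # adaptation
    return w

# two well-separated toy 2-D clusters around (0, 0) and (5, 5)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                  rng.normal(5.0, 0.1, (50, 2))])
units = competitive_learning(data, n_units=2)
```

After training, each unit sits near the centroid of one cluster, which is the "sensitive to different input pattern clusters" behavior the abstract describes.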
