Search results for: Brain Machine Interfaces
786 Performance Evaluation of Parallel Surface Modeling and Generation on Actual and Virtual Multicore Systems
Authors: Nyeng P. Gyang
Abstract:
Even though past, current and future trends suggest that multicore and cloud computing systems are increasingly prevalent, this class of parallel systems is underutilized in general, and barely used for research on employing parallel Delaunay triangulation for parallel surface modeling and generation in particular. The performances of actual (physical) and virtual (cloud) multicore machines at executing various algorithms, which implement different parallelization strategies of the incremental insertion technique of the Delaunay triangulation algorithm, were evaluated. T-tests were run on the collected data to determine whether the differences in various performance metrics (including execution time, speedup and efficiency) were statistically significant. Results show that the actual machine is approximately twice as fast as the virtual machine at executing the same programs for the various parallelization strategies. Results, which furnish the scalability behaviors of the various parallelization strategies, also show that some of the differences between the performances of these systems, during different runs of the algorithms, were statistically significant. A few pseudo-superlinear speedup results, which were computed from the raw data collected, are not true superlinear speedup values. These pseudo-superlinear speedups, which arise from one way of computing speedups, disappear and give way to asymmetric speedups, which are the accurate kind of speedups occurring in the experiments performed.
Keywords: Cloud computing systems, multicore systems, parallel Delaunay triangulation, parallel surface modeling and generation.
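As an illustration of the metrics mentioned in this abstract, a minimal sketch (with made-up timing numbers, not the paper's data) of how speedup, efficiency and a t-test on execution times could be computed:

```python
# Sketch: speedup, efficiency, and a Welch t-test on execution times.
# All timing arrays below are hypothetical.
import numpy as np
from scipy import stats

def speedup_and_efficiency(serial_times, parallel_times, n_cores):
    """Speedup S = T_serial / T_parallel; efficiency E = S / n_cores."""
    s = np.mean(serial_times) / np.mean(parallel_times)
    return s, s / n_cores

# Hypothetical execution times (seconds) for the same program on both machines.
actual_runs  = np.array([12.1, 11.8, 12.4, 12.0, 11.9])
virtual_runs = np.array([24.5, 23.9, 25.1, 24.2, 24.8])

s, e = speedup_and_efficiency(serial_times=np.array([46.0, 45.2, 46.5]),
                              parallel_times=actual_runs, n_cores=4)
print(f"speedup={s:.2f}, efficiency={e:.2f}")

# Is the difference in execution time between the machines statistically significant?
t, p = stats.ttest_ind(actual_runs, virtual_runs, equal_var=False)
print(f"t={t:.2f}, p={p:.4f}")
```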
785 Investigation of the Operational Principle and Flow Analysis of a Newly Developed Dry Separator
Authors: Sung Uk Park, Young Su Kang, Sangmo Kang, Yong Kweon Suh
Abstract:
Separators are employed to separate solids and classify them by size in the processing of mineral products, waste concrete (fine aggregates), optical-industry waste, and industrial and construction waste. Various sorting machines are used in industry, operating on electrical properties, centrifugal force, wind power, vibration, or magnetic force. Research on separators has been carried out as a contribution to the environmental industry. In this study, we perform CFD analysis to understand the basic mechanism of separating waste concrete (fine aggregate) particles from air in a machine built around a bladed rotor. We first performed two-dimensional particle tracking for various particle sizes, for models with 1, 1.5, and 2 degrees of angle between adjacent blades, to verify the boundary conditions and the rotating-domain method to be used in 3D. We then developed a 3D numerical model with ANSYS CFX to calculate the air flow and track the particles. We judged the capability of separating particles of a given size by counting how many of the 10 particles issued at the inlet escape from the domain toward the exit. We confirm that particles show stagnant behavior near the exit of the rotating blades, where the centrifugal force acting on the particles is in balance with the air drag force. It was also found that the minimum particle size that can be separated by the machine is determined by its capability to stay at the outlet of the rotor channels.
Keywords: Environmental industry, separator, CFD, fine aggregate.
784 Effect of Injection Moulding Process Parameter on Tensile Strength Using Taguchi Method
Authors: Gurjeet Singh, M. K. Pradhan, Ajay Verma
Abstract:
The plastic industry plays a very important role in the economy of any country and generally accounts for a leading share of it. Since metals and their alloys are comparatively scarce, producing plastic products and components, which find application in many industrial as well as household consumer products, is beneficial; about 50% of plastic products are manufactured by the injection moulding process. To produce better quality products, the quality characteristics and performance of the product have to be controlled. The process parameters play a significant role in plastic production, hence their control is essential. This paper describes the effect of parameter selection on the injection moulding process and aims to define suitable parameters for producing the plastic product. Selecting process parameters by trial and error is neither desirable nor acceptable, as it often tends to increase cost and time. Hence, optimization of the processing parameters of the injection moulding process is essential. The experiments were designed with Taguchi's orthogonal array to achieve the result with the least number of experiments. The plastic material studied is polypropylene. The tensile strength of the injection-moulded material was tested on a universal testing machine. Using the Taguchi technique with the help of MiniTab-14 software, the best values of injection pressure, melt temperature, packing pressure and packing time were obtained. We found that the packing pressure contributes most to producing a plastic product with good tensile strength.
Keywords: Injection moulding, tensile strength, Taguchi method, polypropylene.
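A small sketch of the larger-the-better signal-to-noise ratio commonly used in Taguchi analysis; the strength values and factor levels below are illustrative, not the study's measurements:

```python
# Larger-the-better S/N ratio: S/N = -10 * log10( (1/n) * sum(1/y_i^2) ),
# computed per factor level. The level with the highest S/N is preferred.
import numpy as np

def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical tensile strengths (MPa) for three levels of packing pressure.
levels = {"low": [28.1, 27.6, 28.4], "mid": [30.2, 29.8, 30.5], "high": [31.0, 31.4, 30.9]}
sn = {name: sn_larger_is_better(vals) for name, vals in levels.items()}
print(sn)
```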
783 The Effects of Shot and Grit Blasting Process Parameters on Steel Pipes Coating Adhesion
Authors: Saeed Khorasanizadeh
Abstract:
The adhesion strength of the exterior or interior coating of steel pipes is very important. Increasing coating adhesion on surfaces can increase the lifetime of the coating and the safety factor of the transmission pipeline, and decrease the corrosion rate and costs. Steel pipe surfaces are prepared before coating by shot and grit blasting, a mechanical method. Effective parameters of that process include abrasive particle size, distance to the surface, abrasive flow rate, abrasive physical properties and shape, abrasive selection, the kind of machine and its power, the standard of surface cleanness degree, roughness, blasting time and weather humidity. This research intended to find conditions which improve surface preparation, adhesion strength and corrosion resistance of the coating. Accordingly, this paper studied the effect of varying the abrasive flow rate, the abrasive particle size and the blasting time on steel surface roughness and over-blasting, using a centrifugal blasting machine. A number of steel samples (according to API 5L X52) were prepared, epoxy powder coating was applied to them, and the coating adhesion strength was compared using the Pull-Off test. The results show that increasing the abrasive particle size and flow rate can increase the steel surface roughness and coating adhesion strength, but increasing the blasting time causes surface over-blasting and increases surface temperature and hardness, decreasing steel surface roughness and coating adhesion strength.
Keywords: Surface preparation, abrasive particles, adhesion strength.
782 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning
Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar
Abstract:
As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, the misuse of distributed computing may cause resource waste and high costs. Resource scheduling, however, is usually an NP-hard problem, so no general solution exists. Optimization algorithms such as the genetic algorithm and ant colony optimization are available, but the large scale of distributed systems makes these traditional optimization algorithms challenging to apply. Heuristic and machine learning algorithms are usually applied in this situation to ease the computing load. We therefore review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that combines the perceptual ability of neural networks with the decision-making ability of reinforcement learning. Using machine learning, we try to find important factors that influence the performance of distributed system computing and help the distributed system schedule computing resources efficiently. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling and proposes a deep reinforcement learning method that uses a recurrent neural network to optimize the scheduling. The paper concludes with the challenges and improvement directions for deep-reinforcement-learning-based resource scheduling algorithms.
Keywords: Resource scheduling, deep reinforcement learning, distributed system, artificial intelligence.
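A minimal, hedged sketch of the kind of recurrent scheduling policy the abstract describes; the network size, job features and reward are assumptions, and the paper's actual architecture and training scheme may differ:

```python
# Sketch (not the paper's model): a GRU policy reads a sequence of job feature
# vectors and outputs a distribution over machines for each job; it is updated
# with a single REINFORCE step using a hypothetical reward from a simulator.
import torch
import torch.nn as nn

class SchedulerPolicy(nn.Module):
    def __init__(self, job_dim=4, hidden=32, n_machines=3):
        super().__init__()
        self.rnn = nn.GRU(job_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_machines)

    def forward(self, jobs):                      # jobs: (1, T, job_dim)
        h, _ = self.rnn(jobs)                     # (1, T, hidden)
        return torch.distributions.Categorical(logits=self.head(h))

policy = SchedulerPolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

jobs = torch.randn(1, 5, 4)                       # 5 queued jobs, 4 features each
dist = policy(jobs)
assignment = dist.sample()                        # one machine index per job
reward = -torch.rand(1).item()                    # e.g. negative makespan (hypothetical)
loss = -dist.log_prob(assignment).sum() * reward  # REINFORCE objective
opt.zero_grad(); loss.backward(); opt.step()
print(assignment)
```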
781 Continuous Feature Adaptation for Non-Native Speech Recognition
Authors: Y. Deng, X. Li, C. Kwan, B. Raj, R. Stern
Abstract:
The current speech interfaces in many military applications may be adequate for native speakers. However, the recognition rate drops considerably for non-native speakers (people with foreign accents), mainly because non-native speakers exhibit large temporal and intra-phoneme variations when they pronounce the same words. The problem is further complicated by the presence of strong environmental noise such as tank noise, helicopter noise, etc. In this paper, we propose a novel continuous acoustic feature adaptation algorithm for on-line accent and environmental adaptation. Implemented by incremental singular value decomposition (SVD), the algorithm captures local acoustic variation and runs in real time. This feature-based adaptation method is then integrated with the conventional model-based maximum likelihood linear regression (MLLR) algorithm. Extensive experiments have been performed on the NATO non-native speech corpus with a baseline acoustic model trained on native American English. The proposed feature-based adaptation algorithm improved the average recognition accuracy by 15%, while the MLLR model-based adaptation achieved an 11% improvement. The corresponding word error rate (WER) reductions were 25.8% and 2.73%, as compared to that without adaptation. The combined adaptation achieved an overall recognition accuracy improvement of 29.5% and a WER reduction of 31.8%, as compared to that without adaptation.
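A simplified sketch of SVD-based feature adaptation; the paper uses an incremental SVD, whereas here a sliding-window batch SVD stands in for it, and all dimensions are illustrative:

```python
# Simplified stand-in for incremental-SVD feature adaptation: project acoustic
# feature frames onto their top singular directions, which tracks the locally
# dominant (accent/noise-coloured) variation. Window and feature sizes are made up.
import numpy as np

def adapt_features(frames, rank=5):
    X = frames - frames.mean(axis=0)            # (window, dim), mean-removed
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:rank]                           # (rank, dim) top singular directions
    return X @ basis.T                          # adapted (window, rank) features

window = np.random.randn(200, 39)               # e.g. 200 frames of 39-dim MFCCs
adapted = adapt_features(window)
print(adapted.shape)                            # (200, 5)
```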
780 Analysis of Hard Turning Process of AISI D3-Thermal Aspects
Authors: B. Varaprasad, C. Srinivasa Rao
Abstract:
In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Besides its many advantages, hard turning has to be implemented so as to achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) simulation of hard turning using the commercial software DEFORM 3D is compared with experimental results for stresses, temperatures and tool forces in the machining of AISI D3 steel with mixed ceramic inserts (CC6050). Orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed and depth of cut. An exhaustive friction modeling at the tool-work interfaces is carried out, and the work material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, the feed rate and cutting speed are constant (i.e., 0.075 mm/rev and 155 m/min), and the analysis focuses on stresses, forces and temperatures during machining. Close agreement is observed between the CAE simulation and the experimental values.
Keywords: Hard turning, computer-aided engineering, computational machining, finite element method.
779 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values
Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi
Abstract:
A major challenge in medical studies, especially longitudinal ones, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on delineating Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective and early treatment methods. To address these challenges, this paper explores the potential of the eXtreme Gradient Boosting (XGBoost) algorithm for handling missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study and the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI and LMCI. Using a 10-fold cross-validation technique, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62% and a recall of 80.51%, supporting the more natural and promising multiclass classification.
Keywords: eXtreme Gradient Boosting, missing data, Alzheimer disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest.
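A hedged sketch of the classification set-up described above, using XGBoost's native handling of missing values and 10-fold cross-validation; the feature matrix is synthetic stand-in data, not the study's multimodal features:

```python
# Four-class classification (CN / EMCI / LMCI / AD) with XGBoost, which accepts
# NaN entries directly. Data below is random and only mirrors the study's shape.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1631, 20))                  # stand-in features for 1631 subjects
X[rng.random(X.shape) < 0.28] = np.nan           # ~28% missing values, as in the study
y = rng.integers(0, 4, size=1631)                # 0=CN, 1=EMCI, 2=LMCI, 3=AD

clf = XGBClassifier(objective="multi:softprob", n_estimators=200, max_depth=4)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(scores.mean())
```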
778 Design and Fabrication of a Programmable Stiffness-Sensitive Gripper for Object Handling
Authors: Mehdi Modabberifar, Sanaz Jabary, Mojtaba Ghodsi
Abstract:
Stiffness sensing is an important issue in medical diagnostics, robotic surgery, safe handling, and safe grasping of objects in production lines. In surgery, it is necessary to detect and characterize lumps embedded in soft tissue and to remove and handle the detected lumps safely. In industry, grasping and handling an object without damage, in places where a human operator has no access, is very important. In this paper, a method for object handling is presented. It is based on an intelligent gripper that detects the object stiffness and then sets a programmable force for grasping and moving the object. The main components of the system are sensors (for measuring force and displacement), electronics (electrical and electronic circuits, tactile data processing and the force control system), mechanics (the gripper mechanism and its driving system) and the display unit. The system uses a rotary potentiometer for measuring gripper displacement. A microcontroller, using the feedback received from the load cell mounted on the finger of the gripper, calculates the stiffness and then commands the gripper motor to apply a certain force on the object. Experiments on samples with different stiffness show that the gripper works successfully. The gripper can be used in haptic interfaces or robotic systems for object handling.
Keywords: Gripper, haptic, stiffness, robotic.
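A minimal sketch of the stiffness-then-grip idea; the stiffness thresholds and force values are hypothetical, not the authors' calibration:

```python
# Estimate stiffness from load-cell force and potentiometer displacement during a
# probing contact, then pick a programmed grasp force from it (illustrative values).
def estimate_stiffness(force_n, displacement_mm):
    return force_n / displacement_mm if displacement_mm > 0 else float("inf")

def grasp_force(stiffness_n_per_mm):
    if stiffness_n_per_mm < 0.5:     # soft object: gentle grip
        return 2.0
    elif stiffness_n_per_mm < 5.0:   # medium stiffness
        return 8.0
    return 20.0                      # rigid object

k = estimate_stiffness(force_n=1.2, displacement_mm=3.0)
print(k, grasp_force(k))
```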
777 A Modularized Design for Multi-Drivers Off-Road Vehicle Driving-Line and its Performance Assessment
Authors: Yi Jianjun, Sun Yingce, Hu Diqing, Li Chenggang
Abstract:
A modularized design approach can facilitate the modeling of complex systems and support behavior analysis and simulation in an iterative and thus complex engineering process, by using encapsulated submodels of components and of their interfaces. It can therefore improve design efficiency and simplify the solution of complicated problems. Multi-driver off-road vehicles are comparatively complicated. The driving-line is an important core part of a vehicle and contributes significantly to its performance. Multi-driver off-road vehicles have a complex driving-line, so their performance depends heavily on it. A typical off-road vehicle's driving-line system consists of the torque converter, transmission, transfer case and driving axles, which transfer the power generated by the engine and distribute it effectively to the driving wheels according to the road condition. Based on this main function, this paper puts forward a modularized approach for the design and evaluation of a vehicle's driving-line, which can be used to effectively estimate the driving-line performance during the concept design stage. Through appropriate analysis and assessment methods, an optimal design can be reached. This method has been applied to a practical vehicle design; it improves design efficiency and makes it convenient to assess and validate the performance of a vehicle, especially of a multi-driver off-road vehicle.
Keywords: Heavy-loaded off-road vehicle, power driving-line, modularized design, performance assessment.
776 Learning Human-Like Color Categorization through Interaction
Authors: Rinaldo Christian Tanumara, Ming Xie, Chi Kit Au
Abstract:
Humans perceive color in categories, which may be identified using color names such as red, blue, etc. The categorization is unique for each human being; however, despite the individual differences, it is shared among members of a society. This allows communication among them, especially when using color names. A sociable robot, to coexist with humans and become part of human society, must also have the shared color categorization, which can be achieved through learning. Much work has been done to enable a computer, as the brain of a robot, to learn color categorization; most of it relies on modeling of human color perception and mathematical complexities. Differently, in this work the computer learns color categorization through interaction with humans. This work aims at developing the innate ability of the computer to learn human-like color categorization. It focuses on the representation of color categorization and how it is built and developed without much mathematical complexity.
Keywords: Color categorization, color learning, machine learning, color naming.
775 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital
Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri
Abstract:
The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology is applied by “The Initiative for Product Lifecycle Innovation (IPLI)” at Indiana University to study and solve the crowding problem with the aim of increasing patient throughput and enhancing the treatment experience; the nature of the crowding problem therefore needs to be investigated along with all the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that demonstrate: the patient workflow, documentation and communication flow, data systems, human resources workflow and requirements, leadership involvement, and integration between the different ED systems. The ultimate goal is to manage the process through implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.
Keywords: Systems modeling, ED operation, workflow modeling, systems analysis.
774 Effect of High Injection Pressure on Mixture Formation, Burning Process and Combustion Characteristics in Diesel Combustion
Authors: Amir Khalid, B. Manshoor
Abstract:
Mixture formation prior to the ignition process is a key element in diesel combustion. Parametric studies of mixture formation and the ignition process under various injection parameters have received considerable attention for their potential to reduce emissions. The purpose of this study is to clarify the effects of injection pressure on mixture formation and ignition, especially during the ignition delay period, which significantly influences the subsequent combustion process and exhaust emissions. The effects of injection pressure on diesel combustion were investigated fundamentally using a rapid compression machine. The detailed behavior of mixture formation during the ignition delay period was investigated using a schlieren photography system with a high-speed camera, which can capture spray evaporation, spray interference, mixture formation and flame development clearly with real images. The ignition process and flame development were investigated by direct photography using a light-sensitive high-speed color digital video camera. The injection pressure and air motion are important variables that strongly affect fuel evaporation and the endothermic and pyrolysis processes during the ignition delay. An increased injection pressure makes the spray tip penetration longer and promotes a greater amount of fuel-air mixing during the ignition delay. A greater quantity of fuel prepared during the ignition delay period thus predominantly promotes a more rapid heat release.
Keywords: Mixture formation, diesel combustion, ignition process, spray, rapid compression machine.
773 Changes in EEG and HRV during Event-Related Attention
Authors: Sun K. Yoo, Chung K. Lee
Abstract:
Determination of attentional status is important because work performance and unexpected accidents are highly related to attention. The autonomic and central nervous systems can reflect changes in a person’s attentional status. Selecting a reduced number of suitable physiological parameters from among the autonomic- and central-nervous-system-related signal parameters is critical for the optimum design of attention-monitoring devices. In this paper, we analyze EEG (electroencephalography) and HRV (heart rate variability) signals to demonstrate the relation between brain and cardiovascular signals during event-related attention, which will later be used in selecting the minimum set of attentional parameters. Time- and frequency-domain parameters from the HRV signal and frequency-domain parameters from the EEG signal are used as input to the optimum feature parameter selector.
Keywords: EEG, HRV, attentional status.
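A short sketch of typical HRV time- and frequency-domain parameters of the kind used as inputs here; the RR series is synthetic and the exact parameter set in the study may differ:

```python
# SDNN and RMSSD in the time domain, LF/HF ratio in the frequency domain,
# computed from a synthetic RR-interval series.
import numpy as np
from scipy.signal import welch

rr = 0.8 + 0.05 * np.random.randn(300)            # RR intervals in seconds (synthetic)
sdnn = np.std(rr, ddof=1) * 1000.0                # ms
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000.0

# Resample the RR series onto an even 4 Hz grid before Welch's PSD estimate.
t = np.cumsum(rr)
fs = 4.0
t_even = np.arange(t[0], t[-1], 1.0 / fs)
rr_even = np.interp(t_even, t, rr)
f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

lf_band = (f >= 0.04) & (f < 0.15)
hf_band = (f >= 0.15) & (f < 0.40)
lf = np.trapz(pxx[lf_band], f[lf_band])
hf = np.trapz(pxx[hf_band], f[hf_band])
print(sdnn, rmssd, lf / hf)
```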
772 A New Version of Annotation Method with a XML-based Knowledge Base
Authors: Mohammad Yasrebi, Somayeh Khosravi
Abstract:
Machine-understandable data, when strongly interlinked, constitutes the basis for the Semantic Web. Annotating web documents is one of the major techniques for creating metadata on the Web; annotating websites defines the contained data in a form which is suitable for interpretation by machines. In this paper, we present an approach, improved over the previous one [1], for annotating the texts of websites based on a knowledge base.
Keywords: Knowledge base, ontology, semantic annotation, XML.
771 Improvement of Overall Equipment Effectiveness through Total Productive Maintenance
Abstract:
Frequent machine breakdowns, low plant availability and increased overtime are a great threat to a manufacturing plant, as they increase operating costs. The main aim of this study was to improve Overall Equipment Effectiveness (OEE) at a manufacturing company through the implementation of innovative maintenance strategies. A case study approach was used. The paper focuses on improving maintenance in a manufacturing set-up using an innovative maintenance regime mix to improve overall equipment effectiveness. Interviews, reviews of documentation and historical records, and direct and participatory observation were used as data collection methods during the research. Production is usually measured by the total kilowatts of motors produced per day; the target at 91% availability is 75 kilowatts a day. Reduced demand and a lack of raw materials, particularly imported items, are adversely affecting the manufacturing operations. The company had to reset its target from the usual figure of 250 kilowatts per day to a mere 75 per day due to lower machine availability caused by breakdowns as well as the lack of raw materials. Price reductions and uncertainties, together with general machine breakdowns, further lowered production. Some recommendations were given. For instance, employee empowerment in the company will enhance the responsibility and authority needed to improve and totally eliminate the six big losses. If the maintenance department is to realise its proper function in a progressive, innovative industrial society, then its personnel must be continuously trained to meet current needs as well as future requirements. To make the maintenance planning system effective, it is essential to keep track of all corrective maintenance jobs and preventive maintenance inspections; for large processing plants these cannot be handled manually. It was therefore recommended that the company implement a Computerised Maintenance Management System (CMMS).
Keywords: Maintenance, Manufacturing, Overall Equipment Effectiveness
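For reference, a sketch of the standard OEE calculation (availability × performance × quality), using illustrative shift figures rather than the company's data:

```python
# OEE = availability * performance * quality, all expressed as fractions of 1.
def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
    run_time = planned_time - downtime
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

print(oee(planned_time=480, downtime=60,        # minutes in a shift (illustrative)
          ideal_cycle_time=1.5,                 # minutes per motor (illustrative)
          total_count=250, good_count=240))
```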
770 Independent Component Analysis to Mass Spectra of Aluminium Sulphate
Authors: M. Heikkinen, A. Sarpola, H. Hellman, J. Rämö, Y. Hiltunen
Abstract:
Independent component analysis (ICA) is a computational method for finding underlying signals or components in multivariate statistical data. The ICA method has been successfully applied in many fields, e.g. in vision research, brain imaging, geological signals and telecommunications. In this paper, we apply the ICA method to the analysis of mass spectra of oligomeric species emerging from aluminium sulphate. Mass spectra are typically complex, because they are linear combinations of spectra from different types of oligomeric species. The results show that ICA can decompose the spectra into components that carry useful information. This information is essential in developing the coagulation phases of water treatment processes.
Keywords: Independent component analysis, mass spectroscopy, water treatment, aluminium sulphate.
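A minimal sketch of applying ICA to a set of spectra with scikit-learn's FastICA; the data matrix is a random stand-in and the preprocessing used in the paper is not shown:

```python
# Each observed spectrum is treated as a linear mixture of source spectra of
# different oligomeric species; FastICA estimates the sources and the mixing.
import numpy as np
from sklearn.decomposition import FastICA

spectra = np.abs(np.random.randn(12, 2000))     # 12 spectra x 2000 m/z bins (stand-in)
ica = FastICA(n_components=4, random_state=0, max_iter=1000)
sources = ica.fit_transform(spectra.T)          # (2000, 4) estimated component spectra
mixing = ica.mixing_                            # (12, 4) contribution of each component
print(sources.shape, mixing.shape)
```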
769 Integrating Hedgerow into Town Planning: A Framework for Sustainable Residential Development
Authors: Siqing Chen
Abstract:
The vast rural landscape in the southern United States is conspicuously characterized by hedgerow trees or groves. The patchwork landscape of fields surrounded by high hedgerows is a traditional and familiar feature of the American countryside. Hedgerows are in effect linear strips of trees, groves, or woodlands, which are often critical habitats for wildlife and important for the visual quality of the landscape. As landscape interfaces, hedgerows define the spaces in the landscape, give the landscape life and meaning, and enrich the ecologies and cultural heritage of the American countryside. Although hedgerows were originally intended as fences and to mark property and townland boundaries, they are not merely natural or man-made additions to the landscape; they have gradually become "naturalized" into the landscape, deeply rooted in rural culture, and now form an important component of the southern American rural environment. However, due to the ever-expanding real estate industry and high demand for new residential development, substantial areas of authentic hedgerow landscape in the southern United States are being urbanized. Using Hudson Farm as an example, this study illustrates guidelines for how hedgerows can be integrated into town planning as green infrastructure and landscape interface to innovate and direct sustainable land use, and suggests ways in which such vernacular landscapes can be preserved and integrated into new development without losing their contextual inspiration.
Keywords: Hedgerow, town planning, sustainable design, ecological infrastructure.
768 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks than today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. It consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the two sub-components of the VAC.
Keywords: Automation, human factors, air traffic controller, MINIMA, OOTL, Out-Of-The-Loop, EEG, electroencephalography, HMI, human machine interface.
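A deliberately simple sketch of the triggering logic implied above: an EEG-derived vigilance index below a threshold switches on an adaptive automation function in the task environment. The index values, threshold and function names are assumptions, not MINIMA's actual design:

```python
# Hypothetical mapping from a vigilance index to an adaptive-automation mode.
def select_automation_level(vigilance_index, threshold=0.4):
    if vigilance_index < threshold:
        return "adaptive_support_on"    # e.g. highlight relevant traffic, add alerts
    return "normal_monitoring"

for v in (0.8, 0.55, 0.3):
    print(v, select_automation_level(v))
```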
767 Management Software for the Elaboration of an Electronic File in the Pharmaceutical Industry Following Mexican Regulations
Authors: M. Peña Aguilar Juan, Ríos Hernández Ezequiel, R. Valencia Luis
Abstract:
For the certification of certain goods of public interest, such as medicines and food, the preparation and delivery of a dossier is required. Its elaboration requires legal and administrative knowledge, organization of the documents of the process, and an order that allows verification of the file. Therefore, a virtual platform was developed to support the process of managing and elaborating the dossier, providing access to the information and interfaces that allow the user to know the status of projects. Developing the dossier system in the cloud allows the inclusion of the technical requirements for software management, including validation and manufacturing in the field industry. The platform guides and facilitates the dossier elaboration (report, file or history), considering Mexican legislation and regulations, and it also has auxiliary tools for its management. This technological alternative provides organizational support for documents and access to the information required for the successful development of a dossier. The platform is divided into the following modules: system control, catalog, dossier and enterprise management. The modules are designed according to the structure required in a dossier in those areas. However, the structure allows for flexibility, as its goal is to become a tool that facilitates and does not obstruct processes. The architecture and development of the software allow flexibility for future expansion to other fields, which would imply feeding the system with new regulations.
Keywords: Electronic dossier, technologies for management, web software, dossier elaboration, pharmaceutical industry.
766 New Graph Similarity Measurements based on Isomorphic and Nonisomorphic Data Fusion and their Use in the Prediction of the Pharmacological Behavior of Drugs
Authors: Irene Luque Ruiz, Manuel Urbano Cuadrado, Miguel Ángel Gómez-Nieto
Abstract:
New graph similarity methods are proposed in this work with the aim of refining the chemical information extracted from molecule matching. For this purpose, data fusion of the isomorphic and nonisomorphic subgraphs into a new similarity measure, the Approximate Similarity, was carried out by several approaches. Application of the proposed method to the development of quantitative structure-activity relationships (QSAR) has provided reliable tools for predicting several pharmacological parameters: the binding of steroids to the globulin-corticosteroid receptor, the activity of benzodiazepine receptor compounds, and blood-brain barrier permeability. Acceptable results were obtained for the models presented here.
Keywords: Graph similarity, Nonisomorphic dissimilarity, Approximate similarity, Drug activity prediction.
765 Modeling Spatial Distributions of Point and Nonpoint Source Pollution Loadings in the Great Lakes Watersheds
Authors: Chansheng He, Carlo DeMarchi
Abstract:
A physically based, spatially-distributed water quality model is being developed to simulate spatial and temporal distributions of material transport in the Great Lakes Watersheds of the U.S. Multiple databases of meteorology, land use, topography, hydrography, soils, agricultural statistics, and water quality were used to estimate nonpoint source loading potential in the study watersheds. Animal manure production was computed from tabulations of animals by zip code area for the census years of 1987, 1992, 1997, and 2002. Relative chemical loadings for agricultural land use were calculated from fertilizer and pesticide estimates by crop for the same periods. Comparison of these estimates to the monitored total phosphorus load indicates that both point and nonpoint sources are major contributors to the total nutrient loads in the study watersheds, with nonpoint sources being the largest contributor, particularly in the rural watersheds. These estimates are used as input to the distributed water quality model for simulating pollutant transport through surface and subsurface processes to Great Lakes waters. Visualization and GIS interfaces have been developed to display the spatial and temporal distribution of the pollutant transport in support of water management programs.
Keywords: Distributed Large Basin Runoff Model, Great Lakes Watersheds, nonpoint source pollution, point sources.
764 Experimental Investigation on Shear Behaviour of Fibre Reinforced Concrete Beams Using Steel Fibres
Authors: G. Beulah Gnana Ananthi, A. Jaffer Sathick, M. Abirami
Abstract:
Fibre reinforced concrete (FRC) has been widely used in industrial pavements and in non-structural elements such as pipes, culverts, tunnels, and precast elements. The strengthening effect of fibres in the concrete matrix is achieved primarily through the bridging effect of fibres at the crack interfaces. The workability of the concrete was reduced by the addition of high percentages of steel fibres, and the optimum percentage of steel fibres varies with their aspect ratio. For this study, a 1% addition of steel fibres was found to be the optimum for both hooked and crimped steel fibres and was added to the beam specimens. The fibres efficiently restrain the cracks and carry residual stresses beyond cracking; in this sense, diagonal cracks are effectively stitched up by the fibres crossing them. The failure of beams within the shear failure range changed from shear to flexure in the presence of a sufficient quantity of steel fibres. The shear strength increased with the addition of steel fibres and exceeded the enhancement obtained with transverse reinforcement; however, the increase is not directly proportional to the quantity of fibres used. Considering all the observations made in the present experimental investigation, it is concluded that 1% of crimped steel fibres with an aspect ratio of 50 is the best type of steel fibre for replacing transverse stirrups in high strength concrete beams, compared to steel fibres with hooked ends.
Keywords: Fibre reinforced concrete, steel fibre, shear strength, crack pattern.
763 A Coupled Extended-Finite-Discrete Element Method: On the Different Contact Schemes between Continua and Discontinua
Authors: Shervin Khazaeli, Shahab Haj-zamani
Abstract:
Recently, advanced geotechnical engineering problems related to soil movement, particle loss, and the modeling of local failure (i.e. discontinua), as well as the modeling of in-contact structures (i.e. continua), have been of great interest among researchers. The aim of this research is to model these two different domains simultaneously. To this end, a coupled numerical method is introduced based on the Discrete Element Method (DEM) and the eXtended Finite Element Method (X-FEM). In the coupled procedure, DEM is employed to capture the interactions and relative movements of soil particles as discontinua, while X-FEM is utilized to model in-contact structures as continua, which may contain different types of discontinuities. For verification purposes, the new coupled approach is applied to benchmark problems including different contacts between and within continua and discontinua. Results are validated by comparison with existing analytical and numerical solutions. This study shows that the extended-finite-discrete element method can robustly analyze not only contact problems, but also other types of discontinuities in continua such as (i) crack formation and propagation, (ii) voids and bimaterial interfaces, and (iii) combinations of the previous cases. In essence, the proposed method can be widely used in advanced soil-structure interaction problems to investigate the micro- and macro-behaviour of the surrounding soil and the response of the embedded structure containing discontinuities.
Keywords: Contact problems, discrete element method, extended-finite element method, soil-structure interaction.
762 Hybrid Authentication Scheme for Graphical Password Using QR Code and Integrated Sound Signature
Authors: Salim Istyaq, Mohammad Sarosh Umar
Abstract:
Today, mankind is in a stage of continuous development; every day brings a new technological proposal, and in order to secure these technologies we must also prepare high-yielding security modules to conserve these resources. The capacity of the human brain to recognize anything far exceeds that of any other species, owing to our developing cycle of curiosity. In this paper, we propose a graphical password scheme using QR codes which provides more security for current online systems; it also contains a supportive sound signature. In this system, authentication is done using a sequence of images in QR code form. Users select one click-point per image with the help of a QR scanner or recognizer. The phrase encoded in a QR code minimizes the probability of attacks via shoulder surfing or other techniques.
Keywords: Graphical password, QR code, sound signature, image authentication, cued click point.
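A hedged sketch of how a cued click-point sequence plus a sound-signature choice could be turned into a stored verifier; the grid tolerance, salt handling and field names are assumptions for illustration, not the authors' scheme:

```python
# The verifier is a salted hash over the tolerance-quantized click points and
# the chosen sound-signature id. Tolerance-based grid matching is a simplification.
import hashlib

def verifier(click_points, sound_id, salt, tolerance=15):
    cells = [(x // tolerance, y // tolerance) for x, y in click_points]
    material = repr(cells) + "|" + sound_id + "|" + salt
    return hashlib.sha256(material.encode()).hexdigest()

enrolled = verifier([(120, 88), (343, 210), (57, 301)], "chime_03", salt="a9f1")
attempt  = verifier([(121, 89), (341, 212), (59, 303)], "chime_03", salt="a9f1")
print(enrolled == attempt)   # True here: every attempted click lands in the same grid cell
```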
761 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics
Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur
Abstract:
Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously obtain images of the controller HMIs. We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, at recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
Keywords: Human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics.
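A small sketch of the OCR step on segmented HMI regions using Tesseract via pytesseract; the region coordinates, field names and file name are placeholders, not values from the paper:

```python
# Crop the regions of the HMI frame that contain streaming values and recognize
# them with Tesseract, restricted to digits for robustness.
from PIL import Image
import pytesseract

REGIONS = {                       # (left, top, right, bottom) pixel boxes (placeholders)
    "spindle_speed": (120, 40, 260, 80),
    "feed_rate":     (120, 90, 260, 130),
}

frame = Image.open("hmi_frame.png").convert("L")   # grayscale helps OCR
readings = {}
for name, box in REGIONS.items():
    roi = frame.crop(box)
    text = pytesseract.image_to_string(
        roi, config="--psm 7 -c tessedit_char_whitelist=0123456789.")
    readings[name] = text.strip()
print(readings)
```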
760 Improving Quality of Business Networks for Information Systems
Authors: Hazem M. El-Bakry, Ahmed Atwan
Abstract:
Computer networks are an essential part of computer-based information systems, and their performance has a great influence on the whole information system. Measuring usability criteria and customer satisfaction on a small computer network is very important. In this article, an effective approach for measuring the usability of a business network in an information system is introduced. The usability process for networking provides a flexible and cost-effective way to assess the usability of a network and its products. In addition, the proposed approach can be used to certify network product usability late in the development cycle. Furthermore, it can be used to help in developing usable interfaces very early in the cycle and gives a way to measure, track, and improve usability. Moreover, a new approach for fast information processing over computer networks is presented: the entire data are collected together in a long vector and then tested as one input pattern. The proposed fast time delay neural networks (FTDNNs) use cross correlation in the frequency domain between the tested data and the input weights of the neural networks. It is proved mathematically and practically that the number of computation steps required by the presented time delay neural networks is less than that needed by conventional time delay neural networks (CTDNNs). Simulation results using MATLAB confirm the theoretical computations.
Keywords: Usability criteria, computer networks, fast information processing, cross correlation, frequency domain.
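A sketch of the frequency-domain cross-correlation idea behind such fast time delay networks: correlating the whole input vector with a weight vector via the FFT instead of sliding dot products. Sizes are illustrative:

```python
# FFT-based cross correlation, reordered to match np.correlate(signal, weights, "full").
import numpy as np

def cross_correlate_fft(signal, weights):
    n = len(signal) + len(weights) - 1
    S = np.fft.rfft(signal, n)
    W = np.fft.rfft(weights, n)
    r = np.fft.irfft(S * np.conj(W), n)
    # circular lags -> linear lag ordering (assumes len(weights) > 1)
    return np.concatenate([r[-(len(weights) - 1):], r[:len(signal)]])

x = np.random.randn(4096)          # whole collected data as one long vector
w = np.random.randn(64)            # input weights of one hidden neuron
r = cross_correlate_fft(x, w)
# Check against the direct time-domain computation used by conventional TDNNs.
print(np.allclose(r, np.correlate(x, w, mode="full")))
```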
759 Curvelet Transform Based Two Class Motor Imagery Classification
Authors: Nebi Gedik
Abstract:
One of the important parts of brain-computer interface (BCI) studies is the classification of motor imagery (MI) obtained by electroencephalography (EEG). The major goal is to provide non-muscular communication and control via assistive technologies to people with severe motor disorders, so that they can communicate with the outside world. In this study, an EEG signal classification approach based on a multiscale and multi-resolution transform method is presented. The proposed approach is used to decompose the EEG signal containing motor imagery information (right- and left-hand movement imagery). The decomposition is performed using the curvelet transform, a multiscale and multiresolution analysis method, and the transform output is evaluated as feature data. The obtained feature set is subjected to a feature selection process, using t-test methods, to obtain the most effective features. SVM and k-NN algorithms are used for classification.
Keywords: motor imagery, EEG, curvelet transform, SVM, k-NN
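A sketch of the selection-and-classification stage (t-test ranking followed by SVM and k-NN); the feature matrix is synthetic and the curvelet decomposition itself is not shown:

```python
# Rank features by a two-sample t-test between the two MI classes, keep the best
# ones, and fit SVM and k-NN classifiers. All data here is a random stand-in.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

X = np.random.randn(200, 120)          # 200 trials x 120 curvelet-based features
y = np.random.randint(0, 2, 200)       # 0 = left hand, 1 = right hand imagery

_, p = ttest_ind(X[y == 0], X[y == 1], axis=0)
keep = np.argsort(p)[:20]              # 20 most discriminative features
Xtr, Xte, ytr, yte = train_test_split(X[:, keep], y, test_size=0.3, random_state=0)

for clf in (SVC(kernel="rbf"), KNeighborsClassifier(n_neighbors=5)):
    print(type(clf).__name__, clf.fit(Xtr, ytr).score(Xte, yte))
```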
758 Industrial Compressor Anti-Surge Computer Control
Authors: Ventzas Dimitrios, Petropoulos George
Abstract:
The paper presents a compressor anti-surge control system that maximizes compressor throughput while reducing the pressure standard deviation, increasing the safety margin between the design point and the surge limit line, and avoiding possible machine surge. Alternative control strategies are presented.
Keywords: Anti-surge, control, compressor, PID control, safety, fault tolerance, start-up, ESD.
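An illustrative PID loop for the anti-surge idea, where the controlled variable is the margin to the surge limit line and the output drives a recycle valve; the gains and numbers are made up, not the plant's settings:

```python
# Minimal PID step: the recycle valve command grows as the measured margin to the
# surge line falls below the desired setpoint margin.
def pid_step(error, state, kp=2.0, ki=0.1, kd=0.5, dt=0.1):
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

setpoint_margin = 0.10                  # desired 10% margin to the surge line
state = (0.0, 0.0)
for measured_margin in (0.12, 0.09, 0.06, 0.05):
    error = setpoint_margin - measured_margin   # positive when too close to surge
    valve_cmd, state = pid_step(error, state)
    print(f"margin={measured_margin:.2f} -> recycle valve command {max(0.0, valve_cmd):.2f}")
```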
757 Two Lessons Learnt in Defining Intersections and Interfaces in Numerical Modeling with Plaxis
Authors: Mahdi Sadeghian, Somaye Sadeghian, Reza Dinarvand
Abstract:
This paper discusses two issues encountered in using PLAXIS, both observed while applying PLAXIS to estimate excavation-induced displacement. Column Soil Mixing (CSM) was applied to stabilise the excavation. It was found that the estimated excavation-induced deformation at the top of the CSM blocks depends strongly on the material type chosen for the pavement adjacent to the CSM blocks: a cohesive pavement material results in an unrealistic connection between the pavement and the CSM, even when an interface element is defined. To find the most realistic approach, the interface was defined in three different manners: (1) no interface elements were applied, (2) a non-cohesive soil layer was defined between the pavement and the CSM block to represent the friction between these materials, and (3) the built-in interface elements in PLAXIS were used to define the boundary between the pavement and the CSM block. The results showed that option 2 gives the more realistic results. The second issue was the modelling of the contact line between the CSM block and an inclined layer underneath. The analysis showed that the excavation-induced deformation depends strongly on how the PLAXIS user defines the contact area: if the contact area is defined as a point at which the CSM block intersects the underlying layer, the estimated lateral displacement of the CSM block is unrealistically lower than in the model in which the contact area is defined as a line.
Keywords: PLAXIS, FEM, CSM, excavation-induced deformation.