Search results for: precision of gearing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 891

711 Towards Automatic Calibration of In-Line Machine Processes

Authors: David F. Nettleton, Elodie Bugnicourt, Christian Wasiak, Alejandro Rosales

Abstract:

In this presentation, preliminary results are given for the modeling and calibration of two different industrial winding MIMO (Multiple Input Multiple Output) processes using machine learning techniques. In contrast to previous approaches, which have typically used ‘black-box’ linear statistical methods together with a definition of the mechanical behavior of the process, we use non-linear machine learning algorithms together with a ‘white-box’ rule induction technique to create a supervised model of the fitting error between the expected and real force measures. The final objective is to build a precise model of the winding process in order to control the tension of the material being wound in the first case, and the friction of the material passing through the die in the second case. Case 1, Tension Control of a Winding Process: a plastic web is unwound from a first reel, goes over a traction reel, and is rewound on a third reel. The objectives are (i) to train a model to predict the web tension and (ii) to calibrate the input values that result in a given tension. Case 2, Friction Force Control of a Micro-Pullwinding Process: a core plus resin passes through a first die, two winding units then wind an outer layer around the core, and the material makes a final pass through a second die. The objectives are (i) to train a model to predict the friction on die 2 and (ii) to calibrate the input values that result in a given friction on die 2. Different machine learning approaches are tested to build the models: Kernel Ridge Regression, Support Vector Regression (with a Radial Basis Function kernel), and MPART (rule induction with a continuous output). As a preliminary step, the MPART rule induction algorithm was used to build an explicative model of the error (the difference between the expected and real friction on die 2). The modeling of the error behavior using explicative rules helps improve the overall process model. Once the models are built, the inputs are calibrated by generating Gaussian random numbers for each input (taking into account its mean and standard deviation) and comparing the output to a target (desired) output until the closest fit is found. The results of empirical testing show that a high precision is obtained for the trained models and for the calibration process. The learning step is the slowest part of the process (max. 5 minutes for these data), but it can be done offline just once. The calibration step is much faster and, in under one minute, achieved a precision error of less than 1×10⁻³ for both outputs. To summarize, in the present work two processes have been modeled and calibrated. A fast processing time and high precision have been achieved, which can be further improved by using heuristics to guide the Gaussian calibration. The error behavior has been modeled to help improve the overall process understanding. This is relevant for the quick, optimal set-up of many different industrial processes that use a pull-winding type process to manufacture fibre-reinforced plastic parts. Acknowledgements to the Openmind project, which is funded by Horizon 2020 European Union funding for Research & Innovation, Grant Agreement number 680820.
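
The Gaussian random-search calibration described above can be illustrated with a minimal sketch; the process model, input statistics, and target value below are hypothetical placeholders, not the authors' actual winding data.

```python
import random

def calibrate(model, input_stats, target, n_trials=10000, seed=0):
    """Random-search calibration: sample candidate inputs from Gaussians
    (one per input, using its mean and standard deviation) and keep the
    candidate whose predicted output is closest to the target."""
    rng = random.Random(seed)
    best_inputs, best_err = None, float("inf")
    for _ in range(n_trials):
        candidate = {name: rng.gauss(mu, sigma)
                     for name, (mu, sigma) in input_stats.items()}
        err = abs(model(candidate) - target)
        if err < best_err:
            best_inputs, best_err = candidate, err
    return best_inputs, best_err

# Hypothetical stand-in model of web tension as a function of two inputs.
toy_model = lambda x: 0.8 * x["unwind_speed"] + 1.5 * x["brake_torque"]
stats = {"unwind_speed": (10.0, 1.0), "brake_torque": (4.0, 0.5)}
inputs, err = calibrate(toy_model, stats, target=14.0)
print(inputs, err)
```

In practice the trained Kernel Ridge, SVR, or MPART model would take the place of the toy linear model, and heuristics could narrow the sampling around promising candidates, as suggested in the abstract.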

Keywords: data model, machine learning, industrial winding, calibration

Procedia PDF Downloads 213
710 Identification of Dynamic Friction Model for High-Precision Motion Control

Authors: Martin Goubej, Tomas Popule, Alois Krejci

Abstract:

This paper deals with the experimental identification of mechanical systems with nonlinear friction characteristics. The dynamic LuGre friction model is adopted, and a systematic approach to parameter identification of both the linear and nonlinear subsystems is given. The identification procedure consists of three subsequent experiments which deal with the individual parts of the plant dynamics. The proposed method is experimentally verified on an industrial-grade robotic manipulator. Model fidelity is compared with the results achieved with a static friction model.
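
For reference, a standard formulation of the dynamic LuGre friction model is given below; the exact parameterization and identification procedure used by the authors may differ.

```latex
\dot{z} = v - \frac{|v|}{g(v)}\,z, \qquad
g(v) = \frac{1}{\sigma_0}\left(F_c + (F_s - F_c)\,e^{-(v/v_s)^2}\right), \qquad
F = \sigma_0 z + \sigma_1 \dot{z} + \sigma_2 v
```

Here z is the internal (bristle) state, v the relative velocity, F_c and F_s the Coulomb and static friction levels, v_s the Stribeck velocity, σ₀ and σ₁ the bristle stiffness and damping, and σ₂ the viscous friction coefficient; the static friction model mentioned above corresponds to dropping the z dynamics.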

Keywords: mechanical friction, LuGre model, friction identification, motion control

Procedia PDF Downloads 386
709 On Optimum Stratification

Authors: M. G. M. Khan, V. D. Prasad, D. K. Rao

Abstract:

In this manuscript, we discuss the problem of determining the optimum stratification of a study (or main) variable based on an auxiliary variable that follows a uniform distribution. If the stratification of the survey variable is made using the auxiliary variable, it may lead to substantial gains in the precision of the estimates. This problem is formulated as a Nonlinear Programming Problem (NLPP), which turns out to be a multistage decision problem and is solved using a dynamic programming technique.

Keywords: auxiliary variable, dynamic programming technique, nonlinear programming problem, optimum stratification, uniform distribution

Procedia PDF Downloads 301
708 Pesticide Residue Determination on Cumin Plant (Nigella orientalis L.) with LC-MS/MS and GC-MS

Authors: Nilda Ersoy, Sevinç Şener, Ayşe Yalçın Elidemir, Ebru Evcil, Ergün Döğen

Abstract:

In this study, pesticide residues were investigated in black cumin (Nigella orientalis L.) seeds grown in Turkey. GC-MS and LC-MS/MS analytical instruments offer high precision when determining residue limits. A total of 100 pesticide active ingredients were analyzed by LC-MS/MS in Nigella orientalis L. seed samples. For the same purpose, 103 pesticide active ingredients were analyzed by GC-MS. The study was conducted in 2012 and 2013. Pesticide residues were not found at detectable levels in the samples in either year.

Keywords: pesticide, residue, black cumin, Nigella orientalis L.

Procedia PDF Downloads 370
707 A Gauge Repeatability and Reproducibility Study for Multivariate Measurement Systems

Authors: Jeh-Nan Pan, Chung-I Li

Abstract:

Measurement system analysis (MSA) plays an important role in helping organizations to improve their product quality. Generally speaking, the gauge repeatability and reproducibility (GRR) study is performed according to the MSA handbook stated in the QS9000 standards. Usually, a GRR study for assessing the adequacy of gauge variation needs to be conducted prior to process capability analysis. Traditional MSA only considers a single quality characteristic. With the advent of modern technology, industrial products have become very sophisticated, with more than one quality characteristic. Thus, it becomes necessary to perform multivariate GRR analysis for a measurement system when collecting data with multiple responses. In this paper, we take the correlation coefficients among tolerances into account to revise the multivariate precision-to-tolerance (P/T) ratio as proposed by Majeske (2008). We then compare the performance of our revised P/T ratio with that of the existing ratios. The simulation results show that our revised P/T ratio outperforms the others in terms of robustness and proximity to the actual value. Moreover, the optimal allocation of several parameters, such as the number of quality characteristics (v), sample size of parts (p), number of operators (o), and replicate measurements (r), is discussed using the confidence interval of the revised P/T ratio. Finally, a standard operating procedure (S.O.P.) to perform the GRR study for multivariate measurement systems is proposed based on the research results. Hopefully, it can serve as a useful reference for quality practitioners when conducting such studies in industry.
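
For context, a commonly used form of the classical univariate precision-to-tolerance ratio that the multivariate versions generalize is shown below; conventions with 5.15 instead of 6 also appear in the MSA literature.

```latex
P/T = \frac{6\,\hat{\sigma}_{\text{gauge}}}{\mathrm{USL} - \mathrm{LSL}}, \qquad
\hat{\sigma}_{\text{gauge}}^{2} = \hat{\sigma}_{\text{repeatability}}^{2} + \hat{\sigma}_{\text{reproducibility}}^{2}
```

Here USL and LSL are the upper and lower specification limits; values of roughly 0.1 or below are typically considered acceptable for a measurement system.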

Keywords: gauge repeatability and reproducibility, multivariate measurement system analysis, precision-to-tolerance ratio

Procedia PDF Downloads 225
706 Inertial Motion Capture System for Biomechanical Analysis in Rehabilitation and Sports

Authors: Mario Sandro F. Rocha, Carlos S. Ande, Anderson A. Oliveira, Felipe M. Bersotti, Lucas O. Venzel

Abstract:

Inertial motion capture systems (mocap) are among the most suitable tools for quantitative clinical analysis in rehabilitation and sports medicine. The inertial measurement units (IMUs), composed of accelerometers, gyroscopes, and magnetometers, are able to measure spatial orientations and calculate displacements with sufficient precision for applications in the biomechanical analysis of movement. Furthermore, this type of system is relatively affordable and has the advantages of portability and independence from external references. In this work, we present the latest version of our inertial motion capture system, based on the foregoing technology, with a Unity interface designed for rehabilitation and sports. In our hardware architecture, only one serial port is required. First, the board client must be connected to the computer by a USB cable. Next, an available serial port is configured and opened to establish the communication between the client and the application, and then the client starts scanning for the active MOCAP_S servers around. The servers play the role of the inertial measurement units that capture the movements of the body and send the data to the client, which in turn creates a package composed of the ID of the server, the current timestamp, and the motion capture data defined in the client pre-configuration of the capture session. In the current version, we can measure the game rotation vector (grv) and linear acceleration (lacc), and we also have a step detector that can be enabled or disabled. The grv data are processed and directly linked to the bones of the 3D model, and, along with the lacc and step-detector data, they are also used to perform the calculations of displacements and other variables shown on the graphical user interface. Our user interface was designed to calculate and present variables that are important for rehabilitation and sports, such as cadence, speed, total gait cycle, gait cycle length, obliquity and rotation, and center of gravity displacement. Our goal is to present a low-cost, portable, and wearable system with a friendly interface for application in biomechanics and sports, which also performs with high precision and low energy consumption.
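
As an illustration of the packet structure and one of the gait variables described above, a minimal sketch is given below; the field names and the cadence calculation are illustrative assumptions, not the system's actual protocol.

```python
from dataclasses import dataclass, field
from typing import List
import time

@dataclass
class MocapPacket:
    """Illustrative packet layout: server (IMU) ID, timestamp, and the
    capture data selected in the session pre-configuration."""
    server_id: int
    timestamp: float
    grv: List[float] = field(default_factory=lambda: [1.0, 0.0, 0.0, 0.0])  # game rotation vector (quaternion)
    lacc: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])      # linear acceleration (m/s^2)
    step_detected: bool = False

def cadence(step_timestamps: List[float]) -> float:
    """Steps per minute derived from step-detector timestamps."""
    if len(step_timestamps) < 2:
        return 0.0
    duration_min = (step_timestamps[-1] - step_timestamps[0]) / 60.0
    return (len(step_timestamps) - 1) / duration_min

pkt = MocapPacket(server_id=3, timestamp=time.time())
print(pkt)
print(cadence([0.0, 0.55, 1.1, 1.7, 2.25]))  # about 107 steps/min
```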

Keywords: biomechanics, inertial sensors, motion capture, rehabilitation

Procedia PDF Downloads 113
705 Simultaneous Determination of Cefazolin and Cefotaxime in Urine by HPLC

Authors: Rafika Bibi, Khaled Khaladi, Hind Mokran, Mohamed Salah Boukhechem

Abstract:

A high-performance liquid chromatographic method with ultraviolet detection at 264 nm was developed and validated for the quantitative determination and separation of cefazolin and cefotaxime in urine. The mobile phase consisted of acetonitrile and phosphate buffer pH 4.2 (15:85, v/v) pumped through an ODB 250 × 4.6 mm, 5 µm column at a flow rate of 1 mL/min, with a 20 µL injection loop. Under these conditions, validation showed that the method is linear over the range 0.01 to 10 µg/mL with a good correlation coefficient (R > 0.9997); the retention times of cefotaxime and cefazolin were 9.0 and 10.1 min, respectively. The statistical evaluation of the method was examined by means of within-day (n = 6) and day-to-day (n = 5) assays and was found to be satisfactory, with high accuracy and precision.

Keywords: cefazolin, cefotaxime, HPLC, bioscience, biochemistry, pharmaceutical

Procedia PDF Downloads 333
704 High-Production Laser and Plasma Welding Technologies for High-Speed Vessels Production

Authors: V. M. Levshakov, N. A. Steshenkova, N. A. Nosyrev

Abstract:

The application of hull processing technologies based on high-concentration energy sources (laser and plasma technologies) allows shipbuilding production to be improved. This is typical for the construction of high-speed vessels using steel and aluminum alloys, where high-precision hulls are required. The report describes high-performance technologies for plasma welding (using direct current of reversed polarity), laser welding, and hybrid laser-arc welding of hull structures developed by JSC “SSTC”.

Keywords: flat sections, hybrid laser-arc welding, plasma welding, plasmatron

Procedia PDF Downloads 401
703 The Determinants of Corporate Social Responsibility Disclosure Extent and Quality: The Case of Jordan

Authors: Hani Alkayed, Belal Omar, Eileen Roddy

Abstract:

This study focuses on investigating the determinants of Corporate Social Responsibility Disclosure (CSRD) extent and quality in Jordan. The study examines factors that influence CSR disclosure extent and quality, such as corporate characteristics (size, gearing, firm’s age, and industry type), corporate governance (board size, number of meetings, non-executive directors, female directors on the board, family directors on the board, foreign members, audit committee, type of external auditors, and CEO duality), and ownership structure (government ownership, institutional ownership, and ownership concentration). Legitimacy theory is utilised as the main theory for our theoretical framework. A quantitative approach is adopted for this research, and a content analysis technique is used to gather CSR disclosure extent and quality from the annual reports. The sample is drawn from the annual reports of 118 Jordanian companies over the period 2010-2015. A CSRD index is constructed and includes the disclosures of the following categories: environmental, human resources, product and consumers, and community involvement. A 7-point scale was developed to examine the quality of disclosure, where 0 = no disclosure; 1 = general disclosures (non-monetary); 2 = general disclosures (non-monetary) with pictures, charts, and graphs; 3 = descriptive/qualitative disclosures, specific details (non-monetary); 4 = descriptive/qualitative disclosures, specific details with pictures, charts, and graphs; 5 = numeric disclosures, full descriptions with supporting numbers; 6 = numeric disclosures, full descriptions with supporting numbers, pictures, and charts. This study fills the gap in the literature regarding CSRD in Jordan, where previous studies have lacked a clear categorisation as a measurement of quality. The results show that the extent of CSRD is higher than its quality in Jordan. Regarding the determinants of CSR disclosures, the following were found to have a significant relationship with both the extent and quality of CSRD, except for non-executive directors, where a significant relationship was found only with the extent of CSRD: board size, non-executive directors, firm’s age, foreign members on the board, number of board meetings, the presence of audit committees, Big 4 auditors, government ownership, firm’s size, and industry type.

Keywords: content analysis, corporate governance, corporate social responsibility disclosure, Jordan, quality of disclosure

Procedia PDF Downloads 193
702 Performance Evaluation of Arrival Time Prediction Models

Authors: Bin Li, Mei Liu

Abstract:

Arrival time information is a crucial component of advanced public transport systems (APTS). Providing arrival time information at stops can help reduce the waiting time and anxiety of passengers and improve the quality of service. In this research, an experiment was conducted to compare the prediction accuracy and precision of link-based and path-based historical travel time models, using automatic vehicle location (AVL) data collected from an actual bus route. The research results show that the path-based model is superior to the link-based model and achieves the greatest improvement during peak hours.
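
A minimal sketch of the two prediction schemes being compared is given below, using made-up historical AVL travel times; the study's actual models aggregate the historical data by time of day and route in more detail.

```python
from statistics import mean

# Hypothetical historical AVL records for one time-of-day bin:
# travel times (s) per link, and full-path travel times over the same links.
link_history = {
    "L1": [120, 130, 125],
    "L2": [200, 210, 190],
    "L3": [90, 95, 100],
}
path_history = [415, 430, 420]

def link_based_prediction(links):
    """Sum of per-link historical mean travel times."""
    return sum(mean(link_history[l]) for l in links)

def path_based_prediction():
    """Mean of historical travel times observed over the whole path."""
    return mean(path_history)

print(link_based_prediction(["L1", "L2", "L3"]))  # 125 + 200 + 95 = 420 s
print(path_based_prediction())                    # about 421.7 s
```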

Keywords: bus transit, arrival time prediction, link-based, path-based

Procedia PDF Downloads 334
701 Performance Evaluation of the CSAN Pronto Point-of-Care Whole Blood Analyzer for Regular Hematological Monitoring During Clozapine Treatment

Authors: Farzana Esmailkassam, Usakorn Kunanuvat, Zahraa Mohammed Ali

Abstract:

Objective: The key barriers in clozapine treatment of treatment-resistant schizophrenia (TRS) include frequent blood draws to monitor neutropenia, the main drug side effect. WBC and ANC monitoring must occur throughout treatment. Accurate WBC and ANC counts are necessary for clinical decisions to halt, modify, or continue clozapine treatment. The CSAN Pronto point-of-care (POC) analyzer generates white blood cell (WBC) and absolute neutrophil (ANC) counts through image analysis of capillary blood. POC monitoring offers significant advantages over central laboratory testing. This study evaluated the performance of the CSAN Pronto against the Beckman DxH900 hematology laboratory analyzer. Methods: Forty venous samples (EDTA whole blood) with varying concentrations of WBC and ANC, as established on the DxH900 analyzer, were tested in duplicate on three CSAN Pronto analyzers. Additionally, both venous and capillary samples were concomitantly collected from 20 volunteers and assessed on the CSAN Pronto and the DxH900 analyzer. The analytical performance, including precision using liquid quality controls (QCs) as well as patient samples near the medical decision points, and linearity using a mix of high and low patient samples to create five concentrations, was also evaluated. Results: In the precision study for QCs and whole blood, WBC and ANC showed CVs inside the limits established according to manufacturer and laboratory acceptability standards. WBC and ANC were found to be linear across the measurement range with a correlation of 0.99. WBC and ANC from all analyzers correlated well with the venous samples on the DxH900 across the tested sample ranges, with a correlation of > 0.95. The mean bias in ANC obtained on the CSAN Pronto versus the DxH900 was 0.07 × 10⁹ cells/L (95% LoA -0.25 to 0.49) for concentrations < 4.0 × 10⁹ cells/L, which includes decision-making cut-offs for continuing clozapine treatment. The mean bias in WBC obtained on the CSAN Pronto versus the DxH900 was 0.34 × 10⁹ cells/L (95% LoA -0.13 to 0.72) for concentrations < 5.0 × 10⁹ cells/L. The mean bias was higher (-11% for ANC, 5% for WBC) at higher concentrations. The correlations between capillary and venous samples showed more variability, with a mean bias of 0.20 × 10⁹ cells/L for the ANC. Conclusions: The CSAN Pronto showed acceptable performance in WBC and ANC measurements from venous and capillary samples and was approved for clinical use. This testing will facilitate treatment decisions and improve clozapine uptake and compliance.
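
The mean bias and 95% limits of agreement quoted above can be computed as in the following sketch; the paired values are made up, not the study data.

```python
import statistics

def bias_and_loa(poc, lab):
    """Mean bias (POC minus laboratory) and 95% limits of agreement (bias ± 1.96 SD of the differences)."""
    diffs = [p - l for p, l in zip(poc, lab)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired ANC results (x10^9 cells/L) below the clinical decision range.
poc_anc = [1.6, 2.1, 2.9, 3.4, 1.9, 2.5]
lab_anc = [1.5, 2.0, 2.9, 3.3, 1.8, 2.6]
bias, loa = bias_and_loa(poc_anc, lab_anc)
print(f"bias={bias:.2f}, LoA=({loa[0]:.2f}, {loa[1]:.2f})")
```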

Keywords: absolute neutrophil counts, clozapine, point of care, white blood cells

Procedia PDF Downloads 56
700 An Optimal and Efficient Family of Fourth-Order Methods for Nonlinear Equations

Authors: Parshanth Maroju, Ramandeep Behl, Sandile S. Motsa

Abstract:

In this study, we propose a simple and interesting family of fourth-order multi-point methods without memory for obtaining simple roots. This family requires only three functional evaluations per iteration (viz. two of the function, f(xn) and f(yn), and one of its first-order derivative, f'(xn)). Moreover, the accuracy and validity of the new schemes are tested on a number of numerical examples and compared with existing optimal fourth-order methods available in the literature. It is found that they are very useful in high-precision computations. Further, the dynamic study of these methods also supports the theoretical aspects.
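
The abstract does not reproduce the family's explicit iteration, but a well-known optimal fourth-order scheme that uses the same three evaluations per step (f(xₙ), f(yₙ), and f'(xₙ)) is Ostrowski's method, sketched here only for illustration.

```python
def ostrowski(f, fprime, x0, tol=1e-12, max_iter=50):
    """Ostrowski's optimal fourth-order method: two f evaluations and one
    f' evaluation per iteration (f(x_n), f(y_n), f'(x_n))."""
    x = x0
    for _ in range(max_iter):
        fx, dfx = f(x), fprime(x)
        y = x - fx / dfx                                # Newton predictor
        fy = f(y)
        x_new = y - fy * fx / (dfx * (fx - 2.0 * fy))   # Ostrowski corrector
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: simple root of f(x) = x^3 - 2x - 5, near 2.0945514815...
print(ostrowski(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, 2.0))
```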

Keywords: basins of attraction, nonlinear equations, simple roots, Newton's method

Procedia PDF Downloads 289
699 Pesticide Residue Determination on Cumin Plant (Nigella orientalis L.) Grown through Agricultural Practices with LC-MS/MS and GC-MS

Authors: Nilda Ersoy, Sevinç Şener, Ayşe Yalçın Elidemir, Ebru Evcil, Ergün Döğen

Abstract:

In this study, pesticide residues were investigated in black cumin (Nigella orientalis L.) seeds grown in Turkey. GC-MS and LC-MS/MS analytical instruments offer high precision when determining residue limits. A total of 100 pesticide active ingredients were analyzed by LC-MS/MS in Nigella orientalis L. seed samples. Moreover, for the same purpose, 103 pesticide active ingredients were analyzed by GC-MS. The study was conducted in 2012 and 2013. Pesticide residues were not found at detectable levels in the samples in either year.

Keywords: pesticide, residue, black cumin, Nigella orientalis L.

Procedia PDF Downloads 310
698 Interpretation of the Russia-Ukraine 2022 War via N-Gram Analysis

Authors: Elcin Timur Cakmak, Ayse Oguzlar

Abstract:

This study presents the results of an analysis, using bigram and trigram methods, of the tweets sent by Twitter users about the Russia-Ukraine war. On February 24, 2022, Russian President Vladimir Putin declared a military operation against Ukraine, and all eyes were turned to this war. Many people living in Russia and Ukraine reacted to this war and protested, and also expressed their deep concern, as they felt the safety of their families and their futures were at stake. Most people, especially those living in Russia and Ukraine, express their views on the war in different ways, and the most popular way to do so is through social media. Many people prefer to convey their feelings using Twitter, one of the most frequently used social media tools. Since the beginning of the war, thousands of tweets about it have been posted from many countries of the world. These tweets accumulated in data sources are extracted using various codes for analysis through the Twitter API and analysed with the Python programming language. The aim of the study is to find the word sequences in these tweets by the n-gram method, which is known for its widespread use in computational linguistics and natural language processing. The tweet language used in the study is English. The data set consists of data obtained from Twitter between February 24, 2022, and April 24, 2022. The tweets obtained from Twitter using the #ukraine, #russia, #war, #putin, and #zelensky hashtags together were captured as raw data, and the remaining tweets were included in the analysis stage after they were cleaned through the preprocessing stage. In the data analysis part, sentiments are extracted to show what people post about the war on Twitter; negative messages make up the majority of all the tweets, at a ratio of 63.6%. Furthermore, the most frequently used bigram and trigram word groups are found. The most frequently used word groups are “he, is”, “I, do”, and “I, am” for bigrams, and “I, do, not”, “I, am, not”, and “I, can, not” for trigrams. In the machine learning phase, the accuracy of classification is measured with the Classification and Regression Trees (CART) and Naïve Bayes (NB) algorithms. The algorithms are used separately for bigrams and trigrams. For bigrams, we obtained the highest accuracy and F-measure values with the NB algorithm and the highest precision and recall values with the CART algorithm. On the other hand, for trigrams, the highest accuracy, precision, and F-measure values are achieved by the CART algorithm, and the highest recall value is obtained by NB.
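
A minimal sketch of the bigram/trigram counting step is shown below on a few made-up, already-cleaned tweet texts; the study's preprocessing, sentiment scoring, and CART/NB classification are not reproduced here.

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return list(zip(*(tokens[i:] for i in range(n))))

# Hypothetical cleaned tweet texts
tweets = [
    "i do not want this war",
    "i am not sure he is right",
    "he is responsible i can not believe it",
]
bigrams, trigrams = Counter(), Counter()
for t in tweets:
    tokens = t.split()
    bigrams.update(ngrams(tokens, 2))
    trigrams.update(ngrams(tokens, 3))

print(bigrams.most_common(3))
print(trigrams.most_common(3))
```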

Keywords: classification algorithms, machine learning, sentiment analysis, Twitter

Procedia PDF Downloads 52
697 Relevancy Measures of Errors in Displacements of Finite Elements Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights methods of error estimation in finite element analysis (FEA) results. It indicates that the modeling error could be eliminated by performing finite element analysis with successively finer meshes or by extrapolating response predictions from an orderly sequence of relatively low degree-of-freedom analysis results. In addition, the paper eliminates the round-off error by running the code at a higher precision. The paper provides an application to finite element analysis results and draws conclusions based on the results of applying these error-estimation methods.
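
For reference, the Richardson extrapolation referred to in the keywords estimates the exact response from solutions on two meshes of size h and h/r, assuming monotonic convergence at a rate p:

```latex
u_{\text{exact}} \;\approx\; u_{h/r} + \frac{u_{h/r} - u_{h}}{r^{p} - 1}
```

Here u_h and u_{h/r} are the responses computed on the coarse and refined meshes, and r is the refinement ratio between them.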

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, Richardson extrapolation, monotonic convergence

Procedia PDF Downloads 457
696 Marine Propeller Cavitation Analysis Using BEM

Authors: Ehsan Yari

Abstract:

In this paper, a numerical study of sheet cavitation has been performed on the DTMB4119 and E779A marine propellers with the boundary element method. In propeller design, various parameters of geometry and fluid are incorporated, so a program is needed to solve the flow while taking all of the changing parameters into account. The capability of analyzing the wetted and cavitating flow around propellers in steady, unsteady, uniform, and non-uniform conditions, with acceptable precision and reduced computational time compared to numerical finite volume methods, is a characteristic feature of the present method. Moreover, modifying the position of the detachment point and its corresponding potential value has been considered. Numerical results have been validated against experimental data, showing good agreement.

Keywords: cavitation, BEM, DTMB4119, E779A

Procedia PDF Downloads 21
695 A Compressor Map Optimizing Tool for Prediction of Compressor Off-Design Performance

Authors: Zhongzhi Hu, Jie Shen, Jiqiang Wang

Abstract:

A high-precision aeroengine model is needed when developing the engine control system. Compared with the other main components, the axial compressor is the most challenging component to simulate. In this paper, a compressor map optimizing tool based on the introduction of a modifiable β function is developed for FWorks (FADEC Works). Three parameters (d, density; f, fitting coefficient; k₀, slope of the line β = 0) are introduced to the β function to make it modifiable. A comparison of the traditional β function and the modifiable β function is carried out for a certain type of compressor. The interpolation errors show that both methods meet the modeling requirements, while the modifiable β function can predict compressor performance more accurately in the areas of the compressor map that users are interested in.

Keywords: beta function, compressor map, interpolation error, map optimization tool

Procedia PDF Downloads 234
694 Reduced Complexity Iterative Solution For I/Q Imbalance Problem in DVB-T2 Systems

Authors: Karim S. Hassan, Hisham M. Hamed, Yassmine A. Fahmy, Ahmed F. Shalash

Abstract:

The mismatch between in-phase and quadrature signals in Orthogonal frequency division multiplexing (OFDM) systems, such as DVB-T2, results in a severe degradation in performance. Several general solutions have been proposed in the past, but these are largely computationally intensive, leading to complex implementations. In this paper, we propose a relatively simple iterative solution, which provides good results in relatively few iterations, using fixed precision arithmetic. An additional advantage is that complex digital blocks, such as dividers and square root, are not required. Thus, the proposed solution may be implemented in relatively simple hardware.
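
For context, one commonly used baseband model of receiver I/Q imbalance, with gain mismatch g and phase mismatch φ, is given below; the paper's exact formulation and iterative estimator are not reproduced here.

```latex
z(t) = K_1\,x(t) + K_2\,x^{*}(t), \qquad
K_1 = \frac{1 + g\,e^{-j\varphi}}{2}, \qquad
K_2 = \frac{1 - g\,e^{j\varphi}}{2}
```

A perfectly matched receiver (g = 1, φ = 0) gives K₁ = 1 and K₂ = 0, so the correction task amounts to estimating and inverting K₁ and K₂, which the proposed iterative scheme does without costly dividers or square-root blocks.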

Keywords: OFDM, DVB-T2, I/Q imbalance, I/Q mismatch, iterative method, fixed point, reduced complexity

Procedia PDF Downloads 507
693 Stabilized Earth Roads Construction and Its Challenges

Authors: Mokhtar Nikgoo

Abstract:

Road definition and road construction: in the engineering literature, a road is defined as a means of communication between two different places by air, land, or sea. In this sense, all sea, land, and air routes are considered roads. Road construction is an operation to create a road on the ground between two points with a specified width, which includes works such as the subgrade, paving, and placing tables and traffic signs on the road. In this article, the stages of road construction are explained from beginning to end. Road construction is generally carried out for rural, urban, and inter-city roads, and given the special conditions of this field, the precision of engineers in its design and calculations is very important. For example, if the design of a road does not pay enough attention to the way the road curves, there will undoubtedly be countless accidents. Likewise, adjusting the road surface and ensuring its durability and uniformity are among the problems that engineers solve according to the obstacles encountered.

Keywords: road construction, surveying, freeway, pavement, excavator

Procedia PDF Downloads 45
692 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to autonomous robot recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been explored in intelligent construction, and the relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous visual guidance and recognition technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance technology for industrial robots faces a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production has strict accuracy requirements; visual recognition systems therefore face precision challenges. In such a situation, this directly impacts the effectiveness and standard of industrial production, necessitating stronger research on the positioning precision of visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting the information at the boundary and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, operational processing systems assign them to the same coordinate system based on their locations and postures. Inclination detection from the RGB image and verification from the depth image are used to determine a component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
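
One simple way to obtain the aspect ratio of a detected component from a dense point cloud, as described above, is via principal-axis extents; the following NumPy sketch on synthetic data is only an illustration of the idea, not the authors' pipeline.

```python
import numpy as np

def oriented_extents(points):
    """Principal-axis extents of a 3D point cloud (rows are XYZ points).
    The ratio of the two largest extents gives a simple aspect ratio that
    can be matched against a catalogue of modularized components."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    projected = centered @ vt.T                      # coordinates in principal axes
    extents = projected.max(axis=0) - projected.min(axis=0)
    return np.sort(extents)[::-1]

# Hypothetical point cloud of a beam-like component (roughly 2.0 x 0.3 x 0.1 m)
rng = np.random.default_rng(0)
cloud = rng.uniform([0, 0, 0], [2.0, 0.3, 0.1], size=(5000, 3))
ext = oriented_extents(cloud)
print(ext, "aspect ratio:", ext[0] / ext[1])         # aspect ratio close to 2.0 / 0.3
```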

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 54
691 The Staphylococcus aureus Exotoxin Recognition Using Nanobiosensor Designed by an Antibody-Attached Nanosilica Method

Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi

Abstract:

Considering the ever-increasing population and the industrialization of humankind's way of life, we are no longer able to detect the toxins produced in food products using traditional techniques, because the isolation time for food products is not cost-effective and, in most cases, the precision of practical techniques such as bacterial cultivation suffers from operator errors or errors in the mixtures used. Hence, with the advent of nanotechnology, the design of selective and smart sensors is one of the great industrial advances in the quality control of food products: within a few minutes, and with very high precision, they can identify the amount and toxicity of the bacteria. Methods and Materials: In this technique, a sensor based on the attachment of a bacterial antibody to nanoparticles was used. As the absorption basis for the recognition of the bacterial toxin, medium-sized silica nanoparticles of 10 nanometers, in the form of a solid powder (Notrino brand), were utilized. Then the suspension produced from the agent-linked nanosilica, which was connected to the bacterial antibody, was placed near samples of distilled water contaminated with Staphylococcus aureus bacterial toxin at a density of 10⁻³, so that if any toxin existed in the sample, a connection between the toxin antigen and the antibody would be formed. Finally, the light absorption related to the binding of the antigen to the particle-attached antibody was measured using spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was also used as a control. The accuracy of the test was monitored using a serial dilution (10⁻⁶) of an overnight cell culture of Staphylococcus spp. bacteria (OD600: 0.02 = 10⁷ cells). It showed that the sensitivity of PCR is 10 bacteria per ml of cells within a few hours. Results: The results indicate that the sensor detects toxin down to a density of 10⁻⁴. Additionally, the sensitivity of the sensors was examined after 60 days; the sensor gave confirmatory results up to day 56, and its response started to decrease after that. Conclusions: The advantages of the practical nanobiosensor over conventional methods, such as culture and biotechnological methods (for example, the polymerase chain reaction), are its accuracy, sensitivity, and specificity. In addition, it reduces the required time from hours to 30 minutes.

Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus

Procedia PDF Downloads 361
690 On the Analysis of Pseudorandom Partial Quotient Sequences Generated from Continued Fractions

Authors: T. Padma, Jayashree S. Pillai

Abstract:

Random entities are an essential component of any cryptographic application. The suitability of a number-theory-based novel pseudorandom sequence, called the Pseudorandom Partial Quotient Sequence (PPQS), generated from the continued fraction expansion of irrational numbers, for cryptographic applications is analyzed in this paper. An approach that builds the algorithm around a hard mathematical problem has been considered. The PQ sequence is tested for randomness, and its suitability as a cryptographic key is established by performing randomness analysis, key sensitivity and key space analysis, precision analysis, and an evaluation of its correlation properties.
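
A minimal sketch of extracting partial quotients from the continued fraction expansion of an irrational number (here e, approximated to high precision with exact rational arithmetic) is shown below; the paper's choice of irrationals and the post-processing that yields the PPQS are not reproduced.

```python
import math
from fractions import Fraction

def partial_quotients(x: Fraction, count: int):
    """First `count` partial quotients of the continued fraction of x > 0."""
    quotients = []
    for _ in range(count):
        a = x.numerator // x.denominator      # floor(x)
        quotients.append(a)
        frac = x - a
        if frac == 0:
            break
        x = 1 / frac
    return quotients

# High-precision rational approximation of e from its factorial series;
# 60 terms are far more than enough for the first 20 partial quotients.
e_approx = sum(Fraction(1, math.factorial(k)) for k in range(60))
print(partial_quotients(e_approx, 20))  # [2, 1, 2, 1, 1, 4, 1, 1, 6, 1, 1, 8, ...]
```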

Keywords: pseudorandom sequences, key sensitivity, correlation, security analysis, randomness analysis, sensitivity analysis

Procedia PDF Downloads 550
689 Phenotype and Psychometric Characterization of Phelan-Mcdermid Syndrome Patients

Authors: C. Bel, J. Nevado, F. Ciceri, M. Ropacki, T. Hoffmann, P. Lapunzina, C. Buesa

Abstract:

Background: The Phelan-McDermid syndrome (PMS) is a genetic disorder caused by the deletion of the terminal region of chromosome 22 or mutation of the SHANK3 gene. Shank3 disruption in mice leads to dysfunction of synaptic transmission, which can be restored by epigenetic regulation with both Lysine Specific Demethylase 1 (LSD1) inhibitors. PMS subjects result in a variable degree of intellectual disability, delay or absence of speech, autistic spectrum disorders symptoms, low muscle tone, motor delays and epilepsy. Vafidemstat is an LSD1 inhibitor in Phase II clinical development with a well-established and favorable safety profile, and data supporting the restoration of memory and cognition defects as well as reduction of agitation and aggression in several animal models and clinical studies. Therefore, vafidemstat has the potential to become a first-in-class precision medicine approach to treat PMS patients. Aims: The goal of this research is to perform an observational trial to psychometrically characterize individuals carrying deletions in SHANK3 and build a foundation for subsequent precision psychiatry clinical trials with vafidemstat. Methodology: This study is characterizing the clinical profile of 20 to 40 subjects, > 16-year-old, with genotypically confirmed PMS diagnosis. Subjects will complete a battery of neuropsychological scales, including the Repetitive Behavior Questionnaire (RBQ), Vineland Adaptive Behavior Scales, Escala de Observación para el Diagnostico del Autismo (Autism Diagnostic Observational Scale) (ADOS)-2, the Battelle Developmental Inventory and the Behavior Problems Inventory (BPI). Results: By March 2021, 19 patients have been enrolled. Unsupervised hierarchical clustering of the results obtained so far identifies 3 groups of patients, characterized by different profiles of cognitive and behavioral scores. The first cluster is characterized by low Battelle age, high ADOS and low Vineland, RBQ and BPI scores. Low Vineland, RBQ and BPI scores are also detected in the second cluster, which in contrast has high Battelle age and low ADOS scores. The third cluster is somewhat in the middle for the Battelle, Vineland and ADOS scores while displaying the highest levels of aggression (high BPI) and repeated behaviors (high RBQ). In line with the observation that female patients are generally affected by milder forms of autistic symptoms, no male patients are present in the second cluster. Dividing the results by gender highlights that male patients in the third cluster are characterized by a higher frequency of aggression, whereas female patients from the same cluster display a tendency toward higher repetitive behavior. Finally, statistically significant differences in deletion sizes are detected comparing the three clusters (also after correcting for gender), and deletion size appears to be positively correlated with ADOS and negatively correlated with Vineland A and C scores. No correlation is detected between deletion size and the BPI and RBQ scores. Conclusions: Precision medicine may open a new way to understand and treat Central Nervous System disorders. Epigenetic dysregulation has been proposed to be an important mechanism in the pathogenesis of schizophrenia and autism. Vafidemstat holds exciting therapeutic potential in PMS, and this study will provide data regarding the optimal endpoints for a future clinical study to explore vafidemstat ability to treat shank3-associated psychiatric disorders.

Keywords: autism, epigenetics, LSD1, personalized medicine

Procedia PDF Downloads 136
688 Characterizing and Developing the Clinical Grade Microbiome Assay with a Robust Bioinformatics Pipeline for Supporting Precision Medicine Driven Clinical Development

Authors: Danyi Wang, Andrew Schriefer, Dennis O'Rourke, Brajendra Kumar, Yang Liu, Fei Zhong, Juergen Scheuenpflug, Zheng Feng

Abstract:

Purpose: It has been recognized that the microbiome plays critical roles in disease pathogenesis, including cancer, autoimmune disease, and multiple sclerosis. To develop a clinical-grade assay for exploring microbiome-derived clinical biomarkers across disease areas, a two-phase approach is implemented: 1) identification of the optimal sample preparation reagents using pre-mixed bacteria and healthy donor stool samples, coupled with the proprietary Sigma-Aldrich® bioinformatics solution; 2) exploratory analysis of patient samples for enabling precision medicine. Study Procedure: In the phase 1 study, we first compared the 16S sequencing results of two ATCC® microbiome standards (MSA 2002 and MSA 2003) across five different extraction kits (kits A, B, C, D, and E). Both microbiome standard samples were extracted in triplicate across all extraction kits. Following isolation, DNA quantity was determined by Qubit assay. DNA quality was assessed to determine purity and to confirm that the extracted DNA is of high molecular weight. Bacterial 16S ribosomal ribonucleic acid (rRNA) amplicons were generated via amplification of the V3/V4 hypervariable region of the 16S rRNA. Sequencing was performed using a 2 × 300 bp paired-end configuration on the Illumina MiSeq. Fastq files were analyzed using the Sigma-Aldrich® Microbiome Platform. The Microbiome Platform is a cloud-based service that offers best-in-class 16S-seq and WGS analysis pipelines and databases. The Platform and its methods have been extensively benchmarked using microbiome standards generated internally by MilliporeSigma and by other external providers. Data Summary: The DNA yield using extraction kits D and E is below the limit of detection (100 pg/µl) of the Qubit assay, as both kits are intended for samples with low bacterial counts; the pre-mixed bacterial pellets at high concentrations, with an input of 2 × 10⁶ cells for MSA-2002 and 1 × 10⁶ cells for MSA-2003, were not compatible with these kits. Among the remaining three extraction kits, kit A produced the greatest yield, whereas kit B provided the least (kit A/MSA-2002: 174.25 ± 34.98; kit A/MSA-2003: 179.89 ± 30.18; kit B/MSA-2002: 27.86 ± 9.35; kit B/MSA-2003: 23.14 ± 6.39; kit C/MSA-2002: 55.19 ± 10.18; kit C/MSA-2003: 35.80 ± 11.41 (mean ± SD)). The PCoA 3D visualization of the weighted UniFrac beta diversity shows that kits A and C cluster closely together, while kit B appears as an outlier; the kit A sequencing samples also cluster more closely together than those of the other kits. The taxonomic profiles of kit B have lower recall when compared to the known mixture profiles, indicating that kit B was inefficient at detecting some of the bacteria. Conclusion: Our data demonstrate that the DNA extraction method impacts the DNA concentration, purity, and microbial communities detected by next-generation sequencing analysis. A further comparison of microbiome analysis performance using healthy stool samples is underway, and colorectal cancer patient samples will be acquired to further explore the clinical utility. Collectively, our comprehensive qualification approach, including the evaluation of optimal DNA extraction conditions, the inclusion of positive controls, and the implementation of a robust, qualified bioinformatics pipeline, assures accurate characterization of the microbiota in a complex matrix for deciphering the deep biology and enabling precision medicine.

Keywords: 16S rRNA sequencing, analytical validation, bioinformatics pipeline, metagenomics

Procedia PDF Downloads 129
687 A Method to Predict the Thermo-Elastic Behavior of Laser-Integrated Machine Tools

Authors: C. Brecher, M. Fey, F. Du Bois-Reymond, S. Neus

Abstract:

Additive manufacturing has emerged as a fast-growing segment of manufacturing technology. Established machine tool manufacturers, such as DMG MORI, recently presented machine tools combining milling and laser welding. By this, machine tools can realize a higher degree of flexibility and a shorter production time. Still, there are challenges that have to be accounted for in terms of maintaining the necessary machining accuracy, especially due to thermal effects arising from the use of high-power laser processing units. To study the thermal behavior of laser-integrated machine tools, it is essential to analyze and simulate the thermal behavior of machine components, both individually and assembled. This information will help to design a geometrically stable machine tool under the influence of high-power laser processes. This paper presents an approach to decrease the loss of machining precision due to thermal impacts. Real effects of laser machining processes are considered, thus enabling an optimized design of the machine tool and its components in the early design phase. The core element of this approach is a matched FEM model considering all relevant variables, e.g. laser power, angle of the laser beam, reflection coefficients, and heat transfer coefficient. Hence, a systematic approach to obtain this matched FEM model is essential. The two constituent aspects of the method are describing the thermal behavior of the structural components and predicting the laser beam path in order to determine the relevant beam intensity on those components. To match the model, both aspects have to be combined and verified empirically. In this context, an essential machine component of a five-axis machine tool, the turn-swivel table, serves as the demonstration object for the verification process. Therefore, a turn-swivel table test bench as well as an experimental set-up to measure the beam propagation were developed and are described in the paper. In addition to the empirical investigation, a simulative approach to the described types of experimental examination is presented. Concluding, it is shown that the method, a good understanding of its two core aspects (the thermo-elastic machine behavior and the laser beam path), and their combination help designers to minimize the loss of precision in the early stages of the design phase.

Keywords: additive manufacturing, laser beam machining, machine tool, thermal effects

Procedia PDF Downloads 241
686 Optimize Data Evaluation Metrics for Fraud Detection Using Machine Learning

Authors: Jennifer Leach, Umashanger Thayasivam

Abstract:

The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, though, as society's knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate people. This has led to a simultaneous advancement in the world of fraud. Machine learning techniques can offer a possible solution to help counter this advancement. This research explores how the use of various machine learning techniques can aid in detecting fraudulent activity across two different types of fraudulent data; the accuracy, precision, recall, and F1 score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which testing split and technique would lead to the optimal results.
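
A minimal sketch of the evaluation loop described above, using scikit-learn on a synthetic imbalanced dataset standing in for the fraud data, is shown below; the study's actual datasets and model set are not reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced stand-in for a fraud dataset (about 3% positive class).
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.97, 0.03], random_state=42)

for test_size in (0.1, 0.2, 0.3, 0.4, 0.5):   # five train/test splits
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=test_size,
                                              stratify=y, random_state=42)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    y_pred = model.predict(X_te)
    print(f"test={test_size:.0%}  acc={accuracy_score(y_te, y_pred):.3f}  "
          f"prec={precision_score(y_te, y_pred, zero_division=0):.3f}  "
          f"rec={recall_score(y_te, y_pred):.3f}  f1={f1_score(y_te, y_pred):.3f}")
```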

Keywords: data science, fraud detection, machine learning, supervised learning

Procedia PDF Downloads 157
685 The Modification of Convolutional Neural Network in Fin Whale Identification

Authors: Jiahao Cui

Abstract:

In the past centuries, due to climate change and intense whaling, the global whale population has dramatically declined. Among the various whale species, the fin whale experienced the most drastic drop in numbers due to its popularity in whaling. Against this background, identifying fin whale calls could be immensely beneficial to the preservation of the species. This paper uses feature extraction to process the input audio signal; then a network based on AlexNet and three networks based on the ResNet model were constructed to classify fin whale calls. A mixture of the DOSITS database and the Watkins database was used during training. The results demonstrate that a modified ResNet network has the best performance considering precision and network complexity.
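
The abstract does not detail the network modification, but a minimal sketch of adapting a standard torchvision ResNet to two-class call detection on single-channel, spectrogram-like inputs is shown below as an illustration only.

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a standard ResNet-18 and adapt it to 2 classes (fin whale call / noise).
model = models.resnet18(weights=None)   # no pretrained weights
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)  # 1-channel spectrograms
model.fc = nn.Linear(model.fc.in_features, 2)

dummy = torch.randn(4, 1, 224, 224)     # batch of 4 single-channel "spectrograms"
print(model(dummy).shape)               # torch.Size([4, 2])
```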

Keywords: convolutional neural network, ResNet, AlexNet, fin whale preservation, feature extraction

Procedia PDF Downloads 88
684 Opto-Mechanical Characterization of Aspheric Lenses from the Hybrid Method

Authors: Aliouane Toufik, Hamdi Amine, Bouzid Djamel

Abstract:

Aspheric optical components are an alternative to the use of conventional lenses in the implementation of imaging systems for the visible range. Spherical lenses produce aberrations and are therefore not able to focus all the light into a single point. Aspheric lenses, by contrast, correct aberrations and provide better resolution, even with compact objectives incorporating a small number of lenses. The metrology of these components is very difficult, especially as resolution requirements increase, and the inadequacy or complexity of conventional tools requires the development of specific characterization approaches. This work addresses that problem: its objective is the study and comparison of the different methods used to measure the surface radii of hybrid aspheric lenses.

Keywords: manufacture of lenses, aspherical surface, precision molding, radius of curvature, roughness

Procedia PDF Downloads 442
683 Design to Cryogenic System for Dilution Refrigerator with Cavity and Superconducting Magnet

Authors: Ki Woong Lee

Abstract:

The Center for Axion and Precision Physics Research is searching for dark matter using 12 tesla superconducting magnets. A dilution refrigerator, together with superconducting magnets and superconducting cavities, is being used for the search experiments. The dilution refrigerator requires a stable cryogenic environment using liquid helium; accordingly, a cryogenic system for a stable supply of liquid helium is to be established. This cryogenic system includes the liquefaction, supply, storage, and purification of liquid helium. This article presents the basic design, construction, and operation plans for building the cryogenic system.

Keywords: cryogenic system, dilution refrigerator, superconducting magnet, helium recovery system

Procedia PDF Downloads 94
682 Study the Effect of Friction on Barreling Behavior during Upsetting Process Using Anand Model

Authors: H. Mohammadi Majd, M. Jalali Azizpour, V. Tavaf, A. Jaderi

Abstract:

In upsetting processes, contact friction significantly influences metal flow, the stress-strain state, and the process parameters. Furthermore, tribological conditions influence workpiece deformation and its dimensional precision. A viscoplastic constitutive law, the Anand model, was applied to represent the inelastic deformation behavior in the upsetting process. This paper presents research results on the influence of the contact friction coefficient on workpiece deformation in upsetting, obtained from finite element simulations. The technique was tested in upsetting simulations of three different specimens and the corresponding materials, and it can be successfully employed to predict the deformation of the upsetting process.
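
For reference, a standard statement of the Anand viscoplastic flow and evolution equations is given below; the parameter values calibrated in the paper are not reproduced here.

```latex
\dot{\varepsilon}_p = A\,e^{-Q/(RT)}\left[\sinh\!\left(\xi\,\frac{\sigma}{s}\right)\right]^{1/m}, \qquad
\dot{s} = h_0\left|1 - \frac{s}{s^{*}}\right|^{a}\operatorname{sign}\!\left(1 - \frac{s}{s^{*}}\right)\dot{\varepsilon}_p, \qquad
s^{*} = \hat{s}\left[\frac{\dot{\varepsilon}_p}{A}\,e^{Q/(RT)}\right]^{n}
```

Here σ is the equivalent stress, s the internal deformation-resistance variable with saturation value s*, Q the activation energy, R the gas constant, T the absolute temperature, and A, ξ, m, h₀, â, ŝ, n the material constants.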

Keywords: friction, upsetting, barreling, Anand model

Procedia PDF Downloads 305