Search results for: dimensional accuracy
3937 Spatial Cognition and 3-Dimensional Vertical Urban Design Guidelines
Authors: Hee Sun (Sunny) Choi, Gerhard Bruyns, Wang Zhang, Sky Cheng, Saijal Sharma
Abstract:
The main focus of this paper is to propose a comprehensive framework for the cognitive measurement and modelling of the built environment. This will involve exploring and measuring neural mechanisms. The aim is to create a foundation for further studies in this field that are consistent and rigorous. Additionally, this framework will facilitate collaboration with cognitive neuroscientists by establishing a shared conceptual basis. The goal of this research is to develop a human-centric approach for urban design that is scientific and measurable, producing a set of urban design guidelines that incorporate cognitive measurement and modelling. By doing so, the broader intention is to design urban spaces that prioritize human needs and well-being, making them more liveable.
Keywords: vertical urbanism, human centric design, spatial cognition and psychology, vertical urban design guidelines
Procedia PDF Downloads 87
3936 Current of Drain for Various Values of Mobility in the GaAs MESFET
Authors: S. Belhour, A. K. Ferouani, C. Azizi
Abstract:
In recent years, considerable effort (experiments, numerical simulation, and theoretical prediction models) has been devoted to devices characterised by high efficiency and low cost. An improved analytical physics-based simulation model is therefore proposed. The performance model of GaAs MESFETs has been developed for use in device design at high frequency. The model is based on mathematical analysis, and a new approach to the standard model is proposed. This approach yields a model applicable to MESFETs operating in the turn-on or pinch-off region and valid for both short-channel and long-channel MESFETs, in which the two-dimensional potential distribution contributed by the depletion layer under the gate is obtained by conventional approximation. Moreover, comparisons of the analytical model for different values of mobility are presented, and good agreement is obtained.
Keywords: analytical, gallium arsenide, MESFET, mobility, models
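The abstract's analytical model is not reproduced in the text; as a rough stand-in for how drain current scales with mobility in a MESFET, the sketch below uses a generic textbook square-law saturation model, not the paper's improved model. All device parameter values are hypothetical placeholders.

```python
# Illustrative sketch only: a generic square-law MESFET drain-current model
# (NOT the improved analytical model of the paper). Parameter values are
# hypothetical placeholders chosen for a GaAs-like device.

def drain_current(vgs, mobility, w=1e-4, l=1e-6, a=1e-7,
                  eps=1.14e-10, vt=-1.0):
    """Saturation-region drain current (A) for a simple square-law model.

    The transconductance parameter beta is proportional to carrier
    mobility, so sweeping `mobility` shows how I_D scales with it.
    """
    if vgs <= vt:
        return 0.0  # pinch-off: channel fully depleted
    beta = mobility * eps * w / (2.0 * a * l)  # transconductance parameter
    return beta * (vgs - vt) ** 2

# Higher mobility -> proportionally higher drain current
i_low = drain_current(0.0, mobility=0.40)
i_high = drain_current(0.0, mobility=0.85)
```

In this simple model the drain current is strictly linear in mobility, which is the qualitative trend such mobility comparisons probe; the paper's model adds the two-dimensional depletion-layer potential on top of this.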
Procedia PDF Downloads 79
3935 1D Klein-Gordon Equation in an Infinite Square Well with PT Symmetry Boundary Conditions
Authors: Suleiman Bashir Adamu, Lawan Sani Taura
Abstract:
We study the role of boundary conditions via PT-symmetric quantum mechanics, where P denotes the parity operator and T denotes the time-reversal operator. Using the one-dimensional Schrödinger Hamiltonian for a free particle in an infinite square well, we introduce PT-symmetric boundary conditions. We find solutions of the 1D Klein-Gordon equation for a free particle in an infinite square well with Hermitian boundary conditions and with PT-symmetric boundary conditions, obtaining the energy eigenvalues and eigenfunctions in both cases.
Keywords: eigenvalues, eigenfunction, Hamiltonian, Klein-Gordon equation, PT-symmetric quantum mechanics
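For reference, the Hermitian case the abstract compares against has a standard textbook form; assuming a well of width L with the usual Dirichlet (vanishing) boundary conditions, the Klein-Gordon eigenfunctions and energy eigenvalues read:

```latex
% Textbook solution of the 1D Klein-Gordon equation in an infinite square
% well of width L with Hermitian (Dirichlet) boundary conditions. The
% PT-symmetric case modifies the admissible boundary data, not this form.
\phi_n(x) = \sqrt{\frac{2}{L}} \sin\!\left(\frac{n\pi x}{L}\right),
\qquad
E_n = \pm\sqrt{m^2 c^4 + \left(\frac{n\pi\hbar c}{L}\right)^2},
\qquad n = 1, 2, \dots
```

The ± sign reflects the particle/antiparticle branches characteristic of the relativistic equation, in contrast to the non-relativistic Schrödinger well.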
Procedia PDF Downloads 384
3934 Effect of Aggregate Size on Mechanical Behavior of Passively Confined Concrete Subjected to 3D Loading
Authors: Ibrahim Ajani Tijani, C. W. Lim
Abstract:
Few studies have examined the effect of aggregate size on the mechanical behavior of confined concrete subjected to 3-dimensional (3D) loading. Using a novel 3D testing system to produce passive confinement, concrete cubes were tested to examine the effect of size on the stress-strain behavior of the specimens. The 3D stress-strain relationship was scrutinized and compared with the relationships available in the literature. It was observed that the ultimate stress and the corresponding strain were related to the confining rigidity and the aggregate size. Size has a significant effect on the intersection stress, and a new model is proposed for the intersection stress based on the conceptual design of the confining plates.
Keywords: concrete, aggregate size, size effect, 3D compression, passive confinement
Procedia PDF Downloads 214
3933 A User Identification Technique to Access Big Data Using Cloud Services
Authors: A. R. Manu, V. K. Agrawal, K. N. Balasubramanya Murthy
Abstract:
Authentication is required in stored database systems so that only authorized users can access the data and the related cloud infrastructure. This paper proposes a multi-factor, multi-dimensional authentication technique with multi-level security. The proposed technique is likely to be more robust, as the probability of breaking the password is extremely low. The framework combines a multi-modal biometric approach and SMS verification with the conventional login/password system to enforce additional security measures. The robustness of the technique is demonstrated mathematically using statistical analysis. This work presents the authentication system along with the user authentication architecture diagram, activity diagrams, data flow diagrams, sequence diagrams, and algorithms.
Keywords: design, implementation algorithms, performance, biometric approach
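The multi-factor principle described above (password, SMS code, biometric) can be sketched as a conjunction of independent checks. This is an illustrative toy, not the paper's algorithm: the salt, stored hash, and biometric threshold are all hypothetical, and a real system would use a proper KDF and liveness checks.

```python
# Toy sketch of the multi-factor principle: access is granted only if the
# password hash matches, the one-time SMS code matches, AND the biometric
# similarity score clears a threshold. All values are hypothetical.
import hashlib
import hmac

STORED_HASH = hashlib.sha256(b"salt" + b"s3cret").hexdigest()

def authenticate(password, sms_code, expected_code, biometric_score,
                 threshold=0.9):
    factor1 = hmac.compare_digest(
        hashlib.sha256(b"salt" + password.encode()).hexdigest(), STORED_HASH)
    factor2 = hmac.compare_digest(sms_code, expected_code)  # SMS one-time code
    factor3 = biometric_score >= threshold                  # biometric match
    return factor1 and factor2 and factor3  # every factor must pass
```

Because an attacker must defeat all factors simultaneously, the overall break probability is roughly the product of the per-factor probabilities, which is the robustness argument the abstract alludes to.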
Procedia PDF Downloads 481
3932 Modification of Escherichia coli PtolT Expression Vector via Site-Directed Mutagenesis
Authors: Yakup Ulusu, Numan Eczacıoğlu, İsa Gökçe, Helen Waller, Jeremy H. Lakey
Abstract:
For a protein to perform its function, it must not only have the appropriate amino acid sequence but also fold into the correct conformation. Attaining this conformation depends on the primary amino acid sequence, hydrophobic interactions, and the chaperones and enzymes in charge of folding. Misfolded proteins are not functional and tend to aggregate. Disulfide cross-links formed by cysteine residues stabilize the functional conformation: when two cysteines come close together, a disulfide bond forms a cystine bridge. Because of this, cysteine plays an important role in the formation of the three-dimensional structure of many proteins. There are two cysteine residues (C44, C69) in the TolA-III protein. Any non-specific cystine bridge, unlike the protein's own native disulfide bonds, causes a change in the three-dimensional structure of the protein. Proteins can be expressed in various host cells either directly or as fusion (chimeric) constructs. Overproduction of recombinant proteins can lead to aggregation of insoluble protein in the host cell, forming crystalline deposits called inclusion bodies. In general, fusion proteins are produced to provide affinity tags, to make proteins more soluble, and to enable production of some toxic proteins via fusion expression systems such as pTolT. Proteins can be modified using site-directed mutagenesis. In this way, the formation of non-specific disulfide cross-links in a fusion protein expression system can be prevented by replacing the cysteine with another amino acid such as serine or glycine. This requires a DNA molecule containing the gene that encodes the target protein, and primers designed for the site-directed mutagenesis reaction. This study aimed to replace the cysteine-encoding codon TGT with the serine-encoding codon AGT.
To this end, sense and reverse primers were designed (given below) and used in the site-directed mutagenesis reaction. Several new copies of the template plasmid DNA were generated with these mutagenic primers via the polymerase chain reaction (PCR). The PCR product contains both the parental template DNA (wild type) and the new DNA sequences carrying the mutation. The restriction endonuclease DpnI, which is specific for methylated DNA, cleaves the parental template DNA and thereby eliminates it. E. coli cells obtained after transformation were incubated in LB medium with antibiotic. After purification of plasmid DNA from E. coli, the presence of the mutation was confirmed by DNA sequence analysis. The new plasmid developed here is designated pTolT-δ.
Keywords: site directed mutagenesis, Escherichia coli, pTolT, protein expression
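The TGT→AGT codon swap described above amounts to a frame-aligned, single-codon substitution. The sketch below illustrates just that string operation on a hypothetical DNA fragment (not the actual tolA-III gene, whose sequence is not given here).

```python
# Sketch of the codon substitution described above: replace a cysteine
# codon (TGT) with a serine codon (AGT) at a chosen, frame-aligned
# position. The sequence is a hypothetical fragment, not the tolA-III gene.

def swap_codon(seq, codon_index, old="TGT", new="AGT"):
    """Replace the codon at `codon_index` (0-based, reading-frame aligned)."""
    start = codon_index * 3
    codon = seq[start:start + 3]
    if codon != old:
        raise ValueError(f"expected {old} at codon {codon_index}, got {codon}")
    return seq[:start] + new + seq[start + 3:]

fragment = "ATGTGTGGC"              # Met-Cys-Gly (hypothetical fragment)
mutated = swap_codon(fragment, 1)   # codon 1: TGT (Cys) -> AGT (Ser)
```

Guarding on the expected codon mirrors what the mutagenic primers enforce in practice: the substitution only makes sense at the cysteine codon being targeted.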
Procedia PDF Downloads 378
3931 The Employment of Unmanned Aircraft Systems for Identification and Classification of Helicopter Landing Zones and Airdrop Zones in Calamity Situations
Authors: Marielcio Lacerda, Angelo Paulino, Elcio Shiguemori, Alvaro Damiao, Lamartine Guimaraes, Camila Anjos
Abstract:
Accurate information about the terrain is extremely important in disaster management or conflict activities. This paper proposes the use of Unmanned Aircraft Systems (UAS) for the identification of Airdrop Zones (AZs) and Helicopter Landing Zones (HLZs). In this paper, AZs are zones where troops or supplies are dropped by parachute, and HLZs are areas where victims can be rescued. Digital image processing enables the automatic generation of an orthorectified mosaic and an actual Digital Surface Model (DSM). This methodology provides the information fundamental to post-disaster terrain comprehension in a short amount of time and with good accuracy. For the identification and classification of AZs and HLZs, images from a DJI Phantom 4 drone were used. The images were obtained with the knowledge and authorization of the responsible sectors and were duly registered with the control agencies. The flight was performed on May 24, 2017, and approximately 1,300 images were obtained during approximately 1 hour of flight. Afterward, new attributes were generated from the original images by Feature Extraction (FE). The use of multispectral images and complementary attributes generated independently from them increases the accuracy of classification. The attributes used in this work include the Declivity Map and Principal Component Analysis (PCA). For the classification, four distinct classes were considered: HLZ 1 – small size (18 m x 18 m); HLZ 2 – medium size (23 m x 23 m); HLZ 3 – large size (28 m x 28 m); AZ (100 m x 100 m). The decision tree method Random Forest (RF) was used in this work. RF is a classification method that uses a large collection of de-correlated decision trees, with different random sets of samples used as sampled objects. The classification result from each tree for each object is called a class vote, and the resulting classification is decided by a majority of class votes.
In this case, 200 trees were used for the execution of RF in the software WEKA 3.8. The classification result was visualized in QGIS Desktop 2.12.3. Through this methodology, it was possible to classify in the study area: 6 areas as HLZ 1, 6 areas as HLZ 2, 4 areas as HLZ 3, and 2 areas as AZ. It should be noted that an area classified as AZ covers the classifications of the other classes and may be used as an AZ or as an HLZ for large (HLZ 3), medium (HLZ 2), or small (HLZ 1) helicopters. Likewise, an area classified as an HLZ for large rotary-wing aircraft (HLZ 3) covers the smaller-area classifications, and so on. It was concluded that images obtained with small UAVs are of great use in calamity situations, since they can provide highly accurate data at low cost and low risk, with ease and agility in obtaining aerial photographs. This allows the generation, in a short time, of information about the features of the terrain to serve as an important decision-support tool.
Keywords: disaster management, unmanned aircraft systems, helicopter landing zones, airdrop zones, random forest
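The class-vote mechanism described above reduces to a simple majority count over the per-tree predictions. The sketch below illustrates only that aggregation step; the stub vote lists stand in for the 200 WEKA trees, and the class labels follow the paper's scheme.

```python
# Minimal sketch of Random Forest class voting: each tree casts one vote
# per object, and the majority wins. Vote counts here are hypothetical,
# standing in for the 200 trees trained in WEKA 3.8.
from collections import Counter

def majority_vote(tree_predictions):
    """tree_predictions: one predicted class label per tree."""
    return Counter(tree_predictions).most_common(1)[0][0]

# Hypothetical votes from 200 trees for one candidate area
votes = ["AZ"] * 120 + ["HLZ3"] * 50 + ["HLZ2"] * 30
label = majority_vote(votes)
```

De-correlating the trees (each trained on a different random sample) is what makes the majority vote more reliable than any single tree.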
Procedia PDF Downloads 178
3930 Development of Multi-Leaf Collimator-Based Isocenter Verification Tool Using Electronic Portal Imaging Device for Stereotactic Radiosurgery
Authors: Panatda Intanin, Sangutid Thongsawad, Chirapha Tannanonta, Todsaporn Fuangrod
Abstract:
Stereotactic radiosurgery (SRS) is a high-precision delivery technique that requires comprehensive quality assurance (QA) tests prior to treatment delivery. The isocenter of the delivery beam plays a critical role that affects treatment accuracy. The uncertainty of the isocenter is traditionally assessed using circular cone equipment, a Winston-Lutz (WL) phantom, and film. This technique is considered time-consuming and highly observer-dependent. In this work, the development of a multileaf collimator (MLC)-based isocenter verification tool using an electronic portal imaging device (EPID) is proposed and evaluated. A mechanical isocenter alignment, with a 5 mm diameter ball bearing and a 10 mm diameter circular cone fixed to the gantry head to define the radiation field, was set up as the conventional WL test. The conventional setup was compared with the proposed setup, which uses an MLC-defined field (10 x 10 mm) instead of the cone to define the radiation field. This represents a more realistic delivery field than circular cone equipment. Image acquisition with the electronic portal imaging device (EPID) and radiographic film was performed in both experiments, at gantry angles of 0°, 90°, 180°, and 270°. A software tool was developed in-house using MATLAB/Simulink to determine the centroid of the radiation field and the shadow of the WL phantom automatically, which presents higher accuracy than manual measurement. The deviation between the centroids of the cone-based and MLC-based WL tests was quantified. Comparing film and EPID images, the deviation over all gantry angles was 0.26±0.19 mm for the cone-based and 0.43±0.30 mm for the MLC-based WL test. The absolute deviation between the cone- and MLC-based WL tests was 0.59±0.28 mm on EPID images and 0.14±0.13 mm on film images. The MLC-based isocenter verification using EPID therefore presents a highly sensitive tool for SRS QA.
Keywords: isocenter verification, quality assurance, EPID, SRS
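The automatic centroid step described above can be illustrated with an intensity-weighted mean over pixel coordinates; the WL deviation is then the distance between the field centroid and the phantom-shadow centroid. This is a sketch of the general technique, not the in-house MATLAB/Simulink tool; the synthetic array and shadow centroid are hypothetical stand-ins for an EPID image.

```python
# Sketch of the automatic Winston-Lutz centroid step: the centre of the
# radiation field (and of the ball-bearing shadow) is the intensity-
# weighted mean pixel position; the WL deviation is the distance between
# the two centroids. The array below is synthetic, not EPID data.
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (row, col) of a 2D image."""
    total = image.sum()
    rows, cols = np.indices(image.shape)
    return (rows * image).sum() / total, (cols * image).sum() / total

field = np.zeros((11, 11))
field[3:8, 4:9] = 1.0              # synthetic 5x5 radiation field
r, c = centroid(field)             # field centroid
shadow = (5.2, 6.1)                # hypothetical phantom-shadow centroid
deviation = np.hypot(r - shadow[0], c - shadow[1])  # WL deviation (pixels)
```

In practice the pixel-space deviation would be converted to millimetres using the detector's pixel pitch at the isocenter plane.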
Procedia PDF Downloads 156
3929 Tracy: A Java Library to Render a 3D Graphical Human Model
Authors: Sina Saadati, Mohammadreza Razzazi
Abstract:
Since Java is an object-oriented language, it can be used to solve a wide range of problems. One considerable usage of the language is agent-based modeling and simulation. Despite the significant power of Java, there is no easy way to render a 3-dimensional human model. In this article, we develop a library that helps modelers present a 3D human model and control it from Java. The library runs two server programs. The first is a web page server that can connect to any browser and serve HTML. The second server connects to the browser and controls the movement of the model. The modeler is thus able to develop a simulation and display a good-looking human model without any knowledge of graphical tools.
Keywords: agent-based modeling and simulation, human model, graphics, Java, distributed systems
Procedia PDF Downloads 118
3928 Unsteady Temperature Distribution in a Finite Functionally Graded Cylinder
Authors: A. Amiri Delouei
Abstract:
In the current study, two-dimensional unsteady heat conduction in a functionally graded cylinder is studied analytically. The temperature distribution is resolved in the radial and longitudinal directions. The heat conduction coefficients in both directions are taken as power functions of the radius. The proposed solution exactly satisfies the boundary conditions. The analytical unsteady temperature distribution is investigated for different parameters of the functionally graded cylinder. The exact solution obtained is useful for thermal stress analysis of functionally graded cylinders, and as an analytical result it can be used to understand the concepts of heat conduction in functionally graded materials.
Keywords: functionally graded materials, unsteady heat conduction, cylinder, temperature distribution
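For orientation, a governing equation of the kind described can be sketched as follows, assuming illustrative power-law conductivities k_r(r) = k_{r0} r^m and k_z(r) = k_{z0} r^m in the radial and longitudinal directions; the paper's exact notation and exponents may differ.

```latex
% Generic axisymmetric unsteady heat conduction equation with power-law
% conductivities; symbols and exponent m are illustrative assumptions.
\frac{1}{r}\frac{\partial}{\partial r}\!\left( k_{r0}\, r^{\,m+1}
\frac{\partial T}{\partial r}\right)
+ k_{z0}\, r^{\,m}\, \frac{\partial^2 T}{\partial z^2}
= \rho\, c_p\, \frac{\partial T}{\partial t}
```

The r-dependent coefficients are what distinguish the functionally graded case from a homogeneous cylinder, where both conductivities would be constants.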
Procedia PDF Downloads 303
3927 Adomian’s Decomposition Method to Functionally Graded Thermoelastic Materials with Power Law
Authors: Hamdy M. Youssef, Eman A. Al-Lehaibi
Abstract:
This paper presents an iteration method for the numerical solution of a one-dimensional problem of generalized thermoelasticity with one relaxation time under given initial and boundary conditions. A thermoelastic material with variable properties following a power-law functional grading is considered. Adomian’s decomposition technique is applied to the governing equations, and the numerical results are calculated by the iteration method with a certain algorithm. The results, presented in figures, affirm that Adomian’s decomposition method is a successful method for modeling thermoelastic problems. Moreover, the empirical parameter of the functional grading and the lattice design parameter have significant effects on the temperature increment, the strain, the stress, and the displacement.
Keywords: Adomian, decomposition method, generalized thermoelasticity, algorithm
Procedia PDF Downloads 147
3926 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser
Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett
Abstract:
Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the hydrogen economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence overall efficiency, in particular the two-phase flow through the membrane, gas diffusion layers (GDL), and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility of analysing the processes within a PEMWE in a non-destructive, fast, and cheap in-situ way. This work describes the generation and analysis of AE data from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Different experiments are carried out, each designed so that only specific physical processes occur and AE related solely to one process can be measured. A range of experimental conditions is therefore used to induce different flow regimes within the flow channels and GDL. The resulting AE data is first separated into events, defined by exceeding the noise threshold: each acoustic event consists of a number of consecutive peaks and ends when the wave falls back below the noise threshold. For each acoustic event the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average peak intensity, and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix. This is an easy way of determining which criteria convey the most information in the acoustic data.
In the following, the data are ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in situ. The AE data produced beforehand are used to train a self-learning algorithm and develop an analytical tool to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser
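The event-vector-plus-PCA pipeline described above can be sketched directly from the covariance eigendecomposition; the event matrix below is synthetic random data standing in for the normalized AE feature vectors (the real pipeline would use the six extracted attributes per event).

```python
# Sketch of the described analysis: each acoustic event is a vector of
# normalized features, and PCA ranks the axes by the eigenvalues of the
# covariance matrix. The event matrix here is synthetic, not measured AE.
import numpy as np

rng = np.random.default_rng(0)
events = rng.random((50, 6))       # 50 events x 6 normalized features

centered = events - events.mean(axis=0)
cov = np.cov(centered, rowvar=False)        # 6x6 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
order = np.argsort(eigvals)[::-1]           # largest variance first
components = eigvecs[:, order]
scores = centered @ components[:, :2]       # project onto top-2 PC axes
```

Plotting `scores` per experiment is the "two-dimensional space" of the abstract: clusters occupied by only one experiment's events are what tie an acoustic pattern to a physical process.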
Procedia PDF Downloads 160
3925 Obtain the Stress Intensity Factor (SIF) in a Medium Containing a Penny-Shaped Crack by the Ritz Method
Authors: A. Tavangari, N. Salehzadeh
Abstract:
In crack growth analysis, the Stress Intensity Factor (SIF) is a fundamental prerequisite. In the present study, the mode I SIF of a three-dimensional penny-shaped crack is obtained in an isotropic elastic cylindrical medium of arbitrary dimensions under arbitrary loading at the top of the cylinder, by a semi-analytical method based on the Rayleigh-Ritz method. This method, based on minimizing the total potential energy of the system, gives results very close to those of previous studies. Defining the displacements (elastic fields) by trial functions in a defined coordinate system is the basis of this research; accordingly, appropriate terms must be found to create the singularity conditions at the crack tip.
Keywords: penny-shaped crack, stress intensity factor, fracture mechanics, Ritz method
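The classical benchmark such semi-analytical results are usually checked against is Sneddon's closed-form solution for a penny-shaped crack of radius a in an infinite body under remote uniform tension σ normal to the crack plane (the paper's finite cylinder generalizes this case):

```latex
% Sneddon's result for a penny-shaped crack, radius a, remote tension sigma:
K_I = \frac{2}{\pi}\,\sigma\sqrt{\pi a} \;=\; 2\sigma\sqrt{\frac{a}{\pi}}
```

Finite cylinder dimensions and non-uniform loading modify this value, which is precisely what an energy-minimizing Ritz approach with singularity-capturing trial terms can quantify.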
Procedia PDF Downloads 368
3924 A Study Regarding Nanotechnologies as a Vector of New European Business Model
Authors: Adriana Radan Ungureanu
Abstract:
The industrial landscape is changing due to the financial crises, poor availability of raw materials, new discoveries, and interdisciplinary collaborations. New ideas shape change through technologies and bring responses for a better life. The process of change is led by big players such as states and companies, but they cannot keep their places on the market without the help of the small ones. The main tool of change is technology, and the entire developed world has dedicated efforts in this direction for decades. Even though expectations have not yet been met, the search for adequate solutions is far from over. A relevant example is nanotechnology, where most discoveries still remain in the laboratory and have not yet found their way to the market. Faced with this situation, the right question could be: "Is it worth investing in nanotechnology in the name of an uncertain future but with very little impact on the present?" This paper tries to find a positive answer through a three-dimensional approach, using a descriptive analysis based on the available databases supplied by European case studies, reports, and literature.
Keywords: Europe, KETs, nanotechnology, technology
Procedia PDF Downloads 420
3923 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
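The ensembling step of the framework (averaging the outputs of the three top-performing models) can be sketched as follows. The class names and probability vectors are hypothetical stand-ins for the logistic regression, random forest, and neural network outputs; this illustrates the averaging principle, not the paper's trained models.

```python
# Sketch of the framework's ensembling step: class probabilities from the
# three top models are averaged to damp any single model's bias, then the
# highest average probability wins. All values are hypothetical.
import numpy as np

CLASSES = ["good", "moderate", "unhealthy"]

def ensemble_predict(model_probs):
    """Average class probabilities over models; return top class and avg."""
    avg = np.mean(model_probs, axis=0)
    return CLASSES[int(np.argmax(avg))], avg

probs = [
    [0.6, 0.3, 0.1],   # e.g. logistic regression
    [0.5, 0.4, 0.1],   # e.g. random forest
    [0.7, 0.2, 0.1],   # e.g. neural network
]
label, avg = ensemble_predict(probs)
```

Averaging de-correlates the individual models' errors, which is why the combined model can outperform each member, as the abstract reports.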
Procedia PDF Downloads 135
3922 Heat Distribution Simulation on Transformer Using FEMM Software
Authors: N. K. Mohd Affendi, T. A. R. Tuan Abdullah, S. A. Syed Mustaffa
Abstract:
In the power industry the transformer is an important component, and most of us are familiar with the electrical operating principle of a transformer. Many losses occur during the operation of a transformer, causing heat generation. This heat, if not dissipated properly, will reduce the lifetime and effectiveness of the transformer. Transformer cooling helps limit the temperature rise along its various heat paths. This paper proposes minimizing the ambient temperature of the transformer room in order to lower the temperature of the transformer. A simulation was made using the finite element program FEMM (Finite Element Method Magnetics) to create a virtual model based on actual measurements of a transformer. The two-dimensional (2D) FEMM results show that by minimizing the ambient temperature, the heat of the transformer is decreased. The modeling process and the resulting transformer heat flow are presented.
Keywords: heat generation, temperature rise, ambient temperature, FEMM
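FEMM itself solves the heat problem by finite elements; as a crude stand-in, the finite-difference relaxation below demonstrates the effect the paper exploits: lowering the ambient (boundary) temperature lowers the steady-state interior temperature around a fixed heat source. The geometry, temperatures, and "winding" hot spot are all hypothetical.

```python
# Crude 2D steady-state heat sketch (finite differences, NOT FEMM's FEM):
# a fixed-temperature hot spot in a square room whose walls are held at
# the ambient temperature. Lower ambient -> lower interior temperatures.
import numpy as np

def steady_temp(ambient, source_temp=80.0, n=21, iters=2000):
    t = np.full((n, n), ambient, dtype=float)
    for _ in range(iters):
        # Jacobi update: each interior cell moves toward neighbour average
        t[1:-1, 1:-1] = 0.25 * (t[:-2, 1:-1] + t[2:, 1:-1]
                                + t[1:-1, :-2] + t[1:-1, 2:])
        t[n // 2, n // 2] = source_temp   # fixed-temperature "winding"
        t[0, :] = t[-1, :] = t[:, 0] = t[:, -1] = ambient  # room walls
    return t

hot_room = steady_temp(ambient=35.0)
cool_room = steady_temp(ambient=20.0)
```

Comparing `hot_room` and `cool_room` shows the monotone dependence of the interior temperature field on the ambient boundary value, the qualitative conclusion the 2D FEMM simulation reaches.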
Procedia PDF Downloads 408
3921 Contribution to the Evaluation of Uncertainties of Measurement to the Data Processing Sequences of a CMM
Authors: Hassina Gheribi, Salim Boukebbab
Abstract:
The measurement of parts manufactured on a CMM (coordinate measuring machine) is based on associating a surface of perfect geometry with the set of probed points, via a mathematical calculation of the distances between the probed points and that surface. Since real surfaces are never perfect, they are measured with a number of points higher than the minimum necessary to define them mathematically. The central problems of three-dimensional metrology are therefore the estimation of the orientation, location, and intrinsic parameters of this surface. Including the numerical uncertainties attached to these parameters helps the metrologist decide whether the part conforms to the specifications fixed on the design drawing. In this paper, we present a data-processing model in Visual Basic 6 which automatically determines all of these parameters and their uncertainties.
Keywords: coordinate measuring machines (CMM), associated surface, uncertainties of measurement, acquisition and modeling
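The association step described above can be illustrated for the simplest feature, a plane: a least-squares plane is fitted to the probed points and the signed point-to-surface distances, whose spread feeds the uncertainty estimate, are computed. This is a generic sketch (in Python with NumPy rather than the paper's Visual Basic 6 tool), and the point cloud is synthetic, not CMM data.

```python
# Sketch of associating an ideal surface (here a plane) with probed points:
# least-squares fit via SVD, then signed point-to-plane residuals whose
# spread feeds the measurement uncertainty. Points are synthetic.
import numpy as np

def fit_plane(points):
    """Least-squares plane through `points` (N x 3): returns the centroid
    and the unit normal (smallest-singular-value direction of the
    centered cloud)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[-1]

rng = np.random.default_rng(1)
xy = rng.random((30, 2))
z = 0.5 * xy[:, 0] - 0.2 * xy[:, 1] + 0.001 * rng.standard_normal(30)
pts = np.column_stack([xy, z])            # near-planar probed points
centroid, normal = fit_plane(pts)
residuals = (pts - centroid) @ normal     # signed point-to-plane distances
```

More points than the mathematical minimum (three, for a plane) are probed precisely so that these residuals exist and the fit parameters carry a quantifiable uncertainty.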
Procedia PDF Downloads 331
3920 Study on Accurate Calculation Method of Model Attitude in Wind Tunnel Tests
Authors: Jinjun Jiang, Lianzhong Chen, Rui Xu
Abstract:
The accuracy of the model attitude angle plays an important role in the aerodynamic results of a wind tunnel test. The original method applies a spherical coordinate system transformation: the model attitude angle is obtained by coordinate transformation and spherical surface mapping from the nominal attitude angle (the balance attitude angle in the wind tunnel coordinate system) indicated by the mechanism. First, the coordinate transformation of this method is not only complex but also makes it difficult to establish the relationship between the space coordinate systems, especially after many steps of transformation; moreover, it cannot realize the iterative calculation of the interference between attitude angles. Second, during the solution process, the arc is approximately replaced by a straight line, the angle by its tangent value, and inverse trigonometric functions are applied. The calculation of the attitude angle is therefore complex and inaccurate, and the approximation is acceptable only for small angles of attack.
However, with the development of modern unsteady aerodynamic research, aircraft tend toward high or very large angles of attack and unsteady flight regimes. Based on engineering practice and vector theory, the concept of a vector angle coordinate system is proposed for the first time, and the vector angle coordinate system of attitude angles is established. With iterative correction, and by avoiding the approximations and inverse trigonometric function solutions, the model attitude calculation is carried out in detail, which validates that the accuracy of the model attitude angles is improved. The framework gives the transformation and angle definition relations between different flight attitude coordinate systems, so that the attitude angle of the corresponding coordinate system can be accurately calculated and its direction determined. In particular, in channel coupling calculations, the attitude angle between coordinate systems depends only on the angle itself and not on the order of coordinate system changes, which simplifies the calculation process.
Keywords: attitude angle, vector angle coordinate system, iterative calculation, spherical coordinate system, wind tunnel test
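The vector-based idea can be sketched as follows: instead of chaining spherical-surface mappings and small-angle approximations, attitude angles are recovered from transformed direction vectors with `arctan2`, exactly and regardless of angle size. The rotation order, sign conventions, and angles below are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def rot_y(pitch):
    """Rotation about the y-axis (standard right-handed convention)."""
    c, s = np.cos(pitch), np.sin(pitch)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(yaw):
    """Rotation about the z-axis."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def attitude_from_axis(x_axis):
    """Recover pitch and yaw directly from the body x-axis vector,
    with no tangent or arc approximations (sign convention assumed:
    nose-up pitch positive)."""
    x, y, z = x_axis
    yaw = np.arctan2(y, x)
    pitch = np.arctan2(-z, np.hypot(x, y))
    return pitch, yaw

# Rotate the body x-axis by a 10 deg pitch then a 5 deg yaw, then recover
# the angles exactly from the resulting vector.
R = rot_z(np.radians(5.0)) @ rot_y(np.radians(10.0))
pitch, yaw = attitude_from_axis(R @ np.array([1.0, 0.0, 0.0]))
```

Because the angles are read off a vector, the recovery works identically at large angles of attack, which is the regime the abstract targets.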
Procedia PDF Downloads 158
3919 Stochastic Default Risk Estimation: Evidence from the South African Financial Market
Authors: Mesias Alfeus, Kirsty Fitzhenry, Alessia Lederer
Abstract:
The present paper provides empirical studies to estimate defaultable bonds in the South African financial market. The main goal is to estimate the unobservable factors affecting bond yields for major South African banks. A maximum likelihood approach is adopted for the estimation methodology, and extended Kalman filtering techniques are employed because the factors cannot be observed directly. Multi-dimensional Cox-Ingersoll-Ross (CIR)-type factor models are considered. Results show that default risk increased sharply in the South African financial market during COVID-19 and that the CIR model with jumps exhibits better performance.
Keywords: default intensity, unobservable state variables, CIR, α-CIR, extended Kalman filtering
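The CIR-type dynamics underlying such factor models can be sketched with an Euler scheme (full truncation inside the square root keeps the diffusion well defined). The parameters below are illustrative, not estimates from the South African data, and the Kalman filtering and likelihood steps are omitted.

```python
import numpy as np

def simulate_cir(r0, kappa, theta, sigma, dt, n_steps, seed=0):
    """Euler discretization of the CIR process
        dr = kappa * (theta - r) dt + sigma * sqrt(r) dW,
    with full truncation (max(r, 0) inside the square root).
    All parameter values are hypothetical."""
    rng = np.random.default_rng(seed)
    r = np.empty(n_steps + 1)
    r[0] = r0
    for t in range(n_steps):
        sqrt_r = np.sqrt(max(r[t], 0.0))
        r[t + 1] = (r[t] + kappa * (theta - r[t]) * dt
                    + sigma * sqrt_r * np.sqrt(dt) * rng.standard_normal())
    return r

# Ten years of daily steps: the path mean-reverts toward theta = 0.05.
path = simulate_cir(r0=0.02, kappa=1.5, theta=0.05, sigma=0.1,
                    dt=1 / 252, n_steps=2520)
```

In a filtering setup, paths like this play the role of the latent state, with bond yields as the noisy observations fed to the extended Kalman filter.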
Procedia PDF Downloads 117
3918 Heat Transfer from Block Heat Sources Mounted on the Wall of a 3-D Cabinet to Ambient Natural Convective Air Stream
Authors: J. C. Cheng, Y. L. Tsay, Z. D. Chan, C. H. Yang
Abstract:
In this study, the physical system under consideration is a three-dimensional (3-D) cabinet with arrays of block heat sources mounted on one of its walls. The block heat sources dissipate heat to the cabinet surroundings through conjugate conduction and natural convection. The results illustrate that the difference in hot spot temperatures of the system (θH) between the situations with and without consideration of thermal interaction is higher for smaller Rayleigh numbers (Ra), and can reach 94.73% at Ra = 10^5. In addition, the heat transfer characteristics depend strongly on the dimensionless heat conductivity of the cabinet wall (Kwf), the heat conductivity of the block (Kpf), and the length of the cabinet (Ax). The maximum reduction in θH is 70.01% when Kwf varies from 10 to 1000, and 30.07% when Ax varies from 0.5 to 1, while the hot spot temperature of the system is not sensitive to the cabinet angle (Φ).
Keywords: block heat sources, 3-D cabinet, thermal interaction, heat transfer
Procedia PDF Downloads 556
3917 Numerical and Experimental Investigation of Mixed-Mode Fracture of Cement Paste and Interface under Three-Point Bending Test
Authors: S. Al Dandachli, F. Perales, Y. Monerie, F. Jamin, M. S. El Youssoufi, C. Pelissou
Abstract:
The goal of this research is to study the fracture process and mechanical behavior of concrete under mixed-mode I-II stress, which is essential for ensuring the safety of concrete structures. For this purpose, two-dimensional simulations of three-point bending tests, under variable load and geometry, on notched cement paste samples and composite samples (cement paste/siliceous aggregate) are modeled by employing Cohesive Zone Models (CZMs). Validated against the experiments, the CZM demonstrates its capacity to predict fracture propagation at the local scale.
Keywords: cement paste, interface, cohesive zone model, fracture, three-point bending test
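The traction-separation relation at the heart of a cohesive zone model can be sketched as a bilinear law: linear elastic loading up to a peak traction, then linear softening to complete debonding. The numerical values below are illustrative, not the paper's calibrated parameters.

```python
def bilinear_traction(delta, delta0=0.01, delta_f=0.1, t_max=3.0):
    """Bilinear traction-separation law typical of cohesive zone models:
    linear elastic up to (delta0, t_max), then linear softening to zero
    traction at the failure opening delta_f. Units and magnitudes are
    hypothetical (e.g. mm and MPa)."""
    if delta <= 0.0:
        return 0.0          # no opening, no cohesive traction
    if delta <= delta0:
        return t_max * delta / delta0              # elastic branch
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)  # softening
    return 0.0              # fully debonded surface
```

The area under this curve is the fracture energy; in mixed-mode I-II simulations, separate normal and shear laws of this shape are coupled through a mode-mixity criterion.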
Procedia PDF Downloads 156
3916 The Bloom of 3D Printing in the Health Care Industry
Authors: Mihika Shivkumar, Krishna Kumar, C. Perisamy
Abstract:
3D printing is a method of manufacturing wherein materials, such as plastic or metal, are deposited in layers, one on top of the other, to produce a three-dimensional object. 3D printing is most commonly associated with creating engineering prototypes, but its applications in the field of human health care have frequently been disregarded. Medical applications for 3D printing are expanding rapidly and are envisaged to revolutionize health care. Present and potential medical applications for 3D printing can be categorized broadly as: creation of customized prosthetics, tissue and organ fabrication, creation of implants and anatomical models, and pharmaceutical research regarding drug dosage forms. This piece surveys bioprinting in the health care sector, focusing on the finer details of each topic, including how 3D printing functions at present, its limitations, and its future applications.
Keywords: bio-printing, prototype, drug delivery, organ regeneration
Procedia PDF Downloads 274
3915 Study of Transport Phenomena in Photonic Crystals with Correlated Disorder
Authors: Samira Cherid, Samir Bentata, Feyza Zahira Meghoufel, Yamina Sefir, Sabria Terkhi, Fatima Bendahma, Bouabdellah Bouadjemi, Ali Zitouni
Abstract:
Using the transfer-matrix technique and the Kronig-Penney model, we numerically and analytically investigate the effect of short-range correlated disorder, in the random dimer model (RDM), on the transmission properties of light in one-dimensional photonic crystals made of three different materials. Such systems consist of two different structures randomly distributed along the growth direction, with the additional constraint that one kind of layer appears in pairs. It is shown that one-dimensional random dimer photonic crystals support two types of extended modes. By shifting the dimer resonance toward the fundamental stationary resonance state of the host, we demonstrate the existence of a ballistic response in these systems.
Keywords: photonic crystals, disorder, correlation, transmission
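The transfer-matrix technique named above can be sketched for normal-incidence transmission through a layered stack by multiplying each layer's characteristic matrix. The example below uses a single hypothetical quarter-wave layer, not the paper's three-material random dimer structure.

```python
import numpy as np

def transmission(layers, wavelength, n_in=1.0, n_out=1.0):
    """Transfer-matrix transmittance of light at normal incidence through
    a stack of (refractive_index, thickness) layers. Standard textbook
    formulation; layer parameters here are illustrative."""
    k0 = 2.0 * np.pi / wavelength
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = k0 * n * d  # phase thickness of the layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    denom = (n_in * M[0, 0] + n_in * n_out * M[0, 1]
             + M[1, 0] + n_out * M[1, 1])
    t = 2.0 * n_in / denom
    return (n_out / n_in) * abs(t) ** 2

# A quarter-wave layer of index 2 in air (n*d = wavelength/4) gives
# T = 1 - ((1 - 2**2) / (1 + 2**2))**2 = 0.64.
T = transmission([(2.0, 0.125)], wavelength=1.0)
```

For the random dimer study, one would build the layer list stochastically, with the dimer constraint that one species always appears in pairs, and scan the transmission over frequency.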
Procedia PDF Downloads 483
3914 Intelligent Human Pose Recognition Based on EMG Signal Analysis and Machine 3D Model
Authors: Si Chen, Quanhong Jiang
Abstract:
Posture recognition technology is increasingly mature, and human movement information is now widely used in sports rehabilitation, human-computer interaction, medical health, human posture assessment, and other fields. This project proposes to collect myoelectric (EMG) data with dedicated acquisition equipment, map muscle activity onto a posture change in one degree of freedom through data processing, jointly adjust the data and a three-dimensional muscle model, and thereby realize basic pose recognition. On this basis, bionic aids or medical rehabilitation equipment can be further developed with the help of robotic arms and other cutting-edge technology, a direction with a bright future and substantial room for development.
Keywords: pose recognition, 3D animation, electromyography, machine learning, bionics
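A common first step in EMG data processing of this kind is extracting a moving-RMS envelope from the raw signal before mapping muscle activity onto a degree of freedom. A minimal sketch, with a hypothetical signal and window length.

```python
import numpy as np

def rms_envelope(signal, window=50):
    """Moving-RMS envelope of a raw EMG signal: square, average over a
    sliding window, take the square root. The window length is an
    illustrative choice, not a value from the project above."""
    s = np.asarray(signal, dtype=float) ** 2
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(s, kernel, mode="same"))

# Synthetic EMG: low-level noise with a burst of muscle activity in the
# middle; the envelope should rise sharply over the burst.
rng = np.random.default_rng(1)
emg = rng.standard_normal(1000) * 0.1
emg[400:600] += rng.standard_normal(200) * 1.0
env = rms_envelope(emg)
```

The envelope value (rather than the raw oscillating signal) is what would be fed to a joint-angle mapping or a machine learning classifier.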
Procedia PDF Downloads 83
3913 CuFeOx-Based Nano-Rose Electrocatalysts for Oxygen Evolution Reaction
Authors: Hamad Almohamadi, Nabeel H. Alharthi, Abdulrahman Aljabri
Abstract:
In this study, two-dimensional CuFeOx is deposited on nickel foam to fabricate an electrocatalyst for the oxygen evolution reaction (OER). The in-situ hydrothermal synthesis of CuFeOx in the presence of aloe vera extract was found to yield a unique nano-rose-like morphology, which helped improve the electrochemical surface area of the electrode. Phytochemical-assisted synthesis of CuFeOx using 75% aloe vera extract resulted in improved OER electrocatalytic performance, attaining an overpotential of 310 mV at 50 mA cm−2 and 410 mV at 100 mA cm−2. The electrode also sustained robust stability throughout 50 h of chronopotentiometry studies under alkaline electrolyte conditions, thus proving to be a prospective electrode material for efficient OER in electrochemical water splitting.
Keywords: water splitting, phytochemicals, oxygen evolution reaction, Tafel slope, stability
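Overpotential-versus-current data of the kind reported above are commonly summarized by a Tafel slope: the slope of overpotential against log10(current density). A minimal sketch with synthetic data assumed to follow a 100 mV/dec slope; these are not the paper's measurements.

```python
import numpy as np

def tafel_slope(current_density, overpotential):
    """Fit overpotential vs log10(current density); the slope is the
    Tafel slope (mV/dec when overpotential is in mV). Inputs synthetic."""
    log_j = np.log10(np.asarray(current_density, dtype=float))
    slope, intercept = np.polyfit(log_j, np.asarray(overpotential, float), 1)
    return slope, intercept

# Hypothetical OER-like data built with a 100 mV/dec Tafel slope:
j = np.array([1.0, 10.0, 50.0, 100.0])        # mA cm^-2
eta = np.array([210.0, 310.0, 380.0, 410.0])  # mV
slope, _ = tafel_slope(j, eta)
```

A smaller Tafel slope indicates faster OER kinetics, which is why the abstract lists it among the electrode's figures of merit.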
Procedia PDF Downloads 120
3912 Constructing a Semi-Supervised Model for Network Intrusion Detection
Authors: Tigabu Dagne Akal
Abstract:
While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data were then pre-processed; the major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating data containing both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation tasks such as discretization. A total of 21,533 intrusion records are used for training the models, and a separate set of 3,397 records is used for validating the performance of the selected model. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross validation with the J48 decision tree algorithm and default parameter values showed the best classification accuracy: 96.11% on the training datasets and 93.2% on the test dataset, classifying new instances into the normal, DOS, U2R, R2L, and probe classes.
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are suggested toward an applicable system in the area of study.
Keywords: intrusion detection, data mining, computer science
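The 10-fold cross-validation scheme described above can be sketched without any data-mining toolkit, using a deliberately simple nearest-centroid classifier in place of J48 and synthetic two-class data in place of the Lincoln Laboratory records.

```python
import numpy as np

def nearest_centroid_predict(X_train, y_train, X_test):
    """Tiny stand-in classifier (not J48): assign each test point to the
    class with the nearest training-class centroid."""
    classes = np.unique(y_train)
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

def kfold_accuracy(X, y, k=10, seed=0):
    """k-fold cross-validated accuracy: shuffle, split into k folds,
    train on k-1 folds and test on the held-out fold, then average."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = nearest_centroid_predict(X[train], y[train], X[test])
        accs.append(np.mean(pred == y[test]))
    return float(np.mean(accs))

# Two well-separated synthetic classes (think "normal" vs "attack"):
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(5, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)
acc = kfold_accuracy(X, y)
```

The study's reported accuracies come from the same protocol, just with a decision-tree learner and the much larger 21,533-record training set.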
Procedia PDF Downloads 300
3911 Role of von Willebrand Factor Antigen as Non-Invasive Biomarker for the Prediction of Portal Hypertensive Gastropathy in Patients with Liver Cirrhosis
Authors: Mohamed El Horri, Amine Mouden, Reda Messaoudi, Mohamed Chekkal, Driss Benlaldj, Malika Baghdadi, Lahcene Benmahdi, Fatima Seghier
Abstract:
Background/aim: Recently, the von Willebrand factor antigen (vWF-Ag) has been identified as a new marker of portal hypertension (PH) and its complications, and a few studies have addressed its role in the prediction of esophageal varices. vWF-Ag is considered a non-invasive approach that spares patients the burden, cost, drawbacks, and unpleasant, repeated examinations of endoscopy. In our study, we aimed to evaluate the ability of this marker to predict another complication of portal hypertension, portal hypertensive gastropathy (PHG), which is also diagnosed endoscopically. Patients and methods: This is a prospective study including 124 cirrhotic patients with no history of bleeding who underwent screening endoscopy for PH-related complications such as esophageal varices (EVs) and PHG. Routine biological tests were performed, as well as vWF-Ag testing by both ELFA and immunoturbidimetric techniques. The diagnostic performance of the marker was assessed using sensitivity, specificity, positive predictive value, negative predictive value, accuracy, and receiver operating characteristic curves. Results: 124 patients were enrolled in this study, with a mean age of 58 years [CI: 55 - 60 years] and a sex ratio of 1.17. Viral etiologies were found in 50% of patients. Screening endoscopy revealed the presence of PHG in 20.2% of cases, while EVs were found in 83.1% of cases. vWF-Ag levels were significantly increased in patients with PHG compared to those without: 441% [CI: 375 - 506] versus 279% [CI: 253 - 304], respectively (p < 0.0001). Using the area under the receiver operating characteristic curve (AUC), vWF-Ag was a good predictor of the presence of PHG: at a cutoff above 320%, with an AUC of 0.824, vWF-Ag had 84% sensitivity, 74% specificity, 44.7% positive predictive value, 94.8% negative predictive value, and 75.8% diagnostic accuracy.
Conclusion: vWF-Ag is a good non-invasive, low-cost marker for excluding the presence of PHG in patients with liver cirrhosis. Using this marker as part of a selective screening strategy might reduce the need for endoscopic screening and the cost of managing these patients.
Keywords: von Willebrand factor, portal hypertensive gastropathy, prediction, liver cirrhosis
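The diagnostic metrics reported above (sensitivity, specificity, PPV, NPV, accuracy) all follow from the confusion counts at a chosen cutoff. A minimal sketch on a small synthetic cohort (not the study's 124 patients), using the same kind of 320% cutoff.

```python
import numpy as np

def diagnostic_metrics(values, has_disease, cutoff):
    """Sensitivity, specificity, PPV, NPV, and accuracy of a biomarker
    at a given cutoff (here: vWF-Ag% above cutoff flags PHG).
    The cohort data are synthetic, for illustration only."""
    values = np.asarray(values, dtype=float)
    truth = np.asarray(has_disease, dtype=bool)
    pred = values > cutoff
    tp = np.sum(pred & truth)    # true positives
    tn = np.sum(~pred & ~truth)  # true negatives
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / len(truth),
    }

# Synthetic cohort: PHG patients tend to have higher vWF-Ag levels.
vwf = [450, 430, 390, 300, 280, 260, 340, 250]
phg = [1, 1, 1, 0, 0, 0, 0, 0]
m = diagnostic_metrics(vwf, phg, cutoff=320)
```

The high NPV in the synthetic example mirrors the study's point: the marker is most useful for ruling PHG out.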
Procedia PDF Downloads 212
3910 Combining Multiscale Patterns of Weather and Sea States into a Machine Learning Classifier for Mid-Term Prediction of Extreme Rainfall in North-Western Mediterranean Sea
Authors: Pinel Sebastien, Bourrin François, De Madron Du Rieu Xavier, Ludwig Wolfgang, Arnau Pedro
Abstract:
Heavy precipitation constitutes a major meteorological threat in the western Mediterranean. Research has investigated the relationship between the states of the Mediterranean Sea and the atmosphere and precipitation over short temporal windows, but at larger temporal scales the precursor signals of heavy rainfall in the sea and atmosphere have drawn little attention. Moreover, despite ongoing improvements in numerical weather prediction, medium-term forecasting of rainfall events remains a difficult task. Here, we aim to investigate the influence of early-spring environmental parameters on the following autumnal heavy precipitation. Hence, we develop a machine learning model to predict extreme autumnal rainfall with a 6-month lead time over the Spanish Catalan coastal area, based on i) the sea pattern (main current, LPC, and sea surface temperature, SST) at the mesoscale, ii) four European weather teleconnection patterns (NAO, WeMo, SCAND, MO) at the synoptic scale, and iii) the hydrological regime of the main local river (the Rhône River). The accuracy of the developed classifier is evaluated via statistical analysis based on classification accuracy, logarithmic loss, and the confusion matrix, by comparison with rainfall estimates from rain gauges and satellite observations (CHIRPS-2.0). Sensitivity tests are carried out by changing the model configuration, such as sea SST, sea LPC, river regime, and synoptic atmospheric configuration. The sensitivity analysis suggests a negligible influence of the hydrological regime, unlike SST, LPC, and specific teleconnection weather patterns. Finally, this study illustrates how public datasets can be integrated into a machine learning model for heavy rainfall prediction and may interest local policymakers for management purposes.
Keywords: extreme hazards, sensitivity analysis, heavy rainfall, machine learning, sea-atmosphere modeling, precipitation forecasting
Procedia PDF Downloads 141
3909 Finite Element Modeling of Integral Abutment Bridge for Lateral Displacement
Authors: M. Naji, A. R. Khalim, M. Naji
Abstract:
Integral abutment bridges (IABs) are simple or multiple span bridges in which the bridge deck is cast monolithically with the abutment walls. This kind of bridge is becoming very popular due to aspects such as good response under seismic loading, low initial cost, elimination of bearings, and reduced maintenance. However, the main issue in the analysis of this type of structure is the soil-structure interaction of the abutment walls and the supporting piles. A two-dimensional, non-linear finite element (FE) model of an integral abutment bridge has been developed to study the effect of lateral time-history displacement loading on the soil system.
Keywords: integral abutment bridge, soil-structure interaction, finite element modeling, soil-pile interaction
Procedia PDF Downloads 293
3908 Empirical Study and Modelling of Three-Dimensional Pedestrian Flow in Railway Foot-Over-Bridge Stair
Authors: Ujjal Chattaraj, M. Raviteja, Chaitanya Aemala
Abstract:
Over the years, vehicular traffic has been given priority over pedestrian traffic, yet with the growth of city populations, pedestrian traffic is increasing day by day. Pedestrian safety has become a matter of concern for traffic engineers, and pedestrian comfort is of primary importance to the engineers who design pedestrian facilities. Pedestrian comfort and safety can be measured in terms of the level of service (LOS) of the facilities. In this study, video data on pedestrian movement were collected from different railway foot over bridges (FOBs) in India, and the level of service of those facilities was analyzed. A cellular automata based model has been formulated to mimic the route choice behaviour of pedestrians on the foot over bridges.
Keywords: cellular automata model, foot over bridge, level of service, pedestrian
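The cellular automata idea behind such models can be sketched in its simplest form: pedestrians occupy lattice cells and advance when the cell ahead is free. This toy 1-D update is far simpler than the paper's model; the lattice and occupancy pattern are made up.

```python
import numpy as np

def step(lattice):
    """One update of a minimal 1-D cellular automaton for pedestrian flow
    on a stair: a pedestrian (1) advances one cell if the cell ahead is
    empty (0). Cells are updated back to front, so a cell freed in the
    same sweep can be re-occupied. Pedestrian count is conserved."""
    new = lattice.copy()
    for i in range(len(lattice) - 2, -1, -1):
        if lattice[i] == 1 and new[i + 1] == 0:
            new[i + 1] = 1
            new[i] = 0
    return new

# Three pedestrians entering a stair of eight cells, advanced four steps.
lattice = np.array([1, 1, 0, 1, 0, 0, 0, 0])
states = [lattice]
for _ in range(4):
    states.append(step(states[-1]))
```

Real pedestrian CA models add lateral lanes, bidirectional flow, and route-choice rules, but density-dependent congestion, the basis of LOS grading, already emerges from rules this simple.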
Procedia PDF Downloads 267