Search results for: component analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29426

29096 Effectivity Analysis of The Decontamination Products for Radioactive 99mTc Used in Nuclear Medicine

Authors: Hayrettin Eroglu, Oguz Aksakal

Abstract:

In this study, it is analysed which decontamination products are most effective and how the decontamination process should be performed in the case of contamination with radioactive 99mTc, the most common radioactive element used in nuclear applications dealing with the human body or the environment. Based on the study, it is observed that existing radioactive washers are less effective than expected, that alcohol has no effect on the decontamination of 99mTc, and that temperature and pH are the most important factors. In the light of the analysis, it is concluded that the most effective decontamination product is DM-D (Decontamination Material-D). When the effect of DM-D on surfaces is analysed, it is observed that decontamination is very fast on scrubs and formica with both DM-D and water, and that although DM-D is very effective on skin, it is not effective on ceramic tiles and plastic floor-covering material. The effectiveness of different molecular groups in the decontaminant was also investigated. As a result, the acetate group was observed to be the most effective component of the decontaminant.

Keywords: contamination, radioactive, technetium, decontamination

Procedia PDF Downloads 405
29095 Modeling the Transport of Charge Carriers in the Active Devices MESFET Based on GaInP by the Monte Carlo Method

Authors: N. Massoum, A. Guen. Bouazza, B. Bouazza, A. El Ouchdi

Abstract:

Progress in the integrated-circuit industry in recent years has been driven by the continuous miniaturization of transistors. With component dimensions reduced to 0.1 micron and below, new physical effects come into play that standard two-dimensional (2D) simulators do not consider. In fact, the third dimension comes into play because the transverse and longitudinal dimensions of the components are of the same order of magnitude. To describe the operation of such components with greater fidelity, simulation tools must be refined and adapted to take these phenomena into account. After an analytical study of the static characteristics of the component in its different operating modes, a numerical simulation of a field-effect transistor with a submicron gate (a GaInP MESFET) is performed. The influence of the gate length is studied. The results are used to determine the optimal geometric and physical parameters of the component for its specific applications and uses.
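The ensemble Monte Carlo idea behind such simulators can be illustrated with a deliberately minimal sketch. Everything here is a placeholder rather than the paper's GaInP model: a single parabolic valley, a constant total scattering rate, and complete momentum randomization at each scattering event (so each free flight starts from zero drift on average).

```python
import math
import random

Q = 1.602e-19              # elementary charge (C)
M_EFF = 0.08 * 9.109e-31   # assumed effective mass (kg); illustrative only
GAMMA = 1e13               # assumed constant total scattering rate (1/s)
E_FIELD = 1e5              # applied electric field (V/m)

def drift_velocity(n_flights=200000, seed=1):
    """Time-averaged drift speed from a sequence of free flights.

    Each flight lasts an exponentially distributed time; the electron
    accelerates ballistically in between, and scattering is assumed to
    randomize the momentum completely (zero mean drift after each event).
    """
    rng = random.Random(seed)
    a = Q * E_FIELD / M_EFF            # acceleration magnitude along the field
    dist = 0.0
    time = 0.0
    for _ in range(n_flights):
        # free-flight duration drawn from Exp(GAMMA); 1 - random() avoids log(0)
        tau = -math.log(1.0 - rng.random()) / GAMMA
        dist += 0.5 * a * tau * tau    # distance covered during the flight
        time += tau
    return dist / time

print(f"estimated drift velocity: {drift_velocity():.3e} m/s")
```

For exponentially distributed flight times this estimator converges to a/GAMMA, roughly 2.2e4 m/s with the placeholder numbers above; a real device simulator adds energy-dependent scattering rates, multiple valleys, and the device geometry.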

Keywords: Monte Carlo simulation, transient electron transport, MESFET device, GaInP

Procedia PDF Downloads 420
29094 Aspects of the Detail Design of an Automated Biomethane Test

Authors: Ilias Katsanis, Paraskevas Papanikos, Nikolas Zacharopoulos, Vassilis C. Moulianitis, Evgenios Scourboutis, Diamantis T. Panagiotarakos

Abstract:

This paper presents aspects of the detailed design of an automated biomethane potential (BMP) measurement system using CAD techniques. First, the design specifications, grouped into eight sets used to generate the design alternatives, are briefly presented. Then, the major components of the final concept, as well as the design of the test, are presented. Material selection is performed using the ANSYS EduPack database software. The mechanical behavior of one component, developed in Creo v.5, is evaluated using finite element analysis. Finally, aspects of the software development that integrates the BMP test are presented. This paper shows the advantages of CAD techniques in product design, applied here to the design of a mechatronic product.

Keywords: automated biomethane test, detail mechatronics design, materials selection, mechanical analysis

Procedia PDF Downloads 89
29093 Organizational Learning Strategies for Building Organizational Resilience

Authors: Stephanie K. Douglas, Gordon R. Haley

Abstract:

Organizations face increasing disruptions, changes, and uncertainties through the rapid shifts in the economy and business environment. A capacity for resilience is necessary for organizations to survive and thrive in such adverse conditions. Learning is an essential component of an organization's capability for building resilience. Strategic human resource management is a principal component of learning and organizational resilience. To achieve organizational resilience, human resource management strategies must support individual knowledge, skills, and ability development through organizational learning. This study aimed to contribute to the comprehensive knowledge of the relationship between strategic human resource management and organizational learning to build organizational resilience. The organizational learning dimensions of knowledge acquisition, knowledge distribution, knowledge interpretation, and organizational memory can be fostered through human resource management strategies and then aggregated to the organizational level to build resilience.

Keywords: human resource development, human resource management, organizational learning, organizational resilience

Procedia PDF Downloads 138
29092 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components

Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura

Abstract:

This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain-activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict eight finger-movement directions using sparse logistic regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied to two different settings, namely the single-participant level and the group level. For the single-participant level, the EEG datasets used in the first and second stages originated from the same participant. For the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination without repetition of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant level, which was significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group level, which was also significantly higher than the chance level (12.49 ± 0.01%).
The classification accuracy within [–45°, 45°] of the true direction was 70.03 ± 8.14% for the single-participant level and 62.63 ± 6.07% for the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
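The two-stage pipeline described above (learn an unmixing matrix and a sparse classifier on one dataset, then reuse both frozen on held-out data) can be sketched as follows. The data are synthetic stand-ins for EEG, and scikit-learn's L1-penalized logistic regression stands in for SLR; the IC-selection step is omitted.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG: 8 channels mixing 4 latent sources, with the
# class label (one of eight "directions") encoded in one source's amplitude.
n_trials, n_channels, n_sources = 400, 8, 4
labels = rng.integers(0, 8, size=n_trials)
sources = rng.normal(size=(n_trials, n_sources))
sources[:, 0] += labels                            # class-dependent component
mixing = rng.normal(size=(n_sources, n_channels))
eeg = sources @ mixing + 0.1 * rng.normal(size=(n_trials, n_channels))

# Stage 1: learn the unmixing matrix and a sparse (L1) logistic classifier.
train, test = slice(0, 300), slice(300, None)
ica = FastICA(n_components=n_sources, random_state=0, max_iter=1000)
ics_train = ica.fit_transform(eeg[train])
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(ics_train, labels[train])

# Stage 2: reuse the frozen unmixing matrix and model on held-out data.
ics_test = ica.transform(eeg[test])
acc = clf.score(ics_test, labels[test])
print(f"held-out accuracy: {acc:.2f}  (chance = 0.125)")
```

Because ICA recovers sources only up to scale and sign, the classifier, not the unmixing step, absorbs that ambiguity; that is why the same frozen unmixing matrix can be reused in stage 2.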

Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding

Procedia PDF Downloads 138
29091 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data-storage gateway and presents the design of a framework for the data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, the data are more strongly protected, and that the fog storage gateway has several advantages over traditional cloud storage: the results show that fog storage reduces latency, bandwidth consumption, and energy usage compared with cloud storage, and will therefore help to lessen excessive cost. The paper dwells on the system descriptions, focusing on the research design and the framework design for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications. Lastly, the overall system architecture, its structure, and its interrelationships are shown.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 100
29090 Management Practices in Hypertension: Results of Win-Over, a Pan-India Registry

Authors: Abhijit Trailokya, Kamlesh Patel

Abstract:

Background: Hypertension is a common disease in clinical practice and is associated with high morbidity and mortality. Many patients require combination therapy for the management of hypertension. Objective: To evaluate the co-morbidities, risk factors, and management practices of hypertension in the Indian population. Material and methods: A total of 1596 hypertensive adult patients who received anti-hypertensive medications were studied in a cross-sectional, multi-centric, non-interventional, observational registry. Statistical analysis: Categorical or nominal data were expressed as numbers with percentages. Continuous variables were analyzed by descriptive statistics using the mean, SD, and range. The chi-square test was used for between-group comparisons. Results: The study included 73.50% males and 26.50% females. Overweight (50.50%) and obesity (30.01%) were common in the hypertensive patients (n=903). A total of 54.76% of patients had a history of smoking. Alcohol use (33.08%), a sedentary lifestyle (32.96%), and a history of tobacco chewing (17.92%) were the other lifestyle habits of the hypertensive patients. Histories of diabetes (36.03%) and dyslipidemia (39.79%) were common in these patients. A family history of hypertension and diabetes was seen in 82.21% and 45.99% of patients, respectively. Most (89.16%) patients were treated with a combination of antihypertensive agents. ARBs were by far the most commonly used agents (91.98%), followed by calcium channel blockers (68.23%) and diuretics (60.21%). ARBs were the most (80.35%) preferred agents as monotherapy. An ARB was also the most common agent as a component of dual-therapy, four-drug, and five-drug combinations. Conclusion: Most hypertensive patients need combination treatment with antihypertensive agents. ARBs are the most preferred agents as monotherapy for the management of hypertension. ARBs are also very commonly used as a component of combination therapy during hypertension management.
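The between-group comparisons mentioned under statistical analysis use the chi-square test; a minimal sketch with made-up counts (not the registry data) might look like:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table in the spirit of the registry's
# between-group comparisons (illustrative counts, not from the study):
# rows = monotherapy / combination therapy, columns = diabetic / non-diabetic.
table = [[45, 128],
         [530, 893]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

`chi2_contingency` applies Yates' continuity correction by default for 2x2 tables; `expected` holds the expected counts under independence, which should each exceed about 5 for the test to be valid.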

Keywords: antihypertensive, hypertension, management, ARB

Procedia PDF Downloads 521
29089 Effects of Small Amount of Poly(D-Lactic Acid) on the Properties of Poly(L-Lactic Acid)/Microcrystalline Cellulose/Poly(D-Lactic Acid) Blends

Authors: Md. Hafezur Rahaman, Md. Sagor Hosen, Md. Abdul Gafur, Rasel Habib

Abstract:

This research is a systematic study of the effects of poly(D-lactic acid) (PDLA) on the properties of poly(L-lactic acid) (PLLA)/microcrystalline cellulose (MCC)/PDLA blends through stereocomplex crystallization. Blends were prepared with a constant percentage (3 percent) of MCC and different percentages of PDLA by solution casting. The blends were characterized by Fourier transform infrared spectroscopy (FTIR) to confirm blend compatibility, wide-angle X-ray scattering (WAXS) and scanning electron microscopy (SEM) for morphological analysis, and thermo-gravimetric analysis (TGA) and differential thermal analysis (DTA) for thermal property measurement. FTIR analysis confirms that no new characteristic absorption peaks appeared in the spectrum; instead, peak shifts due to hydrogen bonding indicate the compatibility of the blend components. The development of three new peaks in the XRD analysis strongly indicates the formation of stereocomplex crystallinity in the PLLA structure with the addition of PDLA. TGA and DTG results indicate that PDLA can improve the heat resistance of the PLLA/MCC blends by increasing their degradation temperature. Comparison of the DTA peaks also confirms the improved thermal properties. SEM images show the improvement of the surface morphology.

Keywords: microcrystalline cellulose, poly(l-lactic acid), stereocomplex crystallization, thermal stability

Procedia PDF Downloads 137
29088 Video-Based Psychoeducation for Caregivers of Persons with Schizophrenia

Authors: Jilu David

Abstract:

Background: Schizophrenia is one of the most misunderstood mental illnesses across the globe. Lack of understanding about mental illnesses often delay treatment, severely affects the functionality of the person, and causes distress to the family. The study, Video-based Psychoeducation for Caregivers of Persons with Schizophrenia, consisted of developing a psychoeducational video about Schizophrenia, its symptoms, causes, treatment, and the importance of family support. Methodology: A quasi-experimental pre-post design was used to understand the feasibility of the study. Qualitative analysis strengthened the feasibility outcomes. Knowledge About Schizophrenia Interview was used to assess the level of knowledge of 10 participants, before and after the screening of the video. Results: Themes of usefulness, length, content, educational component, format of the intervention, and language emerged in the qualitative analysis. There was a statistically significant difference in the knowledge level of participants before and after the video screening. Conclusion: The statistical and qualitative analysis revealed that the video-based psychoeducation program was feasible and that it facilitated a general improvement in knowledge of the participants.

Keywords: Schizophrenia, mental illness, psychoeducation, video-based psychoeducation, family support

Procedia PDF Downloads 132
29087 Hydrogeochemical Assessment, Evaluation and Characterization of Groundwater Quality in Ore, South-Western, Nigeria

Authors: Olumuyiwa Olusola Falowo

Abstract:

One of the objectives of the Millennium Development Goals is sustainable access to safe drinking water and basic sanitation. In line with this objective, an assessment of groundwater quality was carried out in the Odigbo Local Government Area of Ondo State from November to February 2019 to assess the suitability of the water for drinking, domestic, and irrigation uses. Samples were collected from 30 randomly selected groundwater sources (16 shallow wells and 14 boreholes) and analyzed using the American Public Health Association methods for the examination of water and wastewater. Water quality index calculation and diagrams such as the Piper diagram, Gibbs diagram, and Wilcox diagram were used to assess the groundwater, in conjunction with irrigation indices such as percent sodium, sodium adsorption ratio, permeability index, magnesium ratio, Kelly ratio, and electrical conductivity. In addition, principal component analysis was used to determine the homogeneity and the source(s) influencing the chemistry of the groundwater. The results show that all the parameters are within the permissible limits of the World Health Organization. The physico-chemical analysis of the groundwater samples indicates that the dominant major cations, in decreasing order, are Na+, Ca2+, Mg2+, and K+, and the dominant anions are HCO3-, Cl-, SO42-, and NO3-. The water quality index values indicate good water (WQI of 50-75) for 70% of the study area. The dominant groundwater facies revealed in this study are the non-carbonate alkali (primary salinity) type exceeding 50% (zone 7) and the transition type in which no one cation-anion pair exceeds 50% (zone 9), while evaporation, rock-water interaction, precipitation, and silicate weathering are the dominant processes in the hydrogeochemical evolution of the groundwater. The study indicates that the waters fall within the permissible limits of the irrigation indices adopted and plot in the excellent category on the Wilcox diagram.
In conclusion, the water in the study area is suitable for drinking, domestic, and irrigation purposes, with low equivalent salinity concentration and moderate electrical conductivity.
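The water quality index computation referred to above can be sketched in one common weighted-arithmetic form; the abstract does not state which WQI formulation was used, and the sample values and limits below are illustrative, not measurements from Ore:

```python
# Minimal weighted-arithmetic water quality index (WQI) sketch:
# quality rating q_i = 100 * C_i / S_i (measured concentration over the
# permissible limit), with weights w_i inversely proportional to S_i.
def wqi(measured, limits):
    weights = [1.0 / s for s in limits]
    ratings = [100.0 * c / s for c, s in zip(measured, limits)]
    return sum(w * q for w, q in zip(weights, ratings)) / sum(weights)

# Illustrative sample: (Na+, Ca2+, Mg2+, Cl-) in mg/L against assumed limits.
sample = [120.0, 60.0, 20.0, 90.0]
limits = [200.0, 75.0, 50.0, 250.0]
print(f"WQI = {wqi(sample, limits):.1f}")  # prints WQI = 54.6
```

A value of 54.6 falls in the 50-75 band, i.e. the "Good water" class used in the abstract's rating scheme.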

Keywords: equivalent salinity concentration, groundwater quality, hydrochemical facies, principal component analysis, water-rock interaction

Procedia PDF Downloads 151
29086 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring

Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti

Abstract:

Autonomous structural health monitoring (SHM) of structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and detection of anomalies in a bridge from vibrational data, and compares different feature extraction schemes to increase the accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by density-based time-domain filtering (tracking). The fundamental frequencies extracted are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer), which tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., mean value, variance, kurtosis), and feature extraction (an auto-associative neural network (ANN)), which combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classifier (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, a new anomaly detection strategy is proposed, namely one-class classifier neural network two (OCCNN2), which exploits the classification capability of standard classifiers in an anomaly detection problem, finding the standard class (the boundary of the feature space in normal operating conditions) through a two-step approach: coarse and fine boundary estimation.
The coarse estimation uses classic OCC techniques, while the fine estimation is performed through a feedforward neural network (NN) that exploits the boundaries estimated in the coarse step. The detection algorithms are then compared with known methods based on principal component analysis (PCA), kernel principal component analysis (KPCA), and auto-associative neural networks (ANN). In many cases, the proposed solution improves on the standard OCC algorithms in terms of F1 score and accuracy. In particular, by evaluating the correct features, anomalies can be detected with an accuracy and an F1 score greater than 96% with the proposed method.
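The one-class setting described above (train only on standard-condition points, flag outliers at test time, score with F1) can be sketched with a classic OCC algorithm; this uses a one-class SVM on toy frequency-like features, standing in for the coarse step rather than the OCCNN2 detector itself:

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Toy stand-in for the tracked fundamental frequencies: 4 features per
# observation; the healthy condition clusters around a baseline, and
# damage shifts the cluster.
normal = rng.normal(loc=0.0, scale=0.3, size=(300, 4))
damaged = rng.normal(loc=1.5, scale=0.3, size=(100, 4))

# Train on standard-condition points only, as in the OCC setting above.
occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(normal[:200])

test_x = np.vstack([normal[200:], damaged])
truth = np.r_[np.zeros(100), np.ones(100)]       # 1 = anomaly
pred = (occ.predict(test_x) == -1).astype(int)   # OneClassSVM: -1 = outlier
f1 = f1_score(truth, pred)
print(f"F1 = {f1:.2f}")
```

The `nu` parameter upper-bounds the fraction of training points treated as outliers, which directly trades false alarms against missed detections.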

Keywords: anomaly detection, frequencies selection, modal analysis, neural network, sensor network, structural health monitoring, vibration measurement

Procedia PDF Downloads 124
29085 Developing a Multidimensional Adjustment Scale

Authors: Nadereh Sohrabi Shegefti, Siamak Samani

Abstract:

The level of adjustment is a primary index of mental health. The aim of this study was to develop a valid and reliable Multidimensional Adjustment Scale (MAS). The sample consisted of 150 college students. The Multidimensional Adjustment Scale and the Depression, Anxiety, and Stress Scale (DASS) were used in this study. Principal factor analysis, the Pearson correlation coefficient, and Cronbach's alpha were used to check the validity and reliability of the MAS. Principal component factor analysis showed a 5-factor solution for the MAS. Alpha coefficients for the MAS subscales ranged between .69 and .83. Test-retest reliability for the MAS was .88, and the mean subscale-total score correlation was .88. All these indexes revealed acceptable reliability and validity for the MAS. The MAS is a short assessment instrument with acceptable psychometric properties for use in the clinical field.
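The reliability coefficients reported above are Cronbach's alpha values; a minimal implementation on illustrative item scores (rows = respondents, columns = items; not the MAS data) is:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/total variance).
def cronbach_alpha(items):
    k = len(items[0])   # number of items
    def var(xs):        # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
]
print(f"alpha = {cronbach_alpha(scores):.2f}")  # prints alpha = 0.94
```

Values of .69 to .83, as reported for the subscales, are conventionally read as acceptable to good internal consistency.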

Keywords: psychological adjustment, psychometric properties, validity, Pearson correlation

Procedia PDF Downloads 635
29084 Review of Cyber Security in Oil and Gas Industry with Cloud Computing Perspective: Taxonomy, Issues and Future Direction

Authors: Irfan Mohiuddin, Ahmad Al Mogren

Abstract:

In recent years, cloud computing has earned substantial attention in the oil and gas industry and provides services in all phases of the industry lifecycle. Oil and gas supply infrastructure, in particular, is more vulnerable to accidental, natural, and intentional threats because of its widespread distribution. Numerous surveys have been conducted on cloud security and privacy. However, to the best of our knowledge, hardly any survey has been carried out that reviews cyber security across all phases from a cloud computing perspective. Moreover, a distinctive classification is performed for all cloud-based cyber security measures based on the cloud component in use. This classification approach will enable researchers to identify the techniques required to enhance security in specific cloud components. Also, the limitations of each component will allow researchers to design optimal algorithms. Lastly, future directions are given to point out the imminent challenges that can pave the way for researchers to further enhance resilience to cyber security threats in the oil and gas industry.

Keywords: cyber security, cloud computing, safety and security, oil and gas industry, security threats, oil and gas pipelines

Procedia PDF Downloads 143
29083 Implementing Service Learning in the Health Education Curriculum

Authors: Karen Butler

Abstract:

Johnson C. Smith University, one of the nation’s oldest Historically Black Colleges and Universities, has a strong history of service learning and community service. We first integrated service learning and peer education into health education courses in the spring of 2000. Students enrolled in the classes served as peer educators for the semester. Since then, the program has evolved and expanded but remains an integral part of several courses. The purpose of this session is to describe our program in terms of development, successes, and obstacles, and feedback received. A detailed description of the service learning component in HED 235: Drugs and Drug Education and HED 337: Environmental Health will be provided. These classes are required of our Community Health majors but are also popular electives for students in other disciplines. Three sources of student feedback were used to evaluate and continually modify the component: the SIR II course evaluation, service learning reflection papers, and focus group interviews. Student feedback has been largely positive. When criticism was given, it was thoughtful and constructive – given in the spirit of making it better for the next group. Students consistently agreed that the service learning program increased their awareness of pertinent health issues; that both the service providers and service recipients benefited from the project; and that the goals/issues targeted by the service learning component fit the objectives of the course. Also, evidence of curriculum and learning enhancement was found in the reflection papers and focus group sessions. Service learning sets up a win-win situation. It provides a way to respond to campus and community health needs while enhancing the curriculum, as students learn more by doing things that benefit the health and wellness of others. Service learning is suitable for any health education course and any target audience would welcome the effort.

Keywords: black colleges, community health, health education, service learning

Procedia PDF Downloads 342
29082 Assessing Environmental Urban Sustainability Using Multivariate Analysis: A Case of Nagpur, India

Authors: Anusha Vaddiraj Pallapu

Abstract:

Measuring sustainable urban development is at the forefront of contributing to overall sustainability; it refers to attaining social equity, protecting the environment, and minimizing the impacts of urbanization. Urban performance issues range from the heavy consumption of natural resources driven by lifestyle to the creation of a polluted local environment; across the environmental, social, and even economic dimensions of sustainability, the major issues observed include water quality, transportation, solid waste management, and traffic pollution. However, relying on a project framework to meet sustainable development goals, or on management practices to minimize urban impacts, is not enough to deal with present urban issues. The aim of sustainability assessment is to know how severely resources are depleted by human consumption and how the issues are characterized. This paper assigns benchmarks to the selected sustainability indicators, and the analysis is carried out through multivariate analysis in the Indian context, with Nagpur city as a case study, to identify the role each urban issue plays in overall sustainability. The main objectives of this paper are to examine the indicators over time under various scenarios, to show how benchmarking is used, and to determine which categories of values should be considered in assessing indicator performance.
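The multivariate analysis referred to above is commonly carried out with principal component analysis on a standardized indicator matrix; a minimal sketch with made-up indicator scores (not Nagpur data) is:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Illustrative indicator matrix: 40 hypothetical urban clusters (rows)
# against four sustainability indicators (columns); values are made-up
# scores, not Nagpur measurements.
indicators = rng.normal(size=(40, 4))
indicators[:, 1] += 0.8 * indicators[:, 0]  # correlated pair, as with linked issues

# Standardize first (indicators have different units), then extract components.
z = StandardScaler().fit_transform(indicators)
pca = PCA().fit(z)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
```

The loadings (`pca.components_`) then show which indicators dominate each component, which is how PCA is typically used to weigh the role of each urban issue in an overall index.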

Keywords: environmental sustainability indicators, principal component analysis, urban sustainability, urban clusters, benchmarking

Procedia PDF Downloads 344
29081 Application of Adaptive Neural Network Algorithms for Determination of Salt Composition of Waters Using Laser Spectroscopy

Authors: Tatiana A. Dolenko, Sergey A. Burikov, Alexander O. Efitorov, Sergey A. Dolenko

Abstract:

In this study, a comparative analysis of the approaches associated with the use of neural network algorithms for effective solution of a complex inverse problem – the problem of identifying and determining the individual concentrations of inorganic salts in multicomponent aqueous solutions by the spectra of Raman scattering of light – is performed. It is shown that application of artificial neural networks provides the average accuracy of determination of concentration of each salt no worse than 0.025 M. The results of comparative analysis of input data compression methods are presented. It is demonstrated that use of uniform aggregation of input features allows decreasing the error of determination of individual concentrations of components by 16-18% on the average.
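A toy version of the network-based inversion can be sketched as follows. The "spectra" here are synthetic Gaussian bands whose heights scale linearly with two component concentrations, so this only illustrates the inversion scheme, not the Raman data or the 0.025 M accuracy reported above:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the inverse problem: recover two "salt
# concentrations" from a 50-point spectrum built from two Gaussian bands.
grid = np.linspace(0, 1, 50)                   # spectral axis (arb. units)
band1 = np.exp(-((grid - 0.3) ** 2) / 0.005)   # band of component 1
band2 = np.exp(-((grid - 0.7) ** 2) / 0.005)   # band of component 2

conc = rng.uniform(0.0, 1.0, size=(500, 2))    # "true" concentrations (M)
spectra = conc @ np.vstack([band1, band2])
spectra += 0.01 * rng.normal(size=spectra.shape)   # measurement noise

# Train the network to map spectrum -> concentrations, then test it.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(spectra[:400], conc[:400])
err = np.abs(net.predict(spectra[400:]) - conc[400:]).mean()
print(f"mean absolute concentration error: {err:.3f} M")
```

The input-compression step the abstract compares (aggregating input features before the network) would be inserted between spectrum generation and `fit`, shrinking the 50-point input to a handful of aggregated features.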

Keywords: inverse problems, multi-component solutions, neural networks, Raman spectroscopy

Procedia PDF Downloads 529
29080 Contribution to the Understanding of the Hydrodynamic Behaviour of Aquifers of the Taoudéni Sedimentary Basin (South-eastern Part, Burkina Faso)

Authors: Kutangila Malundama Succes, Koita Mahamadou

Abstract:

In the context of climate change and demographic pressure, groundwater has emerged as an essential and strategic resource whose sustainability relies on good management. The accuracy and relevance of decisions made in managing these resources depend on the availability and quality of the scientific information on which they rely. It is therefore urgent to improve the state of knowledge on groundwater to ensure sustainable management. This study addresses the particular case of the aquifers of the transboundary Taoudéni sedimentary basin in its Burkinabe part. Indeed, Burkina Faso (and the Sahel region in general), marked by low rainfall, has experienced episodes of severe drought, which have justified the use of groundwater as the primary source of water supply. This study aims to improve knowledge of the hydrogeology of this area in order to achieve sustainable management of transboundary groundwater resources. The methodological approach first described the lithological units in terms of the extension and succession of the different layers. Secondly, the hydrodynamic behaviour of these units was studied through the analysis of spatio-temporal variations in piezometry. The data consist of 692 static-level measurement points and 8 observation wells distributed across the area and capturing five of the identified geological formations. Monthly piezometric-level records are available for each observation well and cover the period from 1989 to 2020. The temporal analysis of piezometry, carried out in comparison with rainfall records, revealed a general upward trend in piezometric levels throughout the basin. The groundwater generally reacts with a delay of 1 to 2 months relative to the rainfall of the rainy season. Indeed, the peaks of the piezometric level generally occur between September and October in reaction to the rainfall peaks between July and August. Low groundwater levels are observed between May and July.
This relatively slow reaction of the aquifer is observed in all wells. The influence of the geological setting, through the structure and hydrodynamic properties of the layers, was deduced. The spatial analysis reveals that piezometric levels vary between 166 and 633 m, with a trend indicating flow generally from southwest to northeast and recharge areas located towards the southwest and northwest. There is a quasi-concordance between the hydrogeological basins and the overlying hydrological basins, as well as a bimodal flow with one component following the topography and another significant, deeper component controlled by the regional SW-NE gradient. This latter component may carry flows directed from the high reliefs towards the springs of Nasso. In the spring area (Kou basin), the maximum average storage variation, calculated by the Water Table Fluctuation (WTF) method, varies between 35 and 48.70 mm per year over 2012-2014.
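The Water Table Fluctuation method mentioned above reduces, in its simplest form, to multiplying the water-table rise by the specific yield; the numbers below are illustrative, not the Kou basin values:

```python
# Water Table Fluctuation (WTF) sketch: the storage change (or recharge)
# is the specific yield times the water-table rise, dS = Sy * dh.
SY = 0.05  # assumed specific yield (dimensionless); illustrative only

def storage_change_mm(rise_m):
    """Storage change in mm of water for a water-table rise in metres."""
    return SY * rise_m * 1000.0

# e.g. a seasonal rise of 0.9 m:
print(f"dS = {storage_change_mm(0.9):.1f} mm")  # prints dS = 45.0 mm
```

With a specific yield of 0.05, rises of roughly 0.7 to 1.0 m would reproduce the 35 to 48.70 mm per year range reported above; the actual Sy and rises used in the study are not stated in the abstract.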

Keywords: hydrodynamic behaviour, taoudeni basin, piezometry, water table fluctuation

Procedia PDF Downloads 65
29079 Analysis of Economics and Value Addition of Optimized Blend with Petrodiesel of Nanocomposite Oil Methyl Esters

Authors: Chandrashekara Krishnappa, Yogish Huchaiah

Abstract:

The present work examines the economic feasibility and financial viability of biodiesel production and use in the context of the prevailing Indian scenario. For this, the costs involved in producing one litre of biodiesel from a nano mix of non-edible Jatropha and Pongamia oils are considered. Biodiesel derived from the mix is blended with petrodiesel in various proportions and used in a compression ignition (CI) direct injection (DI) engine. Performance and emission characteristics were investigated, and the blends were optimized on the basis of the experimental results. To validate the experimental results and the optimization, the Multi-Functional Criteria Technique (MFCT) is used. Further, value additions in terms of INR due to the increase in performance and the reduction in emissions are investigated. The cost component of the subsidy on petrodiesel is taken into consideration in calculating the cost of one litre of it. Costs are compared with respect to the unit of power generated per litre of COME and petrodiesel. The analysis concludes that the amount saved is INR 1.45 lakh crores per year with the subsidy on petrodiesel and INR 1.60 lakh crores per year without it.

Keywords: value addition, economic analysis, MFCT, NACOME, subsidy

Procedia PDF Downloads 241
29078 Root Cause Analysis of a Catastrophically Failed Output Pin Bush Coupling of a Raw Material Conveyor Belt

Authors: Kaushal Kishore, Suman Mukhopadhyay, Susovan Das, Manashi Adhikary, Sandip Bhattacharyya

Abstract:

In integrated steel plants, conveyor belts are widely used for transferring raw materials from one location to another. An output pin bush coupling attached to a conveyor transferring iron ore fines and fluxes failed after two years of service life, leading to an operational delay of approximately 15 hours. This study focuses on the failure analysis of the coupling and recommends counter-measures to prevent any such failures in the future. The investigation consisted of careful visual observation, checking of operating parameters, stress calculation and analysis, macro- and micro-fractography, material characterization (chemical and metallurgical analysis), and tensile and impact testing. The fracture originated from an unusually sharp double step, and multiple corrosion pits near the step aggravated the situation. The inner contact surface of the coupling revealed differential abrasion that created a macroscopic difference in the height of the component, pointing towards misalignment of the coupling beyond a threshold limit. In addition to these design and installation issues, the material of the coupling did not meet quality standards: it was made of grey cast iron with graphite morphology intermediate between random distribution (Type A) and rosette pattern (Type B), which manifested as a marked reduction in the impact toughness and tensile strength of the component. These findings corroborated well with the brittle mode of fracture, which might have occurred under minor impact loading as raw materials were dropped onto the conveyor belt from a height. A simulated study was conducted to examine the effect of corrosion pits on the tensile strength and impact toughness of grey cast iron. It was observed that pitting marginally reduced tensile strength and ductility; however, there was a marked (up to 45%) reduction in impact toughness due to pitting.
Thus, it became evident that failure of the coupling occurred due to a combination of factors: inferior material, misalignment, poor step design, and corrosion pitting. Recommendations for life enhancement of the coupling included the use of the tougher SG 500/7 grade, incorporation of a proper fillet radius at the step, correction of the alignment, and application of a corrosion-resistant organic coating to prevent pitting.

Keywords: brittle fracture, cast iron, coupling, double step, pitting, simulated impact tests

Procedia PDF Downloads 133
29077 Study on Dynamic Stiffness Matching and Optimization Design Method of a Machine Tool

Authors: Lu Xi, Li Pan, Wen Mengmeng

Abstract:

The stiffness of each component has a different influence on the overall stiffness of the machine tool. Taking a five-axis gantry machining center as an example, we performed a modal analysis of the machine tool and then raised and lowered the stiffness of the pillar, slide plate, beam, ram and saddle to study the stiffness matching among these components, using as the criterion whether the stiffness of the modified machine tool changes by more than 50% relative to that of the original machine. Structural optimization of the machine tool can then be realized by changing the stiffness of the components whose stiffness is mismatched. For example, the stiffness of the beam was found to be mismatched. After modification, the natural frequencies of the first six orders of the beam increased by 7.70%, 0.38%, 6.82%, 7.96%, 18.72% and 23.13%, with the weight increased by 28 kg, and the natural frequencies of several orders with a great influence on the dynamic performance of the whole machine increased by 1.44%, 0.43% and 0.065%, verifying the correctness of the optimization method based on stiffness matching proposed in this paper.
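The direction of the reported frequency shifts is consistent with the familiar single-degree-of-freedom relation f = sqrt(k/m)/(2π): raising a component's stiffness raises its natural frequencies, while added mass lowers them. A hedged sketch with illustrative stiffness and mass values (the paper's actual model is a full finite element modal analysis, not this one-degree-of-freedom surrogate):

```python
# Single-DOF illustration of the stiffness/mass trade-off behind the
# reported frequency shifts. The stiffness (N/m) and mass (kg) values
# are hypothetical; only the +28 kg figure comes from the abstract.
import math

def natural_freq_hz(stiffness_n_per_m, mass_kg):
    """Natural frequency of a single-DOF oscillator, f = sqrt(k/m)/(2*pi)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

f_before = natural_freq_hz(2.0e8, 800.0)
f_after = natural_freq_hz(2.0e8 * 1.2, 800.0 + 28.0)  # +20% stiffness, +28 kg
change_pct = (f_after / f_before - 1) * 100
print(round(change_pct, 2))  # net frequency gain despite the added mass
```

A stiffness increase that outpaces the mass penalty is exactly what the matched beam design achieves.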

Keywords: machine tool, optimization, modal analysis, stiffness matching

Procedia PDF Downloads 103
29076 An Exploration on Competency-Based Curricula in Integrated Circuit Design

Authors: Chih Chin Yang, Chung Shan Sun

Abstract:

In this paper, the relationships between professional competences and school curricula in the IC design industry are explored. A semi-structured questionnaire survey and focus group interviews are the research methods. Study participants are graduates of microelectronics engineering departments who are currently employed in the IC industry; here, the IC industry is taken to comprise the electronic component and optical-electronic component manufacturing sectors of the semiconductor and optical-electronic material device industries, respectively. Study participants selected from the IC design industry include IC engineering and electronic and semiconductor engineering roles. The training of personnel with IC design professional competence in microelectronics engineering departments is explored in this research. The IC professional competences of human resources in the IC design industry include general intelligence and professional intelligence.

Keywords: IC design, curricula, competence, task, duty

Procedia PDF Downloads 382
29075 Intrusion Detection System Using Linear Discriminant Analysis

Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou

Abstract:

Most existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and inaccurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the dimension of the samples. Hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the dimension of the original data using principal component analysis (PCA) and then applies LDA. In the second solution, we propose to use the pseudo-inverse to avoid the singularity of the within-class scatter matrix caused by the SSS problem. After that, the KNN algorithm is used for the classification process. We chose two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. The results showed that the classification accuracy of the (PCA+LDA) method clearly outperforms the pseudo-inverse LDA method when large training data are available.
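The first proposed pipeline (PCA for dimensionality reduction, then LDA, then KNN) can be sketched with scikit-learn. The synthetic data, dimensions and labels below are illustrative stand-ins for the KDDcup99/NSL-KDD features, not the authors' implementation:

```python
# Sketch of the PCA -> LDA -> KNN pipeline on synthetic data standing
# in for network-traffic features. Applying PCA first keeps the
# within-class scatter matrix non-singular (avoiding the SSS problem).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))           # 200 samples, 50 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic "attack vs normal" label

model = make_pipeline(PCA(n_components=20),
                      LinearDiscriminantAnalysis(),
                      KNeighborsClassifier(n_neighbors=5))
model.fit(X, y)
print(model.score(X, y))
```

On real IDS data the feature dimension can approach or exceed the number of labelled samples, which is where the PCA step (or the pseudo-inverse alternative) becomes essential rather than optional.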

Keywords: LDA, Pseudoinverse, PCA, IDS, NSL-KDD, KDDcup99

Procedia PDF Downloads 228
29074 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant

Authors: Michael Smalenberger

Abstract:

Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate parameters necessary for the implementation of the artificial intelligence component, and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and probability of mastery, none directly and reliably ask the user to self-assess these. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage while transaction level data are scant. Once a user’s transaction level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to lead to mastery of a knowledge component, the associated implications on learning curves, and its relevance to instruction time.
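The BKT parameters discussed above (prior, learn, guess, slip) drive a simple two-step posterior update after each observed attempt. A minimal sketch of the standard BKT recursion, with hypothetical parameter values rather than estimates from this study:

```python
# Standard Bayesian Knowledge Tracing update (not the authors' code).
# Parameter values (prior, learn, guess, slip) are hypothetical.

def bkt_update(p_mastery, correct, p_learn=0.1, p_guess=0.2, p_slip=0.1):
    """One BKT step: condition on the observation, then apply learning."""
    if correct:
        num = p_mastery * (1 - p_slip)
        den = num + (1 - p_mastery) * p_guess
    else:
        num = p_mastery * p_slip
        den = num + (1 - p_mastery) * (1 - p_guess)
    posterior = num / den
    return posterior + (1 - posterior) * p_learn

p = 0.3  # e.g. a student's self-assessed prior, as the study proposes
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
print(round(p, 3))
```

The identifiability problem mentioned above arises because different (guess, slip, learn) combinations can reproduce the same observed response sequence; a self-assessed prior pins down one of the free quantities while transaction data are still scant.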

Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation

Procedia PDF Downloads 174
29073 Residual Analysis and Ground Motion Prediction Equation Ranking Metrics for Western Balkan Strong Motion Database

Authors: Manuela Villani, Anila Xhahysa, Christopher Brooks, Marco Pagani

Abstract:

The geological structure of the Western Balkans is strongly affected by the collision between the Adria microplate and the southwestern margin of Eurasia, resulting in a considerably active seismic region. The NATO-supported Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project (BSHAP) (2007-2011, 2012-2015) produced new seismic hazard maps of the Western Balkans, but when inspecting the seismic hazard models later produced by these countries on a national scale, significant differences in design PGA values are observed at the borders, for instance between North Albania and Montenegro, or South Albania and Greece. Given that the catalogues were unified and the seismic sources were defined within the BSHAP framework, the differences evidently arise from the selection of Ground Motion Prediction Equations (GMPEs), which are generally the component with the highest impact on seismic hazard assessment. At the time of the project, only a modest database was available, namely 672 three-component records, whereas this strong motion database has since grown considerably to 20,939 records, with Mw ranging from 3.7 to 7 and epicentral distances from 0.47 km to 490 km. Statistical analysis of the strong motion database showed a lack of recordings in the moderate-to-large magnitude and short-distance ranges; there is therefore a need to re-evaluate the GMPEs in light of the recently updated database and the new generations of ground motion models. In some cases, it was observed that some events were more extensively documented in one database than the other: the 1979 Montenegro earthquake, for example, has a considerably larger number of records in the BSHAP analogue strong motion database than in ESM23. Therefore, the strong motion flat-file from the BSHAP project was merged with the ESM23 database for the polygon studied in this project.
After performing the preliminary residual analysis, the candidate GMPEs were identified. This was done using the GMPE performance metrics available within the SMT in the OpenQuake platform; the Likelihood model and Euclidean Distance-Based Ranking (EDR) were used. Finally, a GMPE logic tree was selected for this study, and following the selection of candidate GMPEs, model weights were assigned using the average sample log-likelihood approach of Scherbaum.
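The average-sample log-likelihood (LLH) approach scores each GMPE by how probable the observed, normalized residuals are under a standard normal model, and 2^(-LLH) then gives a natural model weight. A hedged sketch with synthetic residuals (not the Western Balkan data, and a simplification of the actual OpenQuake SMT implementation):

```python
# Sketch of the average-sample log-likelihood (LLH) GMPE score:
# LLH = -(1/N) * sum(log2(standard normal pdf of normalized residuals)),
# with Scherbaum-style weights w_k proportional to 2^(-LLH_k).
# The residuals below are synthetic stand-ins for real record-vs-GMPE data.
import math

def llh_score(normalized_residuals):
    n = len(normalized_residuals)
    log2_pdf = [math.log2(math.exp(-z * z / 2) / math.sqrt(2 * math.pi))
                for z in normalized_residuals]
    return -sum(log2_pdf) / n

def model_weights(llh_values):
    """Weights proportional to 2^(-LLH), normalized to sum to one."""
    raw = [2.0 ** (-llh) for llh in llh_values]
    total = sum(raw)
    return [r / total for r in raw]

good_fit = llh_score([0.1, -0.3, 0.2])   # residuals near zero
poor_fit = llh_score([1.5, -2.0, 1.8])   # large residuals
weights = model_weights([good_fit, poor_fit])
print(weights)  # the better-fitting GMPE receives the larger weight
```

In practice the residuals come from comparing each recorded intensity measure against the GMPE's predicted median and sigma, and the resulting weights populate the logic tree branches.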

Keywords: residual analysis, GMPE, Western Balkans, strong motion, OpenQuake

Procedia PDF Downloads 90
29072 Structural and Modal Analyses of an s1223 High-Lift Airfoil Wing for Drone Design

Authors: Johnson Okoduwa Imumbhon, Mohammad Didarul Alam, Yiding Cao

Abstract:

Structural analyses are commonly employed in the design stage to test the integrity of aircraft component systems, to demonstrate the capability of the structural components to withstand the loads they were designed for, and to predict potential failure of the components. The analyses are also essential for weight minimization and for selecting the most resilient materials that will provide optimal outcomes. This research focuses on the structural behaviour of a high-lift, low Reynolds number airfoil design, the Selig S1223, under certain loading conditions for a drone model application. The wing (ribs, spars, and skin) of the drone model was made of carbon fiber-reinforced polymer and designed in SolidWorks, while the finite element analysis was carried out in ANSYS Mechanical using the lift and drag forces derived from the aerodynamic airfoil analysis. Additionally, a modal analysis was performed to calculate the natural frequencies and mode shapes of the wing structure. The structural strain and stress results showed minimal deformations under the wing loading conditions, and the modal analysis identified the prominent modes excited by the given forces. The findings from the structural analysis indicate that the S1223 high-lift airfoil is applicable to an unmanned aerial vehicle as well as to a novel reciprocating-airfoil-driven vertical take-off and landing (VTOL) drone model.

Keywords: CFRP, finite element analysis, high-lift, S1223, strain, stress, VTOL

Procedia PDF Downloads 230
29071 Modeling of System Availability and Bayesian Analysis of Bivariate Distribution

Authors: Muhammad Farooq, Ahtasham Gul

Abstract:

To meet desired standards, it is important to monitor and analyze different engineering processes to obtain the desired output. Bivariate distributions have received considerable attention in recent years for describing the randomness of natural as well as artificial mechanisms. In this article, a bivariate model is constructed using two independent models developed by the nesting approach, to study the effect of each component on reliability for better understanding. Further, a Bayesian analysis of system availability is carried out by considering prior parametric variations in the failure time and repair time distributions. Basic statistical characteristics of the marginal distributions, such as the mean, median, and quantile function, are discussed. We use an inverse Gamma prior and study its frequentist properties by conducting a Markov Chain Monte Carlo (MCMC) sampling scheme.
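The availability quantity being analyzed above reduces, in steady state, to A = MTTF / (MTTF + MTTR). A hedged Monte Carlo sketch with Weibull failure times and a lognormal stand-in for the repair-time model (the paper's actual distributions include the inverse Lomax, and all parameters below are hypothetical):

```python
# Steady-state availability A = MTTF / (MTTF + MTTR), estimated by
# Monte Carlo. Weibull failure times; lognormal repair times are a
# stand-in for the paper's repair model. Parameters are hypothetical.
import random

random.seed(1)
N = 10000
failure_times = [random.weibullvariate(100.0, 1.5) for _ in range(N)]  # hours
repair_times = [random.lognormvariate(1.0, 0.5) for _ in range(N)]     # hours

mttf = sum(failure_times) / N
mttr = sum(repair_times) / N
availability = mttf / (mttf + mttr)
print(round(availability, 3))
```

In the Bayesian setting, the prior parametric variations mentioned above enter through the parameters of these failure- and repair-time distributions, and the posterior of A is obtained by propagating MCMC draws through the same ratio.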

Keywords: reliability, system availability, Weibull, inverse Lomax, Markov Chain Monte Carlo, Bayesian

Procedia PDF Downloads 74
29070 Multisensory Science, Technology, Engineering and Mathematics Learning: Combined Hands-on and Virtual Science for Distance Learners of Food Chemistry

Authors: Paulomi Polly Burey, Mark Lynch

Abstract:

It has been shown that laboratory activities can help cement understanding of theoretical concepts, but it is difficult to deliver such an activity to an online cohort, and issues such as occupational health and safety in the students’ learning environment need to be considered. Chemistry, in particular, is one of the sciences where practical experience is beneficial for learning; however, typical university experiments may not be suitable for the learning environment of a distance learner. Food provides an ideal medium for demonstrating chemical concepts, and along with a few simple physical and virtual tools provided by educators, analytical chemistry can be experienced by distance learners. Food chemistry experiments were designed to be carried out in a home-based environment that 1) had sufficient scientific rigour and skill-building to reinforce theoretical concepts; 2) were safe for use at home by university students; and 3) had the potential to enhance student learning by linking simple hands-on laboratory activities with high-level virtual science. Two main components of the resources were developed: a home laboratory experiment component and a virtual laboratory component. For the home laboratory component, students were provided with laboratory kits, as well as a list of supplementary inexpensive chemical items that they could purchase from hardware stores and supermarkets. The experiments used were typical proximate analyses of food, as well as experiments focused on techniques such as spectrophotometry and chromatography. Written instructions for each experiment, coupled with video laboratory demonstrations, were used to train students in appropriate laboratory technique. Data that students collected in their home laboratory environment were collated across the class through shared documents, so that the group could carry out statistical analysis and experience a full laboratory workflow from their own homes.
For the virtual laboratory component, students were able to view a laboratory safety induction and advised on good characteristics of a home laboratory space prior to carrying out their experiments. Following on from this activity, students observed laboratory demonstrations of the experimental series they would carry out in their learning environment. Finally, students were embedded in a virtual laboratory environment to experience complex chemical analyses with equipment that would be too costly and sensitive to be housed in their learning environment. To investigate the impact of the intervention, students were surveyed before and after the laboratory series to evaluate engagement and satisfaction with the course. Students were also assessed on their understanding of theoretical chemical concepts before and after the laboratory series to determine the impact on their learning. At the end of the intervention, focus groups were run to determine which aspects helped and hindered learning. It was found that the physical experiments helped students to understand laboratory technique, as well as methodology interpretation, particularly if they had not been in such a laboratory environment before. The virtual learning environment aided learning as it could be utilized for longer than a typical physical laboratory class, thus allowing further time on understanding techniques.

Keywords: chemistry, food science, future pedagogy, STEM education

Procedia PDF Downloads 169
29069 Statistical Analysis of Surface Roughness and Tool Life Using (RSM) in Face Milling

Authors: Mohieddine Benghersallah, Lakhdar Boulanouar, Salim Belhadi

Abstract:

Currently, a higher production rate with the required quality at low cost is the basic principle in the competitive manufacturing industry. This is mainly achieved by using high cutting speeds and feed rates. Elevated temperatures in the cutting zone under these conditions shorten tool life and adversely affect the dimensional accuracy and surface integrity of the component. Thus, it is necessary to find the optimum cutting conditions (cutting speed, feed rate, machining environment, tool material and geometry) that can produce components in accordance with the specification while maintaining a relatively high production rate. Response surface methodology is a collection of mathematical and statistical techniques that are useful for modelling and analysis of problems in which a response of interest is influenced by several variables and the objective is to optimize this response. The work presented in this paper examines the effects of the cutting parameters (cutting speed, feed rate and depth of cut) on surface roughness through a mathematical model developed using data gathered from a series of milling experiments.
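The response-surface model described above is typically a second-order polynomial fitted by least squares. A minimal sketch fitting Ra = b0 + b1·v + b2·f + b3·v² + b4·f² + b5·v·f on synthetic cutting data; the coefficients and ranges are illustrative, not the paper's experimental values:

```python
# Second-order response-surface fit for surface roughness Ra as a
# function of cutting speed v and feed rate f, via least squares.
# The synthetic "experimental" data below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
v = rng.uniform(100, 300, 30)    # cutting speed, m/min
f = rng.uniform(0.05, 0.3, 30)   # feed rate, mm/rev
ra = 0.5 + 0.002 * v + 8.0 * f ** 2 + rng.normal(0, 0.02, 30)  # Ra, micron

X = np.column_stack([np.ones_like(v), v, f, v ** 2, f ** 2, v * f])
coeffs, *_ = np.linalg.lstsq(X, ra, rcond=None)
predicted = X @ coeffs
r2 = 1 - np.sum((ra - predicted) ** 2) / np.sum((ra - ra.mean()) ** 2)
print(round(float(r2), 3))
```

Once fitted, the polynomial surface can be minimized over the feasible (v, f) region to locate the optimum cutting conditions the abstract refers to.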

Keywords: statistical analysis (RSM), bearing steel, coated inserts, tool life, surface roughness, end milling

Procedia PDF Downloads 432
29068 Analysis of Matching Pursuit Features of EEG Signal for Mental Tasks Classification

Authors: Zin Mar Lwin

Abstract:

Brain Computer Interface (BCI) systems have been developed for people who suffer from severe motor disabilities and find it challenging to communicate with their environment; BCI allows them to communicate in a non-muscular way. For communication between human and computer, BCI uses a type of signal called the electroencephalogram (EEG), which is recorded from the human brain by means of electrodes. The EEG signal is an important source of information on brain processes for non-invasive BCI. To translate a human's thoughts, the acquired EEG signal needs to be classified accurately. This paper proposes a typical EEG signal classification system, evaluated on a dataset from Purdue University. The Independent Component Analysis (ICA) method, via EEGLAB tools, is used for removing artifacts caused by eye blinks. For feature extraction, the time and frequency features of the non-stationary EEG signals are extracted by the Matching Pursuit (MP) algorithm. The classification of one of five mental tasks is performed by a multi-class Support Vector Machine (SVM). For the SVMs, comparisons have been carried out for both 1-against-1 and 1-against-all methods.
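The 1-against-1 versus 1-against-all comparison can be sketched with scikit-learn's multiclass wrappers. The synthetic five-class features below merely stand in for matching-pursuit time-frequency features of EEG; this is not the authors' pipeline:

```python
# Sketch of the 1-against-1 (OvO) vs 1-against-all (OvR) SVM comparison
# for five mental-task classes, on synthetic stand-in features.
import numpy as np
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(250, 10))       # 250 trials, 10 MP-like features
y = rng.integers(0, 5, size=250)     # five mental-task labels
X += y[:, None] * 1.0                # shift class means to make them separable

ovo = OneVsOneClassifier(SVC(kernel="rbf")).fit(X, y)  # 1-against-1
ovr = OneVsRestClassifier(SVC(kernel="rbf")).fit(X, y) # 1-against-all
print(ovo.score(X, y), ovr.score(X, y))
```

OvO trains k(k-1)/2 binary classifiers (10 for five tasks) on smaller, pairwise subsets, while OvR trains k classifiers on the full, imbalanced one-vs-all splits, which is the trade-off the comparison in the paper probes.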

Keywords: BCI, EEG, ICA, SVM

Procedia PDF Downloads 279
29067 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to progressively more in-depth study of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. Vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impulse to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II.
ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, roof structure, envelope wall and envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations that are comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach is shown to lie in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models to cover building typologies not yet adequately covered by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
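One simplifying assumption of the kind described above is to treat the four component fragilities as statistically independent and combine them directly. A hedged sketch with lognormal fragility curves; the medians and dispersions are hypothetical, not ERMESS parameters or the proof-of-concept values:

```python
# Sketch of combining component-level fragilities (roof covering,
# roof structure, envelope wall, openings) under an independence
# assumption. Medians (m/s) and dispersions are hypothetical.
import math

def lognormal_fragility(v, median, beta):
    """P(damage | wind speed v) for a lognormal fragility curve."""
    z = math.log(v / median) / beta
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

components = {"roof covering": (35.0, 0.30), "roof structure": (55.0, 0.25),
              "envelope wall": (60.0, 0.30), "openings": (45.0, 0.35)}

v = 50.0  # gust speed, m/s
p_each = {c: lognormal_fragility(v, m, b) for c, (m, b) in components.items()}
p_any = 1 - math.prod(1 - p for p in p_each.values())
print(round(p_any, 3))  # P(at least one component damaged) at this speed
```

A consequence model then maps the resulting component damage-state probabilities to losses; correlation between component capacities, neglected here, is one of the interactions the simplifying assumptions must approximate.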

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 181