Search results for: component analysis
28589 Exploring the Use of Augmented Reality for Laboratory Lectures in Distance Learning
Authors: Michele Gattullo, Vito M. Manghisi, Alessandro Evangelista, Enricoandrea Laviola
Abstract:
In this work, we explored the use of Augmented Reality (AR) to support students in laboratory lectures in Distance Learning (DL), designing an application that proved to be ready for use in the next semester. AR can help students understand complex concepts as well as increase their motivation in the learning process. However, despite many prototypes in the literature, it is still seldom used in schools and universities. This is mainly due to its perceived advantages being limited relative to the investment costs, especially regarding the changes needed in teaching modalities. With the spread of the epidemiological emergency due to SARS-CoV-2, however, schools and universities were forced into a very rapid redefinition of consolidated processes towards forms of Distance Learning. Despite its many advantages, DL makes it impossible to carry out the practical activities that are of crucial importance in STEM ("Science, Technology, Engineering and Math") didactics. In this context, the perceived advantages of AR have increased considerably, since teachers are now more prepared for new teaching modalities, and AR allows students to carry out practical activities on their own instead of being physically present in laboratories. In this work, we designed an AR application to support engineering students in understanding assembly drawings of complex machines. Traditionally, this skill is acquired in the first years of the bachelor's degree in industrial engineering, through laboratory activities in which the teacher shows the components (e.g., bearings, screws, shafts) in a real machine and their representation in the assembly drawing. This research aims to explore the effectiveness of AR in allowing students to acquire this skill on their own, without being physically present in the laboratory. In a preliminary phase, we interviewed students to understand the main issues in learning this subject.
This survey revealed that students had difficulty identifying machine components in an assembly drawing, matching the 2D representation of a component with its real shape, and understanding the functionality of a component within the machine. We developed a mobile application using Unity3D, aiming to solve these issues, and designed it in collaboration with the course professors. Natural feature tracking was used to associate the 2D printed assembly drawing with the corresponding 3D virtual model. The application runs on students' tablets or smartphones. Users interact by selecting a component from a part list on the device; 3D representations of the components then appear on the printed drawing, coupled with 3D virtual labels for their location and identification. Users can also watch a 3D animation to learn how the components are assembled. Students evaluated the application through a questionnaire based on the System Usability Scale (SUS). The survey was given to 15 students selected among those who participated in the preliminary interview. The mean SUS score was 83 (SD 12.9) out of a maximum of 100, supporting the adoption of the AR application by teachers in their courses. Another important finding is that almost all the students stated that this application would significantly improve their comprehension when studying on their own. Keywords: augmented reality, distance learning, STEM didactics, technology in education
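The SUS scoring rule behind the reported mean of 83 is standard and easy to reproduce. A minimal sketch follows; the responses shown are made up for illustration, not the study's data.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) for one
    respondent's ten answers on a 1-5 Likert scale."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (index 0, 2, ...) contribute (r - 1);
        # even-numbered items contribute (5 - r).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# A respondent who answers 5 to every odd item and 1 to every even
# item gives the maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

The study's mean of 83 (SD 12.9) would be the average of such per-respondent scores over the 15 participants.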
Procedia PDF Downloads 130
28588 Machine Learning Techniques for COVID-19 Detection: A Comparative Analysis
Authors: Abeer A. Aljohani
Abstract:
The spread of the COVID-19 virus has been one of the most extreme pandemics across the globe. Also referred to as coronavirus, it is a contagious disease that continuously mutates into numerous variants; currently, the B.1.1.529 variant, labeled omicron, has been detected in South Africa. The huge spread of COVID-19 has affected many lives and placed exceptional pressure on healthcare systems worldwide, while everyday life and the global economy have been at stake. This research aims to predict COVID-19 disease at its initial stage in order to reduce the death count. Machine learning (ML) is nowadays used in almost every area. The large number of COVID-19 cases has placed a huge burden on hospitals as well as health workers; to reduce this burden, this paper predicts COVID-19 disease based on the symptoms and medical history of the patient. This research presents a unique architecture for COVID-19 detection using ML techniques integrated with feature dimensionality reduction. It uses a standard UCI dataset comprising the symptoms of 5434 patients and compares several supervised ML techniques within the presented architecture, which employs 10-fold cross-validation for generalization and principal component analysis (PCA) for feature reduction. Standard parameters are used to evaluate the proposed architecture, including F1-score, precision, accuracy, recall, receiver operating characteristic (ROC), and area under the curve (AUC). The results show that decision trees, random forests, and neural networks outperform all the other state-of-the-art ML techniques considered. These results can help in effectively identifying COVID-19 infection cases. Keywords: supervised machine learning, COVID-19 prediction, healthcare analytics, random forest, neural network
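The pipeline described (PCA for feature reduction, a supervised classifier, 10-fold cross-validation) can be sketched as follows. The data here are a synthetic stand-in for the symptom dataset, not the UCI data used in the paper, and the model choices are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic binary "symptom" features and a toy target label.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 20)).astype(float)
y = (X[:, :5].sum(axis=1) > 2).astype(int)

# Scale -> PCA for dimensionality reduction -> classifier,
# evaluated with 10-fold cross-validation as in the paper.
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      RandomForestClassifier(random_state=0))
scores = cross_val_score(model, X, y, cv=10)
print(round(scores.mean(), 3))
```

Swapping the final estimator lets each supervised technique be compared under the same PCA-plus-cross-validation protocol.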
Procedia PDF Downloads 94
28587 Dataset Quality Index: Development of a Composite Indicator Based on Standard Data Quality Indicators
Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros
Abstract:
Nowadays, poor data quality is considered one of the major costs of a data project. A data project with data quality awareness spends almost as much time on data quality processes, while a data project without it suffers negative impacts on financial resources, efficiency, productivity, and credibility. One of the processes that takes a long time is defining the expectations and measurements of data quality, because expectations differ according to the purpose of each data project. In particular, a big data project may involve many datasets and stakeholders, so discussing and defining quality expectations and measurements takes a long time. This study therefore aimed at developing meaningful indicators that describe the overall data quality of each dataset, enabling quick comparison and prioritization. The objectives of this study were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that can describe the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, five steps were followed to develop the Dataset Quality Index (SDQI). First, we defined standard data quality expectations. Second, we found indicators that can be measured directly on the data within the datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort required in the data preparation process and by usability. Finally, the dimensions were aggregated into the composite indicator. The results of these analyses showed that: (1) the developed indicators and measurements comprise ten useful indicators; (2) for the data quality dimensions based on statistical characteristics, the ten indicators can be reduced to four dimensions;
(3) the developed composite indicator, the SDQI, can describe the overall quality of each dataset and can separate datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall description of the data quality within datasets and a meaningful composition. It can be used to assess all the data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods, by using it for assessment in the first sprint: after passing the initial evaluation, more specific data quality indicators can be added in the next sprint. Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis
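The aggregation steps (indicators into dimensions, dimensions into a weighted composite, composite into three quality bands) can be sketched as below. The indicator groupings, weights, and cut-points here are invented for illustration; the paper derives its dimensions from factor analysis and its weights from preparation effort and usability.

```python
import numpy as np

def composite_quality(indicators, groups, weights, cuts=(0.5, 0.75)):
    """Aggregate normalized indicator scores (0-1) into dimension
    scores, then into one weighted composite, and band it into
    Poor / Acceptable / Good Quality."""
    dims = {d: np.mean([indicators[i] for i in idx])
            for d, idx in groups.items()}
    composite = sum(weights[d] * s for d, s in dims.items())
    label = ("Poor Quality" if composite < cuts[0]
             else "Acceptable Quality" if composite < cuts[1]
             else "Good Quality")
    return composite, label

# Ten illustrative indicator scores grouped into four dimensions.
scores = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6, 0.9, 0.8, 0.75, 0.85]
groups = {"completeness": [0, 1, 2], "consistency": [3, 4, 5],
          "validity": [6, 7], "uniqueness": [8, 9]}
weights = {"completeness": 0.4, "consistency": 0.3,
           "validity": 0.2, "uniqueness": 0.1}
value, label = composite_quality(scores, groups, weights)
print(round(value, 3), label)  # → 0.815 Good Quality
```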
Procedia PDF Downloads 142
28586 Thermal Stress and Computational Fluid Dynamics Analysis of Coatings for High-Temperature Corrosion
Authors: Ali Kadir, O. Anwar Beg
Abstract:
Thermal barrier coatings are among the most popular methods for providing corrosion protection in high-temperature applications, including aircraft engine systems, external spacecraft structures, rocket chambers, etc. Many different materials are available for such coatings, of which ceramics generally perform best. Motivated by these applications, the current investigation presents detailed finite element simulations of coating stress analysis for a 3-dimensional, 3-layered model of a test sample representing a typical gas turbine component scenario. Structural steel is selected for the main inner layer, titanium (Ti) alloy for the middle layer, and silicon carbide (SiC) for the outermost layer. The model dimensions are 20 mm (width) by 10 mm (height), with three 1 mm deep layers. ANSYS software is employed to conduct three types of analysis: static structural analysis, thermal stress analysis, and computational fluid dynamics erosion/corrosion analysis (via ANSYS FLUENT). The specified geometry, which corresponds exactly to the corrosion test samples, is discretized using a body-sizing meshing approach comprising mainly tetrahedron cells. Refinements were concentrated at the connection points between the layers to shift the focus towards the static effects dissipated between them. A detailed grid independence study was conducted to confirm the accuracy of the selected mesh densities. To recreate gas turbine scenarios, static loading of up to 1000 N and thermal environment conditions of up to 1000 K were imposed in the stress analysis simulations. The default solver was used to set the controls for the simulation, with one side of the model set as a fixed support while the opposite side was subjected to a tabular force of 500 and 1000 newtons. Equivalent elastic strain, total deformation, equivalent stress, and strain energy were computed for all cases.
Each analysis was repeated twice, removing one of the layers each time, to allow testing of the static and thermal effects with each of the coatings. An ANSYS FLUENT simulation was conducted to study the effect of corrosion on the model under similar thermal conditions. The momentum and energy equations were solved, and the viscous heating option was applied to better represent the thermal physics of heat transfer between the layers of the structure. A Discrete Phase Model (DPM) in ANSYS FLUENT was employed, which allows the injection of a continuous, uniform stream of air particles onto the model, thereby enabling calculation of the corrosion factor caused by hot air injection (particles prescribed a velocity of 5 m/s and a temperature of 1273.15 K). Extensive visualization of the results is provided. The simulations reveal interesting features of the coating response to realistic gas turbine loading conditions, including significantly different stress concentrations with different coatings. Keywords: thermal coating, corrosion, ANSYS FEA, CFD
Procedia PDF Downloads 137
28585 A Paradigm Shift towards Personalized and Scalable Product Development and Lifecycle Management Systems in the Aerospace Industry
Authors: David E. Culler, Noah D. Anderson
Abstract:
Integrated systems for product design, manufacturing, and lifecycle management are difficult to implement and customize. Commercial software vendors, including CAD/CAM and third-party PDM/PLM developers, create user interfaces and functionality that allow their products to be applied across many industries. The result is that systems become overloaded with functionality, difficult to navigate, and use terminology that is unfamiliar to engineers and production personnel. For example, manufacturers of automotive, aeronautical, electronics, and household products use similar but distinct methods and processes. Furthermore, each company tends to have its own preferred tools and programs for controlling work and information flow and for connecting design, planning, and manufacturing processes to business applications. This paper presents a methodology and a case study that address these issues and suggests that in the future more companies will develop personalized applications that fit the natural way their business operates. A functioning system has been implemented at a highly competitive U.S. aerospace tooling and component supplier that works with many prominent aircraft manufacturers around the world, including The Boeing Company, Airbus, Embraer, and Bombardier Aerospace. During the last three years, the program has produced significant benefits, such as the automatic creation and management of component and assembly designs (parametric models and drawings), the extensive use of lightweight 3D data, and changes to the way projects are executed from beginning to end. CATIA (CAD/CAE/CAM) and a variety of programs developed in C#, VB.Net, HTML, and SQL make up the current system. The web-based platform is facilitating collaborative work across multiple sites around the world and improving communications with customers and suppliers.
This work demonstrates that the creative use of Application Programming Interface (API) utilities, libraries, and methods is a key to automating many time-consuming tasks and linking applications together. Keywords: PDM, PLM, collaboration, CAD/CAM, scalable systems
Procedia PDF Downloads 176
28584 Adsorbed Probe Molecules on Surface for Analyzing the Properties of Cu/SnO2 Supported Catalysts
Authors: Neha Thakur, Pravin S. More
Abstract:
The interaction of CO, H2, and LPG with Cu-dosed SnO2 catalysts was studied by means of Fourier transform infrared spectroscopy (FTIR). With increasing Cu loading, pronounced and progressive red shifts of the C–O stretching frequency associated with molecular CO adsorbed on the Cu/SnO2 component were observed. This decrease in ν(CO) correlates with enhanced CO dissociation at higher temperatures on Cu-promoted SnO2 catalysts, under conditions where clean Cu is almost ineffective. In conclusion, the capability of our technique is discussed, and a way of enhancing its sensitivity is proposed. Keywords: FTIR, spectroscopy, dissociation, ν(CO)
Procedia PDF Downloads 305
28583 Hydrographic Mapping Based on the Concept of Fluvial-Geomorphological Auto-Classification
Authors: Jesús Horacio, Alfredo Ollero, Víctor Bouzas-Blanco, Augusto Pérez-Alberti
Abstract:
Rivers have traditionally been classified, assessed, and managed in terms of hydrological, chemical, and/or biological criteria. Geomorphological classifications played a secondary role in the past, although proposals like the River Styles Framework, the Catchment Baseline Survey, and the Stroud Rural Sustainable Drainage Project did incorporate geomorphology into management decision-making. In recent years, many studies have turned to the geomorphological component. The geomorphological processes and their associated forms determine the structure of a river system, and understanding these processes and forms is a critical component of the sustainable rehabilitation of aquatic ecosystems. The fluvial auto-classification approach suggests that a river is a self-built natural system, with processes and forms designed to effectively preserve its ecological function (hydrologic, sedimentological, and biological regime). Fluvial systems are formed by a wide range of elements with multiple non-linear interactions on different spatial and temporal scales. Moreover, the fluvial auto-classification concept is built using data from the river itself, so each classification developed is specific to the river studied. The variables used in the classification are specific stream power and mean grain size; a discriminant analysis showed that these variables best characterize the processes and forms. The statistical technique applied yields an individual discriminant equation for each geomorphological type. The geomorphological classification was developed using sites of high naturalness, each a control point of high ecological and geomorphological quality. Changes in the conditions of the control points will be quickly recognizable, making it easy to apply the right management measures to recover the geomorphological type. The study focused on Galicia (NW Spain), and the mapping was made by analyzing 122 control points (sites) distributed over eight river basins.
In sum, this study provides a method for fluvial geomorphological classification that works as an open and flexible tool underlying the fluvial auto-classification concept. The hydrographic mapping is the visual expression of the results: each river has a particular map according to its geomorphological characteristics. Each geomorphological type is represented by a particular type of hydraulic geometry (channel width, width-depth ratio, hydraulic radius, etc.), and an alteration of this geometry is indicative of a geomorphological disturbance, whether natural or anthropogenic. Hydrographic mapping is also dynamic, because its meaning changes if there is a modification in the specific stream power and/or the mean grain size, that is, in the values entering their equations. The researcher has to check some of the control points annually. This procedure allows the geomorphological quality of the rivers to be monitored and any alterations to be detected. The maps are useful to researchers and managers, especially for conservation work and river restoration. Keywords: fluvial auto-classification concept, mapping, geomorphology, river
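The classification step described above (one discriminant function per geomorphological type, fed by specific stream power and mean grain size) can be sketched as follows. The two types and the control-point values are invented for illustration; the study fits its discriminant equations to 122 real control points.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy control-point data: [specific stream power (W/m^2),
# mean grain size (mm)] for two hypothetical geomorphological types.
X = np.array([[150.0, 80.0], [170.0, 95.0], [160.0, 90.0],   # type A
              [20.0, 2.0], [25.0, 1.5], [30.0, 3.0]])        # type B
y = np.array([0, 0, 0, 1, 1, 1])

lda = LinearDiscriminantAnalysis().fit(X, y)
# Each type gets its own discriminant function; a new site is
# assigned to the type whose function scores highest.
print(lda.predict([[155.0, 85.0], [22.0, 2.5]]))  # → [0 1]
```

A monitored site whose predicted type changes between annual checks would flag a geomorphological disturbance in the sense described above.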
Procedia PDF Downloads 367
28582 The Application of Raman Spectroscopy in Olive Oil Analysis
Authors: Silvia Portarena, Chiara Anselmi, Chiara Baldacchini, Enrico Brugnoli
Abstract:
Extra virgin olive oil (EVOO) is a complex matrix mainly composed of fatty acids and other minor compounds, among which carotenoids are well known for their antioxidative function, a key mechanism of protection against cancer, cardiovascular diseases, and macular degeneration in humans. EVOO composition in terms of such constituents is generally the result of a complex combination of genetic, agronomical, and environmental factors. To selectively improve the quality of EVOOs, the role of each factor in its biochemical composition needs to be investigated. By selecting fruits from four different cultivars similarly grown and harvested, it was demonstrated that Raman spectroscopy, combined with chemometric analysis, is able to discriminate the different cultivars, also as a function of the harvest date, based on the relative content and composition of fatty acids and carotenoids. In particular, a correct classification of up to 94.4% of samples, according to cultivar and maturation stage, was obtained. Moreover, using gas chromatography and high-performance liquid chromatography as reference techniques, the Raman spectral features further allowed models to be built, based on partial least squares regression, that predict the relative amounts of the main fatty acids and the main carotenoids in EVOO with high coefficients of determination. Besides genetic factors, climatic parameters such as light exposure, distance from the sea, temperature, and amount of precipitation can have a strong influence on the EVOO composition of both major and minor compounds. This suggests that the Raman spectrum could act as a specific fingerprint for the geographical discrimination and authentication of EVOO. To understand the influence of environment on EVOO Raman spectra, samples from seven regions along the Italian coasts were selected and analyzed.
In particular, a dual approach was used, combining Raman spectroscopy and isotope ratio mass spectrometry (IRMS) with principal component and linear discriminant analysis. A correct classification of 82% of EVOOs based on their regional geographical origin was obtained. Raman spectra were acquired with a Super Labram spectrometer equipped with an argon laser (514.5 nm wavelength). Analyses of the stable isotope content ratio were performed using an isotope ratio mass spectrometer connected to an elemental analyzer and to a pyrolysis system. These studies demonstrate that Raman spectroscopy is a valuable and useful technique for the analysis of EVOO. In combination with statistical analysis, it makes the assessment of specific samples' content possible and allows oils to be classified according to their geographical and varietal origin. Keywords: authentication, chemometrics, olive oil, Raman spectroscopy
Procedia PDF Downloads 332
28581 The Impact of Experiential Learning on the Success of Upper Division Mechanical Engineering Students
Authors: Seyedali Seyedkavoosi, Mohammad Obadat, Seantorrion Boyle
Abstract:
The purpose of this study is to assess the effectiveness of a nontraditional experiential learning strategy in improving the success and interest of mechanical engineering students, using the Kinematics/Dynamics of Machine course as a case study. This upper-division technical course covers a wide range of topics, including mechanism and machine system analysis and synthesis, yet the complexities of ideas like acceleration, motion, and machine component relationships are hard to explain using standard teaching techniques. To solve this problem, a comprehensive design project was created that gave students hands-on experience developing, manufacturing, and testing their inventions. The main goals of the project were to improve students' grasp of machine design and kinematics, to develop problem-solving and presentation abilities, and to familiarize them with professional software. A questionnaire survey was conducted to evaluate the effect of this approach on students' performance and interest in mechanical engineering. The outcomes of the study shed light on the usefulness of nontraditional experiential learning approaches in engineering education. Keywords: experiential learning, nontraditional teaching, hands-on design project, engineering education
Procedia PDF Downloads 98
28580 Apricot (Prunus armeniaca L.) Fruit Quality: Phytochemical Attributes of Some Apricot Cultivars as Affected by Genotype and Ripening
Authors: Jamal Ayour, Mohamed Benichou
Abstract:
Fruit quality is one of the main concerns of consumers, producers, and distributors. Apricot fruits evolve rapidly during maturation, and the speed of post-harvest evolution of the ripe fruit is particularly critical in apricot. The objective of this study is to identify new cultivars with interesting quality as well as better yield, allowing production to be prolonged over time. The fruit quality of new apricot cultivars from the Marrakech region was evaluated by analyzing their physical and biochemical attributes during ripening. The results obtained clearly show great diversity in the physicochemical attributes of the selected clones. Moreover, some apricot genotypes showed contents of sugars, organic acids (vitamin C), and β-carotene significantly higher than those of the most famous Moroccan varieties: Canino, Delpatriarca, and Maoui. Principal component analysis (PCA), taking into account the maturity stage and the diversity of cultivars, made it possible to define three groups with similar physicochemical attributes. The results of this study are of great use, particularly for the selection of genotypes with better fruit quality, both for consumption and for industrial processing, and with important contents of the physicochemical attributes. Keywords: apricot, acidity, carotenoids, color, sugar, quality, vitamin C
Procedia PDF Downloads 326
28579 Improving Taint Analysis of Android Applications Using Finite State Machines
Authors: Assad Maalouf, Lunjin Lu, James Lynott
Abstract:
We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid, and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid. Keywords: android, static analysis, string analysis, taint analysis
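The idea can be illustrated with a drastically simplified model: here an explicit finite set of strings stands in for the automaton that over-approximates a variable's possible values, and the tainted pattern and values are invented for illustration.

```python
import re

TAINT_PATTERN = re.compile(r"<script>")  # illustrative tainted pattern

def is_tainted(possible_values):
    """A string variable is tainted if any value it may take still
    contains the tainted pattern. The automaton-approximated set of
    values is modeled here as an explicit finite set."""
    return any(TAINT_PATTERN.search(v) for v in possible_values)

# Abstract value of a variable fed by user input:
values = {"hello", "<script>alert(1)</script>"}
print(is_tainted(values))  # → True

# After a replace-all of the tainted pattern, the analysis can
# conclude the result is taint-free, instead of conservatively
# keeping it tainted as a sanitizer-unaware analyzer would:
sanitized = {TAINT_PATTERN.sub("", v) for v in values}
print(is_tainted(sanitized))  # → False
```

The paper's contribution is doing this reasoning on automata (which can represent infinite value sets) rather than on enumerated strings.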
Procedia PDF Downloads 182
28578 Modeling of Leaks Effects on Transient Dispersed Bubbly Flow
Authors: Mohand Kessal, Rachid Boucetta, Mourad Tikobaini, Mohammed Zamoum
Abstract:
The leakage problem in two-component fluid flow is modeled for a transient, one-dimensional, homogeneous bubbly flow, taking into account the effect of a leak located at the midpoint of the pipeline. The corresponding three conservation equations are solved numerically by an improved method of characteristics. The results obtained are discussed in terms of their physical impact on the flow parameters. Keywords: fluid transients, pipeline leaks, method of characteristics, leakage problem
Procedia PDF Downloads 480
28577 The Contemporary Visual Spectacle: Critical Visual Literacy
Authors: Lai-Fen Yang
Abstract:
In this increasingly visual world, how can we best decipher and understand the many ways that our everyday lives are organized around looking practices and the many images we encounter each day? Indeed, how we interact with and interpret visual images is a basic component of human life. Today, however, we are living in one of the most artificial, image-saturated visual cultures in human history, which makes understanding the complex construction and multiple social functions of visual imagery more important than ever before. This paper addresses our experience of a visually pervasive, mediated culture, here termed the visual spectacle. Keywords: visual culture, contemporary, images, literacy
Procedia PDF Downloads 514
28576 Psychological Factors of Readiness of Defectologists to Professional Development: On the Example of Choosing an Educational Environment
Authors: Inna V. Krotova
Abstract:
The study pays special attention to defining the psychological potential of a specialist-defectologist that determines his or her desire to increase the level of professional competence. The group comprised participants of an educational environment: the additional professional program 'Technologies of psychological and pedagogical assistance for children with complex developmental disabilities', implemented by the department of defectology and clinical psychology of the KFU jointly with the Support Fund for the Deafblind 'Co-Unity'. The purpose of our study was to identify the psychological aspects of the readiness of specialists-defectologists for professional development. The study assessed the indicators of psychological readiness, taking into account its four components: motivational, cognitive, emotional, and volitional. We used valid and standardized tests during the study. As a result of the factor analysis of the data obtained (extraction method: principal component analysis; rotation method: varimax with Kaiser normalization; rotation converged in 12 iterations), three factors with the maximum factor loadings were identified among 24 indices, and their correlation coefficients with other indicators were taken into account at significance levels p ≤ 0.001 and p ≤ 0.01. The system-making factor thus determined was 'motivation to achieve success'; it formed a correlation galaxy with two other factors, 'general internality' and 'internality in the field of achievements', as well as with the indicators 'internality in the field of family relations', 'internality in the field of interpersonal relations', and 'low self-control-high self-control' (the scale names follow those used in the analysis methods).
In conclusion, we present some proposals for taking into account the psychological model of readiness of specialists-defectologists for professional development, in order to stimulate the growth of their professional competence. The study has practical value for all providers of special education and organizations that employ specialists-defectologists, teachers-defectologists, teachers of correctional and ergotherapeutic activities, and specialists working in the field of correctional-pedagogical activity (speech therapists) with people with special needs who require genuine professional support. Keywords: psychological readiness, defectologist, professional development, psychological factors, special education, professional competence, innovative educational environment
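The extraction-plus-varimax step described in the abstract can be sketched as below. The questionnaire data are synthetic, with six indicators built to load on two latent factors; the study's 24 indices and named scales are not reproduced.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic questionnaire data: 100 respondents x 6 indicators, built
# so that indicators 0-2 and 3-5 load on two different latent factors.
rng = np.random.default_rng(2)
f = rng.normal(size=(100, 2))
X = np.column_stack(
    [f[:, 0] + rng.normal(0, 0.3, 100) for _ in range(3)] +
    [f[:, 1] + rng.normal(0, 0.3, 100) for _ in range(3)])

# Factor extraction with varimax rotation, as in the study's analysis.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
loadings = fa.components_.T          # indicators x factors
# After rotation, each indicator loads mainly on its own factor.
dominant = np.abs(loadings).argmax(axis=1)
print(dominant)
```

Inspecting the rotated loadings (and the correlations among factor scores) is what supports statements like the 'correlation galaxy' around the system-making factor.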
Procedia PDF Downloads 175
28575 A Comparison of Biosorption of Radionuclides Tl-201 on Different Biosorbents and Their Empirical Modelling
Authors: Sinan Yapici, Hayrettin Eroglu
Abstract:
The discharge of aqueous radionuclide wastes used for the diagnosis and treatment of patients in nuclear medicine can cause fatal health problems when the radionuclides and their stable daughter products mix with groundwater. Tl-201, one of the radionuclides commonly used in nuclear medicine, is a toxic substance and decays to its stable daughter product Hg-201, which is also a poisonous heavy metal: Tl-201 → Hg-201 + gamma ray [135-167 keV (12%)] + X-ray [69-83 keV (88%)]; t1/2 = 73.1 h. The purpose of the present work was to remove Tl-201 radionuclides from aqueous solution by biosorption onto solid biowastes of the food and cosmetics industries, namely prina from an olive oil plant, rose residue from a rose oil plant, and tea residue from a tea plant, and to compare the biosorption efficiencies. The effects of the biosorption temperature, initial pH of the aqueous solution, biosorbent dose, particle size, and stirring speed on the biosorption yield were investigated in a batch process. Biosorption was observed to be a rapid process, with an equilibrium time of less than 10 minutes for all the biosorbents. The efficiencies were found to be close to each other; the measured maximum efficiencies were 93.3 percent for rose residue, 94.1 for prina, and 98.4 for tea residue. In a temperature range of 283 to 313 K, adsorption decreased with increasing temperature in a similar way for all biosorbents. In a pH range of 2-10, increasing pH enhanced the biosorption efficiency up to pH = 7, beyond which the efficiency remained constant, following a similar path for all the biosorbents. Increasing the stirring speed from 360 to 720 rpm slightly enhanced the biosorption efficiency, at about the same ratio for all biosorbents.
Increasing particle size decreased the efficiency for all biosorbents; the most affected was prina, whose biosorption efficiency dropped from about 84 percent to 40 as the nominal particle size increased from 0.181 mm to 1.05 mm, while the least affected, tea residue, went down from about 97 percent to 87.5. The biosorption efficiencies of all the biosorbents increased with increasing biosorbent dose in the range of 1.5 to 15.0 g/L in a similar manner. Fitting the experimental results to the adsorption isotherms proved that the biosorption process for all the biosorbents is best represented by the Freundlich model. The kinetic analysis showed that all the processes fit very well to a pseudo-second-order rate model. The thermodynamic calculations gave ΔG values between -8636 and -5378 J mol-1 for tea residue, -5313 and -3343 for rose residue, and -5701 and -3642 for prina, with ΔH values of -39516 J mol-1, -23660, and -26190, and ΔS values of -108.8 J mol-1 K-1, -64.0, and -72.0, respectively, showing the spontaneous and exothermic character of the processes. An empirical biosorption model was derived for each biosorbent as a function of the parameters and time, taking into account the form of the kinetic model, with regression coefficients over 0.9990, where At is the biosorption efficiency at any time, Ae the equilibrium efficiency, t the adsorption period in s, ko a constant, pH the initial acidity of the biosorption medium, w the stirring speed in s-1, S the biosorbent dose in g/L, D the particle size in m, a, b, c, and e the powers of the parameters, E a constant containing the activation energy, and T the temperature in K. Keywords: radiation, biosorption, thallium, empirical modelling
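Two of the calculations above are easy to sketch: fitting the pseudo-second-order rate model via its linearized form, and recovering ΔG from ΔH and ΔS through ΔG = ΔH - TΔS. The qe and k values below are illustrative, not the paper's fitted constants; the ΔH and ΔS are the tea-residue values quoted in the abstract.

```python
import numpy as np

# Pseudo-second-order kinetics: dq/dt = k (qe - q)^2, with integrated
# form q(t) = k qe^2 t / (1 + k qe t). Its linearization
# t/q = 1/(k qe^2) + t/qe lets qe and k be read off a straight line.
qe_true, k_true = 10.0, 0.05            # illustrative values
t = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
q = k_true * qe_true**2 * t / (1 + k_true * qe_true * t)

slope, intercept = np.polyfit(t, t / q, 1)
qe_fit = 1.0 / slope
k_fit = slope**2 / intercept            # since intercept = 1/(k qe^2)
print(round(qe_fit, 2), round(k_fit, 3))  # recovers 10.0 and 0.05

# Gibbs free energy from the reported tea-residue enthalpy and entropy
# (Delta G = Delta H - T * Delta S), at the two temperature extremes:
dH, dS = -39516.0, -108.8               # J/mol and J/(mol K)
for T in (283.0, 313.0):
    print(round(dH - T * dS))           # close to the reported range
```

The computed ΔG values (about -8726 and -5462 J/mol) agree with the reported tea-residue range of -8636 to -5378 J/mol to within the rounding of the quoted ΔH and ΔS.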
Procedia PDF Downloads 265
28574 The Impact of Dust Storm Events on the Chemical and Toxicological Characteristics of Ambient Particulate Matter in Riyadh, Saudi Arabia
Authors: Abdulmalik Altuwayjiri, Milad Pirhadi, Mohammed Kalafy, Badr Alharbi, Constantinos Sioutas
Abstract:
In this study, we investigated the chemical and toxicological characteristics of PM10 in the metropolitan area of Riyadh, Saudi Arabia. PM10 samples were collected on quartz and Teflon filters during cold (December 2019–April 2020) and warm (May 2020–August 2020) seasons, including dust and non-dust events. The PM10 constituents were chemically analyzed for their metal, inorganic ion, and elemental and organic carbon (EC/OC) contents. Additionally, the PM10 oxidative potential was measured by means of the dithiothreitol (DTT) assay. Our findings revealed that the oxidative potential of the collected ambient PM10 samples was significantly higher than those measured in many urban areas worldwide. The oxidative potential was also higher during dust episodes compared to non-dust events, mainly due to higher concentrations of metals during these events. We performed Pearson correlation analysis, principal component analysis (PCA), and multi-linear regression (MLR) to identify the most significant sources contributing to the toxicity of PM10. The results of the MLR analyses indicated that the major pollution sources contributing to the oxidative potential of ambient PM10 were soil and resuspended dust emissions (identified by Al, K, Fe, and Li) (31%), followed by secondary organic aerosol (SOA) formation (traced by SO₄²⁻ and NH₄⁺) (20%), industrial activities (identified by Se and La) (19%), and traffic emissions (characterized by EC, Zn, and Cu) (17%). Results from this study underscore the impact of transported dust emissions on the oxidative potential of ambient PM10 in Riyadh and can be helpful in adopting appropriate public health policies regarding detrimental outcomes of exposure to PM10.
Keywords: ambient PM10, oxidative potential, source apportionment, Riyadh, dust episodes
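The source-apportionment step described above can be sketched numerically: oxidative potential is regressed on source-factor scores, and each source's share is taken from its regression term. The factor scores, coefficients, and source labels below are entirely synthetic stand-ins for illustration, not the study's measurements:

```python
import numpy as np

# Synthetic sketch of the MLR source-apportionment step.
rng = np.random.default_rng(0)
n = 200
# Hypothetical non-negative factor scores for four assumed sources:
scores = rng.uniform(0, 1, size=(n, 4))        # dust, SOA, industry, traffic
true_coef = np.array([3.1, 2.0, 1.9, 1.7])     # invented "true" sensitivities
dtt = scores @ true_coef + rng.normal(0, 0.05, n)  # simulated DTT activity

# Least-squares fit of DTT activity against the factor scores:
coef, *_ = np.linalg.lstsq(scores, dtt, rcond=None)
contrib = coef * scores.mean(axis=0)           # mean contribution per source
share = 100 * contrib / contrib.sum()          # percent of explained OP
for name, s in zip(["dust", "SOA", "industry", "traffic"], share):
    print(f"{name}: {s:.0f}%")
```

With real data the scores would come from the PCA step and the shares would be interpreted alongside the Pearson correlations, as the abstract describes.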
Procedia PDF Downloads 174
28573 Grand Paris Residential Real Estate as an Effective Hedge against Inflation
Authors: Yasmine Essafi Zouari, Aya Nasreddine
Abstract:
Following a long inflationary period from the post-war era to the mid-1980s (+10.1% annually), France went through a moderate inflation period between 1986 and 2001 (+2.1% annually) and even lower inflation between 2002 and 2016 (+1.4% annually). In 2022, inflation in France increased rapidly and reached 4.5% over one year in March, according to INSEE estimates. Over a long period, even low inflation has an impact on portfolio value and households' purchasing power. In such a context, inflation hedging should remain an important issue for investors. In particular, long-term investors, who are concerned with the protection of their wealth, seek to hold effective hedging assets. Considering a mixed-asset portfolio composed of housing assets (residential real estate in 150 Grand Paris communes) as well as financial assets, and using both correlation and regression analysis, the results confirm that direct housing investment acts as an inflation hedge, particularly against the unexpected component of inflation. Further, cash and bonds were found to provide, respectively, a partial hedge and an over-hedge against unexpected inflation. Stocks act as a perverse hedge against unexpected inflation and provide no significant positive hedge against expected inflation.
Keywords: direct housing, inflation, hedging ability, optimal portfolio, Grand Paris metropolis
Procedia PDF Downloads 115
28572 Seismic Assessment of Non-Structural Component Using Floor Design Spectrum
Authors: Amin Asgarian, Ghyslaine McClure
Abstract:
Experience in past earthquakes has clearly demonstrated the necessity of seismic design and assessment of Non-Structural Components (NSCs), particularly in post-disaster structures such as hospitals and power plants, which have to remain permanently functional and operational. Meeting this objective is contingent upon proper seismic performance of both structural and non-structural components. Proper seismic design, analysis, and assessment of NSCs can be attained through the generation of a Floor Design Spectrum (FDS), in a similar fashion to the target spectrum for structural components. This paper presents a methodology to generate the FDS directly from the corresponding Uniform Hazard Spectrum (UHS) (i.e., the design spectrum for structural components). The methodology is based on experimental and numerical analysis of a database of 27 real Reinforced Concrete (RC) buildings located in Montreal, Canada. The buildings were tested by Ambient Vibration Measurements (AVM), and their dynamic properties were extracted and used as part of the approach. The database comprises 12 low-rise, 10 medium-rise, and 5 high-rise buildings, most of them designated as post-disaster/emergency shelters by the city of Montreal. The buildings are subjected to 20 seismic records compatible with the UHS of Montreal, and Floor Response Spectra (FRS) are developed for every floor in two horizontal directions, considering four different damping ratios of NSCs (2, 5, 10, and 20% viscous damping). The generated FRS (approximately 132,000 curves) are statistically studied, and a methodology is proposed to generate the FDS directly from the corresponding UHS. The approach can generate the FDS for any floor level and NSC damping ratio, and it captures the effects of dynamic interaction between the primary (structural) and secondary (NSC) systems as well as higher and torsional modes of the primary structure.
These are important improvements of this approach compared to conventional methods and code recommendations. Application of the proposed approach is demonstrated through two real case-study buildings: one low-rise and one medium-rise. The proposed approach can serve as a practical and robust tool for the seismic assessment and design of NSCs, especially in existing post-disaster structures.
Keywords: earthquake engineering, operational and functional components, operational modal analysis, seismic assessment and design
Procedia PDF Downloads 213
28571 Simplified 3R2C Building Thermal Network Model: A Case Study
Authors: S. M. Mahbobur Rahman
Abstract:
Whole-building energy simulation models are widely used for predicting future energy consumption, performance diagnosis, and optimum control. The black-box approach to building energy modeling has been heavily studied in the past decade. The thermal response of a building can also be modeled using a network of interconnected resistors (R) and capacitors (C) at each node, called an R-C network. In this study, a model building, Case 600, as described in the "Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs" (ASHRAE Standard 140), is studied using a 3R2C thermal network model together with the ASHRAE clear-sky solar radiation model. Although a building energy model involves two important components, the envelope and the internal mass, the effect of building internal mass is not considered in this study. All the characteristic parameters of the building envelope are evaluated as specified for Case 600. Finally, monthly building energy consumption from the thermal network model is compared with a simple-box energy model and agrees within reasonable accuracy. From the results, a 0.6-9.4% variation in monthly energy consumption is observed because of the south-facing windows.
Keywords: ASHRAE case study, clear sky solar radiation model, energy modeling, thermal network model
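The R-C network idea can be illustrated with a few lines of code. The sketch below integrates one common 3R2C topology (wall-mass node and indoor-air node; R1 wall-to-outdoors, R2 wall-to-indoors, R3 a direct outdoor-indoor path such as windows/infiltration) with explicit Euler steps; both the topology and all parameter values are illustrative assumptions, not the Case 600 calibration used in the paper:

```python
# Illustrative 3R2C envelope network integrated with explicit Euler steps.
def simulate_3r2c(T_out, Q_in, R1, R2, R3, C1, C2, T0=20.0, dt=60.0, steps=1440):
    """Return the indoor-air temperature trace (deg C) over `steps` of `dt` s."""
    T1 = T2 = T0  # T1: wall mass node, T2: indoor air node
    history = []
    for _ in range(steps):
        dT1 = ((T_out - T1) / R1 + (T2 - T1) / R2) / C1
        dT2 = ((T1 - T2) / R2 + (T_out - T2) / R3 + Q_in) / C2
        T1 += dT1 * dt
        T2 += dT2 * dt
        history.append(T2)
    return history

# With no internal gains the indoor node relaxes toward the outdoor temperature:
trace = simulate_3r2c(T_out=5.0, Q_in=0.0,
                      R1=0.02, R2=0.01, R3=0.20,  # K/W, illustrative
                      C1=5e6, C2=5e5)             # J/K, illustrative
print(round(trace[-1], 2))
```

Summing the heating/cooling power needed to hold a setpoint over each month would give the monthly consumption figure the abstract compares against the simple-box model.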
Procedia PDF Downloads 146
28570 The Nexus between Child Marriage and Women Empowerment with Physical Violence in Two Culturally Distinct States of India
Authors: Jayakant Singh, Enu Anand
Abstract:
Background: Child marriage is widely prevalent in India. It is a form of gross human rights violation that forces a child bride to be subservient to her husband within the marital relationship. We investigated the relationship between women's age at marriage and level of empowerment, on the one hand, and physical violence experienced in the 12 months preceding the survey, on the other, among young women aged 20-24 in two culturally distinct states of India, Bihar and Tamil Nadu. Methods: We used information collected from 10,514 young married women (20-24 years) at the all-India level, 373 in Bihar and 523 in Tamil Nadu, from the third round of the National Family Health Survey. An empowerment index was calculated from parameters such as mobility, economic independence, and decision-making power using the Principal Component Analysis method. Bivariate analysis was performed primarily using the chi-square test of significance. Logistic regression was carried out to assess the effect of age at marriage and empowerment on physical violence. Results: A lower level of women's empowerment was significantly associated with physical violence in Tamil Nadu (OR = 2.38, p < 0.01), whereas child marriage (marriage before age 15) was associated with physical violence in Bihar (OR = 3.27, p < 0.001). The mean difference in age at marriage between those who experienced physical violence and those who did not was 7 months in Bihar and 10 months in Tamil Nadu. Conclusion: Culture-specific intervention may be key to reducing violence against women, as the results showed that different factors contribute to physical violence in Bihar and Tamil Nadu. Marrying at an appropriate age is perhaps protective against abuse because it equips a woman to assert her rights effectively. This calls for urgent consideration to curb both violence and child marriage, with stricter involvement of family, civil society, and the government.
In the meanwhile, physical violence should be recognized as a public health problem, and appropriate treatment for victims should be integrated into health care institutions.
Keywords: child marriage, empowerment, India, physical violence
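The PCA-based index construction mentioned in the Methods can be sketched as follows: binary survey items are centred, the covariance matrix is eigendecomposed, and each respondent's score on the first principal component serves as her composite index. The item responses below are random placeholders, not NFHS-3 data:

```python
import numpy as np

# Sketch of a PCA-based composite index from binary survey items.
rng = np.random.default_rng(1)
# 0/1 responses: mobility, economic independence, decision-making (8 respondents)
X = rng.integers(0, 2, size=(8, 3)).astype(float)

Xc = X - X.mean(axis=0)                  # centre each item
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
first_pc = eigvecs[:, -1]                # loadings of the first component
index = Xc @ first_pc                    # empowerment index per respondent
print(index.round(2))
```

In practice the index would then be split into categories (e.g. low/medium/high empowerment) before entering the chi-square and logistic regression analyses.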
Procedia PDF Downloads 313
28569 The Documentary Analysis of Meta-Analysis Research in Violence of Media
Authors: Proud Arunrangsiwed
Abstract:
The "future directions" sections in the findings of meta-analyses can provide excellent guidance for conducting future studies. This study, "The Documentary Analysis of Meta-Analysis Research in Violence of Media," draws such future directions out of 10 meta-analysis papers. The purpose of this research is to identify an appropriate research design and methodology for future research related to the topic of media violence. Further research needs to use longitudinal and experimental designs, and also needs to give careful consideration to age effects, time-spent effects, enjoyment effects, and the ordinary lifestyle of each media consumer.
Keywords: aggressive, future direction, meta-analysis, media, violence
Procedia PDF Downloads 412
28568 Data Transformations in Data Envelopment Analysis
Authors: Mansour Mohammadpour
Abstract:
Data transformation refers to the modification of every point in a data set by a mathematical function. When transformations are applied, the measurement scale of the data is modified. Data transformations are commonly employed to turn data into an appropriate form, which can serve various purposes in quantitative analysis. This study investigates the use of data transformations in Data Envelopment Analysis (DEA). Although data transformations are important options for analysis, they fundamentally alter the nature of the variable, making the interpretation of the results somewhat more complex.
Keywords: data transformation, data envelopment analysis, undesirable data, negative data
Procedia PDF Downloads 24
28567 The Application of Creative Economy in National R&D Programs of Health Technology (HT) Area in Korea
Authors: Hong Bum Kim
Abstract:
The health technology (HT) area has high growth potential because of global trends such as ageing and economic development. Owing to its high employment effect and its capability to create new business, HT is considered one of the major next-generation growth engines. In particular, convergence technologies that emerge from the fusion of HT with other technological areas are emphasized for new industry creation in Korea as part of the Creative Economy. In this study, the current status of the HT area in Korea is analyzed. The transitions in the technological areas emphasized by HT-related national R&D programs are statistically reviewed. The current level of HT-related technologies such as BT, IT, and NT is investigated in this context. The existing research system for HT convergence technology development, such as the establishment of research centers, is also analyzed. Finally, the proposed research support system, such as legislation for developing the HT area as a main component of the Creative Economy in Korea, is analyzed. This analysis of technology trends and policy will help draw a new direction for the progression of R&D programs in the HT area. Policy improvements, such as reorganization of the legal system and measures for social agreement on the burden of expenses, can be deduced from these results.
Keywords: HT, creative economy, policy, national R&D programs
Procedia PDF Downloads 389
28566 Liquefaction Potential Assessment Using Screw Driving Testing and Microtremor Data: A Case Study in the Philippines
Authors: Arturo Daag
Abstract:
The Philippine Institute of Volcanology and Seismology (PHIVOLCS) is enhancing its liquefaction hazard map towards a detailed probabilistic approach using SDS and geophysical data. The target sites for liquefaction assessment are public schools in Metro Manila. Since the target sites are in a highly urbanized setting, the objective of the project is to conduct non-destructive geotechnical studies using Screw Driving Testing (SDS) combined with geophysical methods such as refraction microtremor array (ReMi), three-component microtremor horizontal-to-vertical spectral ratio (HVSR), and ground penetrating radar (GPR). Initial tests were conducted in areas impacted by liquefaction during the Mw 6.1 earthquake in Central Luzon on April 22, 2019, in the Province of Pampanga. Numerous liquefaction events were documented in areas underlain by Quaternary alluvium and mostly covered by recent lahar deposits. Estimated SDS values showed a good correlation with actual SPT values obtained from available borehole data, confirming that SDS can be an alternative tool for liquefaction assessment that is more efficient in terms of cost and time than SPT and CPT, since borehole access may be limited in highly urbanized areas. To extend or extrapolate the SPT borehole data, non-destructive geophysical equipment was used. The three-component microtremor measurements yield a one-dimensional shear-wave velocity model of the upper 30 meters of the profile (Vs30). For the ReMi, surveys were conducted with a 12-geophone array at 6 to 8-meter spacing. From the microtremor data, the factor of safety was computed as the quotient of the Cyclic Resistance Ratio (CRR) and the Cyclic Stress Ratio (CSR). Complementary GPR was used to infer subsurface structures and groundwater conditions.
Keywords: screw driving testing, microtremor, ground penetrating radar, liquefaction
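The factor-of-safety screen mentioned above can be sketched with the widely used simplified cyclic stress ratio, CSR = 0.65 (a_max/g)(σv/σv')rd; this is the standard Seed-Idriss form, not necessarily PHIVOLCS' exact implementation, and the numeric inputs below are hypothetical. In practice CRR would come from SDS/SPT correlations rather than being assumed directly:

```python
# Sketch of a liquefaction-triggering screen: FS = CRR / CSR.
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, rd):
    """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * rd (stresses in kPa)."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def factor_of_safety(crr, csr):
    """FS < 1 indicates likely liquefaction triggering."""
    return crr / csr

# Hypothetical inputs for one depth point:
csr = cyclic_stress_ratio(a_max_g=0.25, sigma_v=100.0, sigma_v_eff=60.0, rd=0.95)
fs = factor_of_safety(crr=0.20, csr=csr)
print(round(csr, 3), round(fs, 2))
```

Repeating this over depth, with CRR profiles from the penetration data and a_max from the hazard model, gives the probabilistic map input the project describes.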
Procedia PDF Downloads 203
28565 Achieving Product Robustness through Variation Simulation: An Industrial Case Study
Authors: Narendra Akhadkar, Philippe Delcambre
Abstract:
In power protection and control products, assembly process variations due to individual parts manufactured with single- or multi-cavity tooling are a major problem. The dimensional and geometrical variations of the individual parts, in the form of manufacturing tolerances and assembly tolerances, are sources of clearance in the kinematic joints, polarization effects in the joints, and tolerance stack-up. All these variations adversely affect product quality, functionality, cost, and time-to-market. Variation simulation analysis may be used in the early product design stage to predict such uncertainties. Usually, variations exist in both manufacturing processes and materials. In tolerance analysis, the effect of the dimensional and geometrical variations of the individual parts on the functional characteristics (conditions) of the final assembled product is studied. A functional characteristic of the product may be affected by a set of interrelated dimensions (functional parameters) that usually form a geometrical closure in a 3D chain. In power protection and control products, the prerequisite is that when a fault occurs in the electrical network, the product must respond quickly to break the circuit and clear the fault; the response time is usually in milliseconds. Any failure in clearing the fault may result in severe damage to the equipment or network, and human safety is at stake. In this article, we have investigated two important functional characteristics associated with the robust performance of the product. It is demonstrated that the experimental data obtained at the Schneider Electric laboratory confirm the very good prediction capabilities of the variation simulation performed using CETOL (tolerance analysis software) in an industrial context. In particular, this study allows design engineers to better understand the critical parts in the product that need to be manufactured to good, capable tolerances.
Other parts, by contrast, are not critical for the functional characteristics (conditions) of the product, which may allow some reduction of the manufacturing cost while still ensuring robust performance. Capable tolerancing is one of the most important aspects of product and manufacturing process design. In the case of a miniature circuit breaker (MCB), the product's quality and robustness are mainly impacted by two aspects: (1) the allocation of design tolerances between the components of a mechanical assembly and (2) the manufacturing tolerances in the intermediate machining steps of component fabrication.
Keywords: geometrical variation, product robustness, tolerance analysis, variation simulation
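The tolerance stack-up idea can be illustrated with a one-dimensional Monte Carlo sketch, a much simpler stand-in for the 3-D CETOL analysis described above. Three stacked part dimensions subtracted from a housing dimension give an assembly gap; all nominals, tolerances, and spec limits below are invented for illustration:

```python
import numpy as np

# 1-D Monte Carlo tolerance stack-up: gap = housing - sum of stacked parts.
rng = np.random.default_rng(42)
n = 100_000

def sample(nominal, tol):
    """Normal part variation, tolerance interpreted as +/- 3 sigma (mm)."""
    return rng.normal(nominal, tol / 3.0, n)

gap = sample(40.0, 0.15) - (sample(12.0, 0.05)
                            + sample(15.0, 0.08)
                            + sample(12.8, 0.05))  # nominal gap: 0.2 mm
spec_lo, spec_hi = 0.05, 0.35                      # required gap range, mm
yield_frac = np.mean((gap > spec_lo) & (gap < spec_hi))
print(f"mean gap {gap.mean():.3f} mm, yield {100 * yield_frac:.1f}%")
```

Tightening only the tolerances whose variance dominates the gap distribution, and relaxing the rest, is the cost/robustness trade-off the abstract refers to.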
Procedia PDF Downloads 164
28564 A Comparative Study of Adjustment Problems of Freshmen and Senior Year Students
Authors: Shimony Agrawal
Abstract:
In this continually evolving world, change is the most imperative component of our identity. The term adjustment refers to the degree to which an individual copes with inner tensions, needs, and conflicts, and can bring coordination between internal demands and those imposed by the external world. Adjustment is a way of managing the various demands of life. Entering college is a defining moment for freshmen in their adulthood. The transition from school to college can be mentally as well as physically taxing. Students deal with a unique set of stressors when they enter college; the initial months are loaded with apprehension and attempts to fit into the new environment. Colleges and universities should ensure their students adjust to the new environment by providing help whenever necessary. The main objective of the study was a comparative analysis of adjustment level with respect to overall adjustment, gender, and living environment. This research was conducted using the Adjustment Inventory for College Students (AICS). The sample comprised 240 college-going students. The data showed that the majority of the population scored poorly on emotional adjustment, and female students faced more adjustment problems than male students. However, no significant difference was observed with respect to the students' living environment.
Keywords: adjustment, college students, freshmen year, senior year
Procedia PDF Downloads 260
28563 Considering Partially Developed Artifacts in Change Impact Analysis Implementation
Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim
Abstract:
It is important to manage changes in software to meet the evolving needs of the customer. Accepting too many changes delays completion and incurs additional cost. One type of information that helps in making this decision comes from change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed, and the class artifact is used as the source of analysis. However, these assumptions are impractical for impact analysis during the software development phase, as some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach to be used in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.
Keywords: software development, impact analysis, traceability, static analysis
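A basic class-level impact-analysis step, of the kind such approaches refine, is reachability over the dependency graph: the impact set of a changed class is everything that (transitively) depends on it. The class names and edges below are invented for illustration, and handling partially developed classes, the paper's contribution, would refine this further:

```python
from collections import deque

# BFS over reversed "depends-on" edges: who is affected if `changed` changes?
def impact_set(reverse_deps, changed):
    seen, queue = {changed}, deque([changed])
    while queue:
        cls = queue.popleft()
        for dependant in reverse_deps.get(cls, ()):
            if dependant not in seen:
                seen.add(dependant)
                queue.append(dependant)
    seen.discard(changed)  # the changed class itself is not "impacted"
    return seen

# Map each class to the classes that depend on it (hypothetical example):
reverse_deps = {
    "Order": {"Invoice", "OrderController"},
    "Invoice": {"ReportService"},
    "OrderController": set(),
}
print(sorted(impact_set(reverse_deps, "Order")))
# -> ['Invoice', 'OrderController', 'ReportService']
```

The size of this set relative to the change's benefit is one input to the accept/reject decision the abstract describes.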
Procedia PDF Downloads 608
28562 Analysis of Public Space Usage Characteristics Based on Computer Vision Technology - Taking Shaping Park as an Example
Authors: Guantao Bai
Abstract:
Public space is an indispensable component of the urban built environment, and evaluating its usage characteristics more accurately can help improve its spatial quality. Compared to traditional survey methods, computer vision technology based on deep learning has advantages such as dynamic observation and low cost. This study takes the public space of Shaping Park as an example and, based on deep learning computer vision technology, processes and analyzes image data of the public space to obtain its usage and spatiotemporal characteristics. The research found that spontaneous activities in the public space occur at relatively random times with a short average duration, while social activities occur at relatively stable times with a longer average duration. Computer vision technology based on deep learning can effectively describe the usage characteristics of the study area, making up for the shortcomings of traditional research methods and providing support for creating good public space.
Keywords: computer vision, deep learning, public spaces, usage features
Procedia PDF Downloads 72
28561 Determination of the Volatile Organic Compounds, Antioxidant and Antimicrobial Properties of Microwave-Assisted Green Extracted Ficus Carica Linn Leaves
Authors: Pelin Yilmaz, Gizemnur Yildiz Uysal, Elcin Demirhan, Belma Ozbek
Abstract:
The edible fig plant, Ficus carica Linn, belongs to the Moraceae family, and its leaves are mainly considered agricultural waste after harvesting. It has been demonstrated in the literature that fig leaves have appealing properties, such as high contents of vitamins, fiber, amino acids, organic acids, and phenolic or flavonoid compounds. The extraction of these valuable products has therefore gained importance. Microwave-assisted extraction (MAE) is a method that uses microwave energy to heat the solvent, thereby transferring the bioactive compounds from the sample to the solvent; its main advantage is the rapid extraction of bioactive compounds. In the present study, MAE was applied to extract bioactive compounds from Ficus carica L. leaves, and the effects of microwave power (180-900 W), extraction time (60-180 s), and solvent-to-sample ratio (10-30 mL/g) on the antioxidant properties of the leaves were investigated. The volatile organic compound profile was then determined at the specified extraction point. Additionally, antimicrobial studies were carried out to determine the minimum inhibitory concentration of the microwave-extracted leaves. According to the experimental data, the highest antimicrobial activity was obtained at 540 W, 180 s, and a solvent-to-sample ratio of 20 mL/g. The volatile organic compound profile showed that the main compound was isobergapten, a member of the furanocoumarin family exhibiting anticancer, antioxidant, and antimicrobial activity besides promoting bone health. Acknowledgments: This work has been supported by Yildiz Technical University Scientific Research Projects Coordination Unit under project number FBA-2021-4409. The authors would like to acknowledge the financial support from Tubitak 1515 - Frontier R&D Laboratory Support Programme.
Keywords: Ficus carica Linn leaves, volatile organic component, GC-MS, microwave extraction, isobergapten, antimicrobial
Procedia PDF Downloads 82
28560 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of verification and validation helps in qualifying the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modeling team and the process design and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used.
It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for its intended purpose, and the difficulties faced while coordinating the various activities.
Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state
Procedia PDF Downloads 266