Search results for: thin film processing
1458 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
Detection of anomalies caused by the presence of contaminants, also known as early detection, in water treatment plants has become a critical point that deserves in-depth study for improvement and adaptation to current requirements. The design of these systems requires detailed analysis and real-time processing of the data, so it is necessary to apply statistical methods appropriate to the data generated, such as Spearman’s correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment can be considered a vital step toward developing advanced machine-learning models that allow optimized real-time data management in early detection systems for water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the contaminant glyphosate in a single-pass system. The interaction between the initial glyphosate concentration and the location of the sensors on the reading of the reported parameters was studied.
Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
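As a minimal illustration of one of the statistical methods named above, the following sketch computes Spearman's rank correlation in pure Python; the probe readings and glyphosate levels are invented for illustration and are not data from the study.

```python
# Hedged sketch: Spearman's rho between a hypothetical sensor reading and
# glyphosate concentration. All values below are illustrative assumptions.

def ranks(values):
    """Assign average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

conductivity = [410, 455, 480, 520, 570]    # hypothetical probe readings
glyphosate_ppm = [0.0, 0.5, 1.0, 2.0, 5.0]  # hypothetical spike levels
# A perfectly monotone pair gives rho = 1.0
print(round(spearman(conductivity, glyphosate_ppm), 3))  # -> 1.0
```

A correlation near ±1 flags a parameter worth feeding into a downstream early-detection model; values near 0 suggest the probe carries little information about the contaminant.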
Procedia PDF Downloads 123
1457 Improving Perceptual Reasoning in School Children through Chess Training
Authors: Ebenezer Joseph, Veena Easvaradoss, S. Sundar Manoharan, David Chandran, Sumathi Chandrasekaran, T. R. Uma
Abstract:
Perceptual reasoning is the ability that incorporates fluid reasoning, spatial processing, and visual-motor integration. Several theories of cognitive functioning emphasize the importance of fluid reasoning. The ability to manipulate abstractions and rules and to generalize is required for reasoning tasks. This study, funded by the Cognitive Science Research Initiative, Department of Science and Technology, Government of India, analyzed the effect of one year of chess training on the perceptual reasoning of children. A pretest–posttest with control group design was used, with 43 (28 boys, 15 girls) children in the experimental group and 42 (26 boys, 16 girls) children in the control group. The sample was selected from children of both genders studying in two private schools in South India (grades 3 to 9). The experimental group underwent weekly 1-hour chess training for 1 year. Perceptual reasoning was measured by three subtests of WISC-IV INDIA. Pre-equivalence of means was established. Further statistical analyses revealed that the experimental group showed statistically significant improvement in perceptual reasoning compared to the control group. The present study clearly establishes a correlation between chess learning and perceptual reasoning. If perceptual reasoning can be enhanced in children, it could possibly result in the improvement of executive functions as well as the scholastic performance of the child.
Keywords: chess, cognition, intelligence, perceptual reasoning
Procedia PDF Downloads 357
1456 A Life Cycle Assessment (LCA) of Aluminum Production Process
Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour
Abstract:
The production of aluminium alloys and ingots, starting from the processing of alumina to aluminium and ending with the final cast product, was studied using a Life Cycle Assessment (LCA) approach. The studied aluminium supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminium metal were investigated. The impact of the aluminium production was assessed in eight impact categories. The results showed that the power plant had the highest impact in nearly all of the impact categories; only in the case of Human Toxicity Potential (HTP) did the reduction plant have the highest impact, and in Marine Aquatic Eco-Toxicity Potential (MAETP) the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of Acidification Potential (AP). The carbon plant had a positive impact on the environment in terms of Eutrophication Potential (EP) due to the production of clean water in the process. The natural-gas-based power plant used in the case study had 8.4 times less negative impact on the environment than the heavy-fuel-based power plant and 10.7 times less than the hard-coal-based power plant.
Keywords: life cycle assessment, aluminium production, supply chain, ecological impacts
Procedia PDF Downloads 532
1455 Random Forest Classification for Population Segmentation
Authors: Regina Chua
Abstract:
To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an ensemble of individual decision trees that yields a predicted segment with robust precision and recall scores compared to a single tree. A random 70-30 stratified split was used for training the algorithm, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling
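The workflow described above can be sketched with scikit-learn on synthetic data. The split, depth limit, and feature pruning below mirror the abstract (70-30 stratified split, depth 10, importance-based reduction to 20 features), but the dataset and its dimensions are illustrative assumptions, not the authors' survey data.

```python
# Hedged sketch of a depth-limited Random Forest with stratified splitting
# and impurity-based feature pruning. All sizes are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=700, n_features=50, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

# 70-30 stratified split, as in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0)
rf.fit(X_tr, y_tr)

# Keep only the 20 most important features, mirroring the 254 -> 20 reduction
top20 = rf.feature_importances_.argsort()[::-1][:20]
rf_small = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0)
rf_small.fit(X_tr[:, top20], y_tr)
score = rf_small.score(X_te[:, top20], y_te)
print(round(score, 2))
```

The pruned model can then be exported as a "formulaic" worksheet version by walking each tree's split thresholds, which is what makes the approach usable outside Python.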
Procedia PDF Downloads 94
1454 Production and Characterization of Ce3+: Si2N2O Phosphors for White Light-Emitting Diodes
Authors: Alparslan A. Balta, Hilmi Yurdakul, Orkun Tunckan, Servet Turan, Arife Yurdakul
Abstract:
Si2N2O (sinoite) is an inorganic oxynitride material that is a promising phosphor candidate for white light-emitting diodes (WLEDs). However, there is currently limited knowledge on the synthesis of Si2N2O for this purpose. Here, to the best of the authors’ knowledge, we report for the first time the production of Si2N2O-based phosphors from CeO2, SiO2, and Si3N4 as the main starting powders, with a Li2O sintering additive, through the spark plasma sintering (SPS) route. The processing parameters, e.g., pressure, temperature, and sintering time, were optimized to reach monophase Si2N2O-containing samples. The lattice parameter, crystallite size, and amounts of the phases formed were characterized in detail by X-ray diffraction (XRD). Grain morphology, particle size, and distribution were analyzed by scanning and transmission electron microscopy (SEM and TEM). Cathodoluminescence (CL) in SEM and photoluminescence (PL) analyses were conducted on the samples to determine the excitation and emission characteristics of Ce3+-activated Si2N2O. Results showed that the Si2N2O phase in a maximum ratio of 90% was obtained by sintering for 15 minutes at 1650 °C under 30 MPa pressure. Based on the SEM-CL and PL measurements, the Ce3+: Si2N2O phosphor shows a broad emission band between 400-700 nm that corresponds to white light. The present research was supported by TUBITAK under project number 217M667.
Keywords: cerium, oxynitride, phosphors, sinoite, Si₂N₂O
Procedia PDF Downloads 108
1453 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be seen through satellite images in near real time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format; the data are then pre-processed, fed into the proposed algorithm, and the obtained result is analyzed. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification. The dataset used is the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50
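The atrous (dilated) convolution at the heart of DeepLabv3 can be illustrated without any deep learning framework: the kernel taps are simply spaced `rate` samples apart, widening the receptive field without adding weights. The 1-D sketch below is a conceptual stand-in for the 2-D operation, with invented signal and kernel values.

```python
# Hedged sketch: 1-D atrous (dilated) convolution. With dilation rate r and a
# kernel of length k, the receptive field grows to (k-1)*r + 1 while the
# number of weights stays at k. DeepLabv3's ASPP applies the 2-D analogue
# at several rates in parallel.

def atrous_conv1d(signal, kernel, rate):
    """Valid-mode 1-D convolution with dilation `rate` (no padding)."""
    span = (len(kernel) - 1) * rate  # receptive field minus one
    out = []
    for i in range(len(signal) - span):
        out.append(sum(kernel[k] * signal[i + k * rate]
                       for k in range(len(kernel))))
    return out

x = [1, 2, 3, 4, 5, 6, 7, 8]
k = [1, 0, -1]  # simple difference kernel, illustrative
print(atrous_conv1d(x, k, rate=1))  # -> [-2, -2, -2, -2, -2, -2]
print(atrous_conv1d(x, k, rate=2))  # -> [-4, -4, -4, -4]
```

Raising the rate from 1 to 2 doubles the span each output sees (3 samples vs 5) with the same three weights, which is exactly how DeepLabv3 captures multi-scale context cheaply.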
Procedia PDF Downloads 140
1452 New Findings on the Plasma Electrolytic Oxidation (PEO) of Aluminium
Authors: J. Martin, A. Nominé, T. Czerwiec, G. Henrion, T. Belmonte
Abstract:
Plasma electrolytic oxidation (PEO) is a particular electrochemical process used to produce protective oxide ceramic coatings on light-weight metals (Al, Mg, Ti). When applied to aluminium alloys, the resulting PEO coatings exhibit improved wear and corrosion resistance because thick, hard, compact, and adherent crystalline alumina layers can be achieved. Several investigations have been carried out to improve the efficiency of the PEO process, and one particular way consists of tuning a suitable electrical regime. Despite the considerable interest in this process, there is still no clear understanding of the underlying discharge mechanisms that make metal oxidation possible up to hundreds of µm through the ceramic layer. A key parameter that governs the PEO process is the numerous short-lived micro-discharges (micro-plasmas in liquid) that occur continuously over the processed surface when the high applied voltage exceeds the critical dielectric breakdown value of the growing ceramic layer. By using a bipolar pulsed current to supply the electrodes, we previously observed that micro-discharges are delayed with respect to the rising edge of the anodic current. Nevertheless, the origin of this phenomenon is still not clear and needs more systematic investigation. The aim of the present communication is to identify the relationship that exists between this delay and the mechanisms responsible for the oxide growth. For this purpose, the delay of micro-discharge ignition is investigated as a function of various electrical parameters such as the current density (J), the current pulse frequency (F), and the anodic-to-cathodic charge quantity ratio (R = Qp/Qn) delivered to the electrodes. The PEO process was conducted on Al2214 aluminium alloy substrates in a solution containing potassium hydroxide (KOH) and sodium silicate diluted in deionized water.
The light emitted from micro-discharges was detected by a photomultiplier, and the micro-discharge parameters (number, size, lifetime) were measured during the process by means of ultra-fast video imaging (125,000 frames/s). SEM observations and roughness measurements were performed to characterize the morphology of the elaborated oxide coatings, while XRD was carried out to evaluate the amount of the corundum (α-Al₂O₃) phase. Results show that, whatever the applied current waveform, the delay of micro-discharge appearance increases as the process goes on. Moreover, the delay is shorter when the current density J (A/dm²), the current pulse frequency F (Hz), and the ratio of charge quantity R are high. It also appears that shorter delays are associated with stronger micro-discharges (localized, long, and large micro-discharges), which have a detrimental effect on the elaborated oxide layers (thin and porous). On the basis of these results, a model for the growth of the PEO oxide layers will be presented and discussed. The experimental results support a mechanism of electrical charge accumulation at the oxide surface/electrolyte interface that takes place until dielectric breakdown occurs and micro-discharges appear.
Keywords: aluminium, micro-discharges, oxidation mechanisms, plasma electrolytic oxidation
Procedia PDF Downloads 264
1451 Segmentation of Liver Using Random Forest Classifier
Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir
Abstract:
Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To deeply study and diagnose the liver, segmentation is performed to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient liver shape and size variability. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the Random Forest classifier provides better segmentation results with respect to accuracy and speed. We validated our results using various techniques, and they show above 89% accuracy in all cases.
Keywords: CT images, image validation, random forest, segmentation
Procedia PDF Downloads 313
1450 Bioavailability of Iron in Some Selected Fiji Foods using In vitro Technique
Authors: Poonam Singh, Surendra Prasad, William Aalbersberg
Abstract:
Iron is the most essential trace element in human nutrition. Its deficiency has serious health consequences and is a major public health threat worldwide. The deficiencies commonly reported in the Fiji population are of Fe, Ca, and Zn. It has also been reported that 40% of women in Fiji are iron deficient. Therefore, we have been studying the bioavailability of iron in commonly consumed Fiji foods. To study the bioavailability, it is essential to assess the iron contents in raw foods. This paper reports the iron contents and their bioavailability in foods commonly consumed by the multicultural population of Fiji. The food samples (rice, breads, wheat flour, and breakfast cereals) were analyzed by atomic absorption spectrophotometry for total iron and its bioavailability. The white rice had the lowest total iron, 0.10±0.03 mg/100g, but a high bioavailability of 160.60±0.03%. The brown rice had 0.20±0.03 mg/100g total iron but was 85.00±0.03% bioavailable. The white and brown breads showed the highest iron bioavailability, at 428.30±0.11 and 269.35±0.02%, respectively. The Weetabix and the rolled oats had iron contents of 2.89±0.27 and 1.24±0.03 mg/100g with bioavailability of 14.19±0.04 and 12.10±0.03%, respectively. The most commonly consumed normal wheat flour had 0.65±0.00 mg/100g iron, while the whole meal and the Roti flours had 2.35±0.20 and 0.62±0.17 mg/100g iron, showing bioavailability of 55.38±0.05, 16.67±0.08, and 12.90±0.00%, respectively. The low bioavailability of iron in certain foods may be due to the presence of phytates/oxalates, processing/storage conditions, the cooking method, or interaction with other minerals present in the food samples.
Keywords: iron, bioavailability, Fiji foods, in vitro technique, human nutrition
Procedia PDF Downloads 529
1449 Functional Gene Expression in Human Cells Using Linear Vectors Derived from Bacteriophage N15 Processing
Authors: Kumaran Narayanan, Pei-Sheng Liew
Abstract:
This paper adapts the bacteriophage N15 protelomerase enzyme to assemble linear chromosomes as vectors for gene expression in human cells. Phage N15 has the unique ability to replicate as a linear plasmid with telomeres in E. coli during the prophage stage of its life cycle. The virus-encoded protelomerase enzyme cuts its circular genome and caps its ends to form hairpin telomeres, resulting in a linear, human-chromosome-like structure in E. coli. In mammalian cells, however, no enzyme with TelN-like activities has been found. In this work, we show for the first time the transfer of the protelomerase from phage into human and mouse cells and demonstrate recapitulation of its activity in these hosts. The function of this enzyme is assayed by demonstrating cleavage of its target DNA, followed by detecting telomere formation based on its resistance to recBCD enzyme digestion. We show protelomerase expression persists for at least 60 days, which indicates limited silencing of its expression. Next, we show that an intact human β-globin gene delivered on this linear chromosome accurately retains its expression in the human cellular environment for at least 60 hours, demonstrating its stability and potential as a vector. These results demonstrate that the N15 protelomerase is able to function in mammalian cells to cut and heal DNA to create telomeres, which provides a new tool for creating novel structures by DNA resolution in these hosts.
Keywords: chromosome, beta-globin, DNA, gene expression, linear vector
Procedia PDF Downloads 192
1448 Influence of κ-Casein Genotype on Milk Productivity of Latvia Local Dairy Breeds
Authors: S. Petrovska, D. Jonkus, D. Smiltiņa
Abstract:
κ-casein is one of the milk proteins that are very important for milk processing. Genotypes of κ-casein affect milk yield as well as fat and protein content. The main factors affecting milk yield and composition in local Latvian dairy breeds are analyzed in this research. Data were collected from 88 Latvian Brown and 82 Latvian Blue cows in 2015. The AA genotype frequency was 0.557 in the Latvian Brown and 0.232 in the Latvian Blue breed; the BB genotype frequency was 0.034 in the Latvian Brown and 0.207 in the Latvian Blue breed. The highest milk yield was observed in Latvian Brown (5131.2 ± 172.01 kg), which also had significantly higher fat content and fat yield (p < 0.05). Significant differences between κ-casein genotypes were not found in Latvian Brown, but the highest milk yield (5057 ± 130.23 kg), protein content (3.42 ± 0.03%), and protein yield (171.9 ± 4.34 kg) were observed with the AB genotype. Significantly higher fat content was observed in the Latvian Blue breed with the BB genotype (4.29 ± 0.17%) compared with the AA genotype (3.42 ± 0.19%). A similar tendency was found in protein content: 3.27 ± 0.16% with the BB genotype and 2.59 ± 0.16% with the AA genotype (p < 0.05). Milk yield increased with parity. We did not observe a major tendency of change in milk fat and protein content according to parity.
Keywords: dairy cows, κ-casein, milk productivity, polymorphism
Procedia PDF Downloads 270
1447 A Review on Medical Image Registration Techniques
Authors: Shadrack Mambo, Karim Djouani, Yskandar Hamam, Barend van Wyk, Patrick Siarry
Abstract:
This paper discusses current trends in medical image registration techniques and addresses the need to provide a solid theoretical foundation for research endeavours. A methodological analysis and synthesis of quality literature was done, providing a platform for developing a good foundation for research in this field, which is crucial in understanding the existing levels of knowledge. Research on medical image registration techniques assists clinical and medical practitioners in the diagnosis of tumours and lesions in anatomical organs, thereby enhancing fast and accurate curative treatment of patients. Out of these considerations, the aim of this paper is to enhance the scientific community’s understanding of the current status of research in medical image registration techniques and also to communicate the contribution of this research to the field of image processing. The gaps identified in current techniques can be closed by the use of artificial neural networks, which form learning systems designed to minimise an error function. The paper also suggests several areas of future research in image registration.
Keywords: image registration techniques, medical images, neural networks, optimisation, transformation
Procedia PDF Downloads 178
1446 Density Determination of Liquid Niobium by Means of Ohmic Pulse-Heating for Critical Point Estimation
Authors: Matthias Leitner, Gernot Pottlacher
Abstract:
Experimental determination of critical point data, such as critical temperature, critical pressure, critical volume, and critical compressibility, of high-melting metals such as niobium is very rare due to the outstanding experimental difficulties in reaching the necessary extreme temperature and pressure regimes. Experimental techniques to achieve such extreme conditions include diamond anvil devices, two-stage gas guns, or metal samples hit by explosively accelerated flyers. Electrical pulse-heating under increased pressure is another choice. This technique heats thin wire samples of 0.5 mm diameter and 40 mm length from room temperature to melting, and then further to the end of the stable phase, the spinodal line, within several microseconds. When crossing the spinodal line, the sample explodes and reaches the gaseous phase. In our laboratory, pulse-heating experiments can be performed under variation of the ambient pressure from 1 to 5000 bar and allow a direct determination of critical point data for low-melting, but not for high-melting, metals. However, the critical point can also be estimated by extrapolating the liquid-phase density according to theoretical models. A reasonable prerequisite for the extrapolation is the existence of data that cover as much as possible of the liquid phase and at the same time exhibit small uncertainties. Ohmic pulse-heating was therefore applied to determine the thermal volume expansion, and from that the density, of niobium over the entire liquid phase. As a first step, experiments under ambient pressure were performed; the second step will be to perform experiments under high-pressure conditions. During the heating process, shadow images of the expanding sample wire were captured at a frame rate of 4 × 10⁵ fps to monitor the radial expansion as a function of time. Simultaneously, the sample radiance was measured with a pyrometer operating at a mean effective wavelength of 652 nm.
To increase the accuracy of temperature deduction, the spectral emittance in the liquid phase is also taken into account. Due to the high heating rates of about 2 × 10⁸ K/s, longitudinal expansion of the wire is inhibited, which implies an increased radial expansion. As a consequence, measuring the temperature-dependent radial expansion is sufficient to deduce density as a function of temperature. This is accomplished by evaluating the full widths at half maximum of the cup-shaped intensity profiles that are calculated from each shadow image of the expanding wire. Relating these diameters to the diameter obtained before the pulse-heating starts, the temperature-dependent volume expansion is calculated. With the help of the known room-temperature density, the volume expansion is then converted into density data. The so-obtained liquid density behavior is compared to existing literature data and provides another independent source of experimental data. In this work, the newly determined off-critical liquid-phase density was, in a second step, utilized as input data for the estimation of niobium’s critical point. The approach used heuristically takes into account the crossover from mean-field to Ising behavior, as well as the non-linearity of the phase diagram’s diameter.
Keywords: critical point data, density, liquid metals, niobium, ohmic pulse-heating, volume expansion
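The density reduction described above amounts to a simple relation: with longitudinal expansion inhibited, the volume scales with the square of the wire diameter, so rho(T) = rho_0 * (d_0 / d(T))^2. A minimal sketch, assuming the handbook room-temperature density of niobium and illustrative FWHM diameter values:

```python
# Hedged sketch of the density reduction: volume ~ diameter squared when the
# wire cannot expand lengthwise. The diameter values are illustrative, not
# measured data from the experiment.

RHO_0 = 8570.0  # kg/m^3, room-temperature density of niobium (handbook value)
D_0 = 0.5       # mm, wire diameter before pulse-heating

def density_from_diameter(d_mm, rho_0=RHO_0, d_0=D_0):
    """Density from the FWHM diameter of the wire's shadow image."""
    return rho_0 * (d_0 / d_mm) ** 2

# A 10% radial expansion lowers the density by a factor of 1.1**2 = 1.21
print(round(density_from_diameter(0.55), 1))  # -> 7082.6
```

In the experiment, each shadow-image frame yields one diameter, and pairing it with the simultaneous pyrometer temperature gives one (T, rho) point on the liquid-phase density curve.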
Procedia PDF Downloads 219
1445 Impact of Modifying the Surface Materials on the Radiative Heat Transfer Phenomenon
Authors: Arkadiusz Urzędowski, Dorota Wójcicka-Migasiuk, Andrzej Sachajdak, Magdalena Paśnikowska-Łukaszuk
Abstract:
Due to the impact of climate change and the inevitable need to reduce greenhouse gases, the need to use low-carbon and sustainable construction has increased. In this work, it is investigated how the surface texture of building materials and the radiative heat transfer phenomenon in flat multilayer partitions are correlated. Attempts to measure the surface emissivity are made; however, the trustworthiness of measurement results remains a concern, since sensor size and thickness are common problems. This paper presents an experimental method to study surface emissivity using self-constructed thermal sensors and thermal imaging techniques. The surface of the building materials was modified by mechanical and chemical treatment, affecting the reduction of the emissivity. For testing the shaped surface of the materials and mapping their three-dimensional structure, scanning profilometry was used in a laboratory. By comparing the results of laboratory tests with the analysis performed in 3D computational fluid dynamics software, it can be shown that a change in the surface coverage of materials affects the heat transport by radiation between layers. Using a dedicated data processing approach and properly constructed temperature sensors, the influence of the surface emissivity on the phenomenon of radiation and heat transport in the entire partition can be determined.
Keywords: heat transfer, surface roughness, surface emissivity, radiation
Procedia PDF Downloads 97
1444 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research’s integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by spreading the datasets across a sharded database and indexing the individual shards. Analysis of the connections between governmental activities, poverty levels, and post-pandemic well-being is the key goal. We want to evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels. We will do this by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how big data and NoSQL databases may be used to address problems in global health.
Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being
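A server-free sketch of the two ideas recommended above, hashed shard routing plus a per-shard index on the shard key, is given below in plain Python. In a real MongoDB deployment, sh.shardCollection() and createIndex() provide this behavior; the stand-in hash function, field names, and documents here are all illustrative.

```python
# Hedged sketch: route documents to shards by hashing a shard key, and keep a
# per-shard index on that key so a lookup touches only one shard. This is a
# toy model of what MongoDB's balancer and B-tree indexes do for real.

NUM_SHARDS = 3

def shard_for(key):
    """Deterministic stand-in for a hashed shard key (not MongoDB's hash)."""
    return sum(key.encode()) % NUM_SHARDS

shards = [{"docs": [], "index": {}} for _ in range(NUM_SHARDS)]

def insert(doc):
    s = shards[shard_for(doc["region"])]
    s["index"].setdefault(doc["region"], []).append(len(s["docs"]))
    s["docs"].append(doc)

def find_by_region(region):
    s = shards[shard_for(region)]  # only one shard is queried
    return [s["docs"][i] for i in s["index"].get(region, [])]

for doc in [{"region": "ON", "cases": 10}, {"region": "BC", "cases": 4},
            {"region": "ON", "cases": 7}]:
    insert(doc)

print([d["cases"] for d in find_by_region("ON")])  # -> [10, 7]
```

The point of the sketch is the access pattern: as the dataset grows, both the routing step and the index lookup stay cheap, which is why the paper recommends combining the two techniques.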
Procedia PDF Downloads 70
1443 Evaluating Effectiveness of Training and Development Corporate Programs: The Russian Agribusiness Context
Authors: Ekaterina Tikhonova
Abstract:
This research aims to evaluate the effectiveness of training and development (T&D) using the example of two T&D programs for top executive management run in 2012 and 2015-2016 at Komos Group. This study was commissioned to research the effectiveness of two similar corporate T&D programs within one company in two periods of time (2012 and 2015-2016) by evaluating the programs’ effectiveness using Kirkpatrick’s four-level model of evaluating T&D programs and calculating ROI with Phillips’ formula as an instrument for T&D program measurement. The research investigates the correlation of two figures: the calculated ROI and the rating percentage scale of ROI implementation (Wagle’s scale). The study includes an assessment of 360-degree feedback (Kirkpatrick's model) and Phillips’ ROI methodology, which provides a step-by-step process for collecting, summarizing, and processing the data. The data were collected from the company accounting data, the HR budgets, MCFO, and the company annual reports for the research periods. All analyzed data and reports are organized and presented in the form of tables, charts, and graphs. The paper also gives a brief description of some constraints of the research considered. After the ROI calculation, the study reveals that the ROI falls within the average-implementation band (65% to 75%) on Wagle’s scale, which can be considered a positive outcome. The paper also gives some recommendations on how to use ROI in practice and describes the main benefits of ROI implementation.
Keywords: ROI, organizational performance, efficacy of T&D program, employee performance
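Phillips' ROI formula mentioned above is net programme benefits divided by programme costs, expressed as a percentage. A minimal sketch with invented monetary figures (not the company's data):

```python
# Hedged sketch of the Phillips ROI formula:
#   ROI (%) = (programme benefits - programme costs) / programme costs * 100
# The figures below are illustrative, not Komos Group's actual numbers.

def phillips_roi(benefits, costs):
    """Return ROI as a percentage of programme costs."""
    return (benefits - costs) / costs * 100

# Benefits of 1.7M against costs of 1.0M give an ROI of 70%,
# which falls in the 'average implementation' band (65-75%) on Wagle's scale.
print(phillips_roi(1_700_000, 1_000_000))  # -> 70.0
```

An ROI of 100% would mean the programme returned double its cost; the 65-75% band reported in the study signals a solidly positive but not exceptional return.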
Procedia PDF Downloads 250
1442 Optimization of Ultrasound Assisted Extraction and Characterization of Functional Properties of Dietary Fiber from Oat Cultivar S2000
Authors: Muhammad Suhail Ibrahim, Muhammad Nadeem, Waseem Khalid, Ammara Ainee, Taleeha Roheen, Sadaf Javaria, Aftab Ahmed, Hira Fatima, Mian Nadeem Riaz, Muhammad Zubair Khalid, Isam A. Mohamed Ahmed J, Moneera O. Aljobair
Abstract:
This study was executed to explore the efficacy of ultrasound-assisted extraction of dietary fiber from oat cultivar S2000. The extraction variables (time, temperature, and amplitude) were optimized using response surface methodology (RSM) with a Box-Behnken design (BBD). The effects of time, temperature, and amplitude were studied at three levels. It was observed that time and temperature exerted more impact on extraction efficiency than amplitude. The highest yields of total dietary fiber (TDF), soluble dietary fiber (SDF), and insoluble dietary fiber (IDF) fractions were observed under ultrasound processing for 20 min at 40 °C with 80% amplitude. Characterization of the extracted dietary fiber showed that it had better crystallinity, thermal properties, and a good fibrous structure. It also showed better functional properties compared to traditionally extracted dietary fiber. Furthermore, dietary fibers from oats may offer high-value utilization and the expansion of comprehensive utilization in functional food and nutraceutical development.
Keywords: extraction, ultrasonication, response surface methodology, box behnken design
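The three-factor Box-Behnken design referred to above can be generated mechanically: each pair of factors takes its four ±1 corners while the third factor stays at the centre level, plus centre-point replicates. A sketch assuming the standard 15-run layout (the authors' exact run schedule is not given):

```python
# Hedged sketch of a Box-Behnken design in coded levels (-1, 0, +1) for the
# three factors of the abstract: time, temperature, amplitude. The run count
# (12 edge midpoints + 3 centre replicates) is the textbook BBD layout.
from itertools import combinations, product

def box_behnken(n_factors, centre_runs=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([0] * n_factors for _ in range(centre_runs))
    return runs

design = box_behnken(3)  # factors: time, temperature, amplitude
print(len(design))       # -> 15 runs
```

Fitting a second-order polynomial to the responses measured at these 15 runs is what lets RSM locate the optimum (here, 20 min at 40 °C with 80% amplitude) without a full three-level factorial of 27 runs.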
Procedia PDF Downloads 51
1441 Role of mHealth in Effective Response to Disaster
Authors: Mohammad H. Yarmohamadian, Reza Safdari, Nahid Tavakoli
Abstract:
In recent years, many countries have suffered various natural disasters, and disaster response continues to face challenges in the health care sector in all countries. Information and communication management is a significant challenge at the disaster scene. During the last decades, rapid advances in information technology have made it possible to manage information effectively and improve communication in health care settings. Information technology is a vital enabler of effective response to disasters and emergencies: if an efficient ICT-based health information system is available, it is highly valuable in such situations. In particular, mobile technology represents a computing infrastructure that is accessible, convenient, inexpensive, and easy to use. Most projects have not yet reached the deployment stage, but evaluation exercises show that mHealth should allow faster processing and transport of patients, improved accuracy of triage, and better monitoring of unattended patients at a disaster scene. Given the high prevalence of cell phones among the world population, health care providers and managers are expected to take measures to apply this technology to improve patient safety and public health in disasters. At present, there are challenges in the utilization of mHealth in disasters, such as structural and financial issues in our country. In this paper, we discuss the benefits and challenges of mHealth technology in disaster settings, considering connectivity, usability, intelligibility, communication, and teaching, for implementing this technology in disaster response.
Keywords: information technology, mHealth, disaster, effective response
Procedia PDF Downloads 440
1440 Balancing the Need for Closure: A Requirement for Effective Mood Development in Flow
Authors: Cristian Andrei Nica
Abstract:
The state of flow relies on cognitive elements that sustain openness to information processing in order to promote goal attainment. However, the need for closure may create mental constraints that impact affectivity levels. This study aims to observe the extent to which the need for closure moderates the interaction between flow and affectivity, taking into account the mediating role of mood repair motivation in the interaction between need for closure and affectivity. Using a non-experimental, correlational design, n = 73 participants (18 men and 55 women, aged 19-64 years, M = 28.02, SD = 9.22) completed the Positive Affectivity-Negative Affectivity Schedule, the Need for Closure Scale-Revised, the mood repair items, and an adapted version of the Flow State Scale-2, in order to assess the trait aspects of flow. Results show that the need for closure significantly moderates the flow-affectivity process, while the tolerance-of-ambiguity subscale is positively associated with negative affectivity and negatively with positive affectivity. At the same time, mood repair motivation significantly mediates the interaction between need for closure and positive affectivity, whereas the mediation process for negative affectivity is non-significant. The need for closure should therefore be considered when promoting the development of positive emotions. According to this study, flow can trigger positive emotions when the person is willing to engage in mood regulation strategies and approach meaningful experiences with an open mind.
Keywords: flow, mood regulation, mood repair motivation, need for closure, negative affectivity, positive affectivity
Procedia PDF Downloads 122
1439 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning
Authors: Federico Pittino, Thomas Arnold
Abstract:
The shredding of waste materials is a key step in the recycling process towards the circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is also important for increasing the machine's efficiency, thereby reducing operational costs. In this work, a monitoring system was developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine was monitored for one year, and methods for predictive maintenance were developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, which do not require very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components is investigated. This work demonstrates the benefits of a real-time monitoring system for the efficient operation of industrial shredders.
Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning
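As a minimal sketch of the general idea (not the authors' actual method), a wear indicator derived from monitoring data can be trended over time and extrapolated to a maintenance threshold; the indicator, threshold, and all numbers below are hypothetical.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def time_to_threshold(days, indicator, threshold):
    """Extrapolate a linear wear trend to the maintenance threshold."""
    slope, intercept = linear_fit(days, indicator)
    if slope <= 0:
        return None  # no measurable degradation trend
    return (threshold - intercept) / slope

# Hypothetical knife-wear indicator (e.g., normalized power draw) over days:
days = [0, 7, 14, 21, 28]
wear = [1.00, 1.05, 1.11, 1.14, 1.20]
print(round(time_to_threshold(days, wear, threshold=1.5), 1))  # day ~71
```

In practice the trend model would be refit as data arrive, so the predicted maintenance day shifts with the observed operating conditions, which is exactly the trade-off between maintenance cost and wear-induced power consumption that the abstract describes.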
Procedia PDF Downloads 125
1438 Stability of Novel Peptides (Linusorbs) in Flaxseed Meal Fortified Gluten-Free Bread
Authors: Youn Young Shim, Martin J. T. Reaney
Abstract:
Flaxseed meal is rich in water-soluble gums and, as such, can improve texture in gluten-free products. Flaxseed bioactive antioxidant peptides, linusorbs (LOs, a.k.a. cyclolinopeptides), are a class of molecules that may contribute health-promoting effects. The effects of dough preparation, baking, and storage on the stability of flaxseed-derived LOs in doughs and baked products are unknown. Gluten-free (GF) bread dough and bread were prepared with flaxseed meal, and the LO content was determined in the flaxseed meal, the bread flour containing the flaxseed meal, the bread dough, and the bread. The LO contents during storage (0, 1, 2, and 4 weeks) at different temperatures (−18 °C, 4 °C, and 22−23 °C) were determined by high-performance liquid chromatography-diode array detection (HPLC-DAD). The contents of oxidized LOs such as [1–9-NαC],[1(Rs, Ss)-MetO]-linusorb B2 (LO14) were substantially constant in flaxseed meal and in flour produced from flaxseed meal under all conditions for up to 4 weeks. However, during GF bread production, LOs decreased. Due to microbial contamination, dough could not be stored at either 4 or 21 °C, and bread could only be stored for one week at 21 °C. Up to 4 weeks of storage was possible for bread and dough at −18 °C, and for bread at 4 °C, without loss of LOs. The LOs change mostly through processing and less so during storage. The concentrations of reduced LOs in flour and meal were much higher than those measured in dough and bread, without a corresponding increase in oxidized LOs. The LOs in flaxseed meal-fortified bread were stable in products stored at low temperatures. This is the first study of the impact of baking conditions on LO content and quality.
Keywords: flaxseed, stability, gluten-free, antioxidant
Procedia PDF Downloads 88
1437 Pushover Analysis of a Typical Bridge Built in Central Zone of Mexico
Authors: Arturo Galvan, Jatziri Y. Moreno-Martinez, Daniel Arroyo-Montoya, Jose M. Gutierrez-Villalobos
Abstract:
Bridges are among the most seismically vulnerable structures in highway transportation systems. The general process for assessing the seismic vulnerability of a bridge involves evaluating its overall capacity and demand. One of the most common procedures for obtaining this capacity is a pushover analysis of the structure. Typically, bridge capacity is assessed using non-linear static methods or non-linear dynamic analyses. The non-linear dynamic approaches use step-by-step numerical solutions to assess the capacity, at the cost of considerable computing time. In this study, a non-linear static analysis ('pushover analysis') was performed to predict the collapse mechanism of a typical bridge built in the central zone of Mexico (Celaya, Guanajuato). The bridge superstructure consists of three simply supported spans with a total length of 76 m: 22 m for each end span and 32 m for the central span. The deck is 14 m wide and the concrete slab is 18 cm deep. The bridge is supported on frames of five piers with hollow box-shaped sections, each pier 7.05 m in height and 1.20 m in diameter. The numerical model was created using commercial software, considering linear and non-linear elements. In all cases, the piers were represented by frame-type elements with geometrical properties obtained from the structural project and construction drawings of the bridge. The deck was modeled with a mesh of rectangular thin shell (plate bending and stretching) finite elements. A moment-curvature analysis was performed for the pier sections, considering in each pier the effect of confined concrete and its reinforcing steel. In this way, plastic hinges were defined at the base of the piers to carry out the pushover analysis. In addition, time history analyses were performed using 19 accelerograms of real earthquakes registered in Guanajuato.
In this way, the displacements produced in the bridge were determined. Finally, pushover analysis was applied through displacement control of the piers to obtain the overall capacity of the bridge before failure occurs. It was concluded that the lateral deformations of the piers under a critical earthquake in this zone are almost imperceptible, owing to the geometry and reinforcement demanded by current design standards, and that the displacement capacities are excessive by comparison. According to the analysis, the frames built with five piers increase the rigidity in the transverse direction of the bridge. It is therefore proposed to reduce these frames from five piers to three, maintaining the same geometrical characteristics, the same reinforcement in each pier, and the same mechanical properties of the materials (concrete and reinforcing steel). A pushover analysis performed for this configuration led to the conclusion that the bridge would retain adequate seismic behavior, at least for the 19 accelerograms considered in this study. In this way, material, construction, time, and labor costs would be reduced in this case study.
Keywords: collapse mechanism, moment-curvature analysis, overall capacity, pushover analysis
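The pier-capacity idealization behind a displacement-controlled pushover can be sketched with an elastic-perfectly-plastic cantilever model that yields when the base plastic hinge forms; the stiffness and yield-moment values below are hypothetical, not the actual properties of the Celaya bridge piers.

```python
def capacity_curve(E, I, h, M_y, displacements):
    """Base shear vs. tip displacement for a cantilever pier that forms a
    plastic hinge at its base (elastic-perfectly-plastic idealization)."""
    k = 3 * E * I / h**3       # elastic lateral stiffness of a cantilever
    V_y = M_y / h              # base shear when the base moment reaches M_y
    d_y = V_y / k              # yield displacement
    curve = [(d, min(k * d, V_y)) for d in displacements]
    return d_y, curve

# Hypothetical pier: E = 25 GPa, I = 0.10 m^4, h = 7.05 m, M_y = 9.0 MN*m
d_y, curve = capacity_curve(E=25e9, I=0.10, h=7.05, M_y=9.0e6,
                            displacements=[0.0, 0.02, 0.05, 0.10, 0.20])
print(f"yield displacement = {d_y * 1000:.1f} mm")
```

Comparing a demand displacement (from the time history analyses) against `d_y` and the plateau of this curve is the simplest way to see why a demand that is "almost imperceptible" relative to capacity suggests the section is over-designed.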
Procedia PDF Downloads 152
1436 Natural Fibre Composite Structural Sections for Residential Stud Wall Applications
Authors: Mike R. Bambach
Abstract:
Increasing awareness of environmental concerns is driving a move towards more sustainable structural products for the built environment. Natural fibres such as flax, jute, and hemp have recently been considered for fibre-resin composites, a major motivation for their implementation being their notable sustainability attributes. While recent decades have seen substantial interest in the use of such natural fibres in composite materials, much of this research has focused on materials aspects, including fibre processing techniques, composite fabrication methodologies, matrix materials, and their effects on mechanical properties. The present study experimentally investigates the compression strength of structural channel sections of flax, jute, and hemp, with a particular focus on their suitability for residential stud wall applications. The section geometry is optimised for maximum strength via the introduction of complex stiffeners in the webs and flanges. Experimental results for natural fibre composite channel sections and for typical steel and timber residential wall studs are compared. The geometrically optimised natural fibre composite channels are shown to have compression capacities suitable for residential wall studs, identifying them as a potentially viable alternative to traditional building materials in such applications, and potentially in other light structural applications.
Keywords: channel sections, natural fibre composites, residential stud walls, structural composites
Procedia PDF Downloads 314
1435 Preparation and Characterization of Iron/Titanium-Pillared Clays
Authors: Rezala Houria, Valverde Jose Luis, Romero Amaya, Molinari Alessandra, Maldotti Andrea
Abstract:
The escalation of oil prices in 1973 confronted the oil industry with the problem of how to maximize the processing of crude oil, especially the heavy fractions, into gasoline components. Strong impetus was thus given to the development of catalysts with relatively large pore sizes, able to deal with larger molecules than the existing molecular sieves and offering good thermal and hydrothermal stability. The oil embargo of 1973 therefore acted as a stimulus for the investigation and development of pillared clays. Iron-doped titania-pillared montmorillonite clays were prepared using bentonite from deposits at Maghnia in western Algeria. The preparation method consists of different steps: purification of the raw bentonite, preparation of a pillaring agent solution, and exchange of the cations located between the clay layers with the previously formed iron/titanium solution. The material was characterized by X-ray fluorescence spectrometry, X-ray diffraction, textural measurements by the BET method, inductively coupled plasma atomic emission spectroscopy, diffuse reflectance UV-visible spectroscopy, temperature-programmed desorption of ammonia, and atomic absorption. This new material was investigated as a photocatalyst for the selective oxygenation of liquid alkylaromatics such as toluene, para-xylene, and ortho-xylene, and its photocatalytic properties were compared with those of titanium-pillared clays.
Keywords: iron doping, montmorillonite clays, pillared clays, oil industry
Procedia PDF Downloads 302
1434 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model
Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula
Abstract:
In the verification of high-volume, complex packet-processing IPs, finer control of flow management aspects (for example, rate or bits/sec) per flow class (or virtual channel, or software thread) is needed. When arbitration of software/Universal Verification Methodology (UVM) threads is left to the simulator (e.g., the Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), the resulting distribution of bandwidth across threads is hard to predict. In many cases, the patterns desired in a test scenario may not be achieved, as the simulator may produce a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We developed a component, the Flow Manager Verification IP, that intervenes between the application (test case) and the protocol VIP (with its UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs, visible to the UVM sequence/test, to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli that further harness the Design Under Test (DUT) with asymmetric inputs relative to the bandwidth/Quality of Service (QoS) distributions programmed in the Design Under Test.
Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service
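A deterministic arbitration scheme of this kind can be sketched in ordinary code. The weighted-fair arbiter below is a simplification for illustration, not the actual Flow Manager implementation: it grants transaction slots to flows in proportion to configured rate knobs (smallest virtual finish time first), instead of leaving the ordering to simulator thread arbitration.

```python
import heapq

def schedule(flows, total_slots):
    """Deterministic weighted arbitration: grant slots to flows in proportion
    to their configured rates, by always serving the flow with the smallest
    virtual finish time."""
    # heap entries: (virtual_finish_time, flow_name, per_grant_step)
    heap = [(1.0 / rate, name, 1.0 / rate) for name, rate in flows.items()]
    heapq.heapify(heap)
    grants = []
    for _ in range(total_slots):
        t, name, step = heapq.heappop(heap)
        grants.append(name)
        heapq.heappush(heap, (t + step, name, step))
    return grants

# Hypothetical rate knobs: a 50/30/20 bandwidth split across three virtual channels
grants = schedule({"vc0": 0.5, "vc1": 0.3, "vc2": 0.2}, total_slots=100)
print({f: grants.count(f) for f in ("vc0", "vc1", "vc2")})
```

Because the grant order is a pure function of the knob settings, the same asymmetric distribution is reproduced on every simulation run, which is the property the simulator's own thread arbitration cannot guarantee.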
Procedia PDF Downloads 99
1433 Process Optimization for Albanian Crude Oil Characterization
Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici
Abstract:
Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have accurate information on crude oil quality. This includes the crude oil TBP curve, the main data needed for correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized on the basis of a distillation assay. This procedure is reasonably well defined and is based on representing the mixture of actual components that boil within a boiling-point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the fraction of oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the model and the experimental data. Most commercial simulators use different numbers of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of this study is to draw true boiling point curves for different crude oil resources in Albania and to compare the differences between the model and the experimental data for optimal characterization of the crude oil.
Keywords: TBP distillation curves, crude oil, optimization, simulation
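To make the pseudo-component idea concrete, the sketch below cuts a TBP curve into boiling-range intervals and reports each interval's yield and mid-boiling point; the assay points used are hypothetical round numbers, not Albanian crude data.

```python
def interp(x, xs, ys):
    """Linear interpolation of y at x over piecewise-linear points (xs, ys)."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside the TBP curve range")

def pseudo_components(temps, vol_pct, cut_temps):
    """Yield (vol%) and mean boiling point of each boiling-range interval."""
    cuts = []
    for t0, t1 in zip(cut_temps, cut_temps[1:]):
        yield_pct = interp(t1, temps, vol_pct) - interp(t0, temps, vol_pct)
        cuts.append(((t0 + t1) / 2, yield_pct))
    return cuts

# Hypothetical TBP assay points (temperature in degrees C vs cumulative vol% distilled):
temps   = [30, 100, 200, 300, 400]
vol_pct = [0, 15, 40, 65, 85]
for bp, y in pseudo_components(temps, vol_pct, [30, 100, 200, 300, 400]):
    print(f"pseudo-component at {bp:.0f} C: {y:.1f} vol%")
```

Each printed line corresponds to one hypothetical component boiling at the average temperature of its interval, which is exactly how the assay-based representation described above discretizes the TBP curve.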
Procedia PDF Downloads 304
1432 Implementation Association Rule Method in Determining the Layout of Qita Supermarket as a Strategy in the Competitive Retail Industry in Indonesia
Authors: Dwipa Rizki Utama, Hanief Ibrahim
Abstract:
The retail industry in Indonesia is developing very fast, and various strategies have been undertaken to boost customer satisfaction and purchase volume, and thereby profit; one of these is layout strategy. The purpose of this study is to determine the layout of Qita supermarket, a retail business in Indonesia, in order to improve customer satisfaction and maximize the rate of product sales as a whole, so that infrequently purchased products are also sold. This research uses a literature study method together with association rules, a data mining method applied in market basket analysis. After pre-processing, 100 of 160 transaction records were used, from which the distribution across the 26 departments of the previous layout was obtained. From these data, the association rule method reveals which items customers purchase at the same time, so that the layout of the supermarket can be determined from customer behavior. Using the RapidMiner software with a minimum support of 25% and a minimum confidence of 30% showed that department 14 is purchased at the same time as department 10, department 21 with department 13, department 15 with department 12, department 14 with department 12, and department 10 with department 14. From these results, a better supermarket layout than the previous one can be arranged.
Keywords: industry retail, strategy, association rule, supermarket
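The support/confidence computation behind such rules can be sketched directly; the basket data below are made up for illustration and are not the study's transaction records.

```python
from itertools import combinations

def association_rules(transactions, min_support=0.25, min_confidence=0.30):
    """Mine one-to-one rules (A -> B) from department-level baskets,
    keeping rules whose support and confidence clear the thresholds."""
    n = len(transactions)
    items = sorted({d for t in transactions for d in t})
    rules = []
    for a, b in combinations(items, 2):
        for ant, cons in ((a, b), (b, a)):
            pair_count = sum(1 for t in transactions if ant in t and cons in t)
            ant_count = sum(1 for t in transactions if ant in t)
            support = pair_count / n
            confidence = pair_count / ant_count if ant_count else 0.0
            if support >= min_support and confidence >= min_confidence:
                rules.append((ant, cons, support, confidence))
    return rules

# Hypothetical baskets recorded as sets of department numbers:
baskets = [{10, 14}, {10, 14, 21}, {13, 21}, {12, 15}, {12, 14}, {10, 13}]
for ant, cons, s, c in association_rules(baskets):
    print(f"dept {ant} -> dept {cons}: support={s:.2f}, confidence={c:.2f}")
```

With these toy baskets, only departments 10 and 14 co-occur often enough to pass a 25% support threshold, which mirrors how the study's RapidMiner run filters the 26-department data down to a handful of layout-relevant pairs.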
Procedia PDF Downloads 189
1431 Design of a Photovoltaic Power Generation System Based on Artificial Intelligence and Internet of Things
Authors: Wei Hu, Wenguang Chen, Chong Dong
Abstract:
In order to improve the efficiency and safety of photovoltaic power generation devices, this photovoltaic power generation system combines Artificial Intelligence (AI) and the Internet of Things (IoT) to control a sun-tracking photovoltaic device, improving generation efficiency, and to manage the converted energy. The system uses an artificial intelligence control terminal; the power generation device's executive end runs Linux on an Exynos4412 CPU. Each power generation device collects solar image information through a Sony CCD. The devices feed their data back to their CPUs for processing, and the CPUs send the data to the artificial intelligence control terminal through the Internet. The control terminal integrates the executive-end information, time information, and environmental information to decide whether to generate electricity normally and, if so, whether to feed the converted electrical energy into the grid or store it in the battery pack. When the power generation environment is abnormal, the control terminal activates the protection strategy: the executive end stops power generation and enters a self-protection posture, and at the same time the control terminal synchronizes the data with the cloud. As a result, the system is more intelligent and more adaptive, and has a longer service life.
Keywords: photovoltaic power generation, sun tracking, artificial intelligence, internet of things, photovoltaic array, power management
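The control terminal's decision step can be illustrated as a small dispatch rule; the rule ordering, state-of-charge threshold, and function name below are invented for illustration and are not taken from the paper.

```python
def dispatch(env_ok: bool, battery_soc: float, grid_available: bool) -> str:
    """Toy energy-management decision: protect on abnormal environment,
    otherwise store energy until the battery pack is nearly full, then
    feed the grid. Threshold of 0.9 SOC is a hypothetical choice."""
    if not env_ok:
        return "self-protect"   # stop generation and sync data to the cloud
    if battery_soc < 0.9:
        return "store"          # charge the battery pack first
    return "grid" if grid_available else "store"

print(dispatch(env_ok=True, battery_soc=0.5, grid_available=True))  # store
```

The real system would derive `env_ok` from the integrated time and environmental information described above, rather than receiving it as a boolean.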
Procedia PDF Downloads 123
1430 Information and Cooperativity in Fiction: The Pragmatics of David Baboulene’s “Knowledge Gaps”
Authors: Cara DiGirolamo
Abstract:
In his 2017 Ph.D. thesis, script doctor David Baboulene presented a theory of fiction in which differences in knowledge states between the participants in a literary experience, including reader, author, and characters, create many story elements, among them suspense, expectations, subtext, theme, metaphor, and allegory. This theory can be adjusted and modeled by incorporating a formal pragmatic approach that understands narrative as a speech act with a conversational function. This approach requires both the Speaker and the Listener to be understood as participants in the discourse. It also uses theories of cooperativity and the Question Under Discussion (QUD) to identify the existence of implicit questions. The approach predicts what an effective literary narrative must do: provide a conversational context early in the story so the reader can engage with the text as a conversational participant. In addition, this model incorporates schema theory, a cognitive model for learning and processing information about the world and transforming it into functional knowledge. Using this approach can extend the QUD model: instead of describing conversation as a form of information gathering restricted to question-answer sets, the QUD can include knowledge modeling and understanding as possible outcomes of a conversation. With this model, Baboulene’s “Knowledge Gaps” can provide real insight into storytelling as a conversational move, and extend the QUD to apply simply and effectively to a more diverse set of conversational interactions, as well as to narrative texts.
Keywords: literature, speech acts, QUD, literary theory
Procedia PDF Downloads 12
1429 A Transformer-Based Question Answering Framework for Software Contract Risk Assessment
Authors: Qisheng Hu, Jianglei Han, Yue Yang, My Hoa Ha
Abstract:
When a company is considering purchasing software for commercial use, contract risk assessment is critical for identifying risks and mitigating potential adverse business impact, e.g., security, financial, and regulatory risks. Contract risk assessment requires reviewers with specialized knowledge and time to evaluate the legal documents manually. Specifically, validating contracts for a software vendor requires the following steps: manual screening, interpreting legal documents, and extracting risk-prone segments. To automate the process, we propose a framework that assists risk identification in legal contract documents, leveraging pre-trained deep learning models and natural language processing techniques. Given a set of pre-defined risk evaluation problems, our framework uses pre-trained transformer-based question-answering models to identify risk-prone sections in a contract. The question-answering model encodes the concatenated question-contract text and predicts the start and end positions for clause extraction. Because of the limited labelled dataset available for training, we leveraged transfer learning, fine-tuning the models on the CUAD dataset to enhance performance. On a dataset comprising 287 contract documents and 2000 labelled samples, our best model achieved an F1 score of 0.687.
Keywords: contract risk assessment, NLP, transfer learning, question answering
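The start/end prediction step can be illustrated with a small decoding routine of the kind typically used for extractive QA heads; the logits below are toy values, not real model output, and the decoding is a generic sketch rather than the authors' exact procedure.

```python
import math

def best_span(start_logits, end_logits, max_answer_len=8):
    """Pick the (start, end) token pair maximizing start_logit + end_logit,
    subject to start <= end < start + max_answer_len."""
    best = (0, 0, -math.inf)
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best[2]:
                best = (s, e, score)
    return best[0], best[1]

# Toy logits over 6 contract tokens (illustrative only):
start_logits = [0.1, 2.3, 0.2, 0.1, 1.5, 0.0]
end_logits   = [0.0, 0.2, 1.9, 2.4, 0.1, 0.3]
print(best_span(start_logits, end_logits))  # (1, 3): tokens 1..3 form the clause
```

In a full pipeline these logits would come from the fine-tuned transformer applied to the concatenated question-contract text, and the selected token span would be mapped back to the risk-prone clause in the original document.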
Procedia PDF Downloads 129