Search results for: real excess portfolio returns
1800 Testing of Protective Coatings on Automotive Steel, a Correlation Between Salt Spray, Electrochemical Impedance Spectroscopy, and Linear Polarization Resistance Test
Authors: Dhanashree Aole, V. Hariharan, Swati Surushe
Abstract:
Corrosion can cause serious and expensive damage to automobile components. Various proven techniques for controlling and preventing corrosion depend on the specific material to be protected. Electrochemical Impedance Spectroscopy (EIS) and salt spray tests are commonly used to assess the corrosion degradation mechanism of coatings on metallic surfaces, while Linear Polarisation Resistance (LPR) is the only test that monitors the corrosion rate in real time. In this study, electrochemical tests (EIS and LPR) and the salt spray test are reviewed to assess the corrosion resistance and durability of different coatings. The main objective of this study is to correlate the test results obtained using LPR and EIS with the results obtained using the standard salt spray test. Another objective of this work is to evaluate the performance of various coating systems, namely CED, epoxy, powder coating, Autophoretic, and Zn-trivalent coating, for vehicle underbody application, and to assess their corrosion resistance. From this study, a promising correlation between the different corrosion testing techniques is noted. The most profound observation is that electrochemical tests give a quick estimation of corrosion resistance and can detect the degradation of coatings well before visible signs of damage appear. Furthermore, the corrosion resistance and salt spray life of the coatings investigated were found to follow the order CED > powder coating > Autophoretic > epoxy coating > Zn-trivalent plating.
Keywords: Linear Polarization Resistance (LPR), Electrochemical Impedance Spectroscopy (EIS), salt spray test, sacrificial and barrier coatings
Procedia PDF Downloads 527
1799 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
Procedia PDF Downloads 284
1798 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement
Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini
Abstract:
Investigating possible capacities of visual functions, where adapted mechanisms can enhance the capability of sports trainees, is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players were processed in a pilot study. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while being shown a simulated volleyball field. The proposed method is based on a set of time-frequency features, extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, that can be indicative of a subject being amateur or trained. The linear discriminant classifier achieves 100% accuracy, sensitivity, and specificity when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players' performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis
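As a rough illustration of the classification step, a two-class Fisher linear discriminant can be written out directly. This is a generic sketch with invented 2-D feature vectors, not the paper's actual VEP features or data:

```python
def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def scatter(vs, m):
    # 2x2 within-class scatter: sum of (x - m)(x - m)^T
    S = [[0.0, 0.0], [0.0, 0.0]]
    for v in vs:
        d = [v[0] - m[0], v[1] - m[1]]
        for i in range(2):
            for j in range(2):
                S[i][j] += d[i] * d[j]
    return S

def fisher_lda(class0, class1):
    """Fit w = Sw^-1 (m1 - m0) and a midpoint threshold for 2-D features."""
    m0, m1 = mean(class0), mean(class1)
    S0, S1 = scatter(class0, m0), scatter(class1, m1)
    Sw = [[S0[i][j] + S1[i][j] for j in range(2)] for i in range(2)]
    det = Sw[0][0] * Sw[1][1] - Sw[0][1] * Sw[1][0]
    inv = [[Sw[1][1] / det, -Sw[0][1] / det],
           [-Sw[1][0] / det, Sw[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    # threshold at the midpoint of the projected class means
    t = 0.5 * (w[0] * (m0[0] + m1[0]) + w[1] * (m0[1] + m1[1]))
    return w, t

def predict(w, t, x):
    return 1 if w[0] * x[0] + w[1] * x[1] > t else 0
```

In practice one would use a library implementation (e.g. scikit-learn's `LinearDiscriminantAnalysis`) on the averaged time-frequency feature vectors.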
Procedia PDF Downloads 139
1797 Key Performance Indicators of Cold Supply Chain Practices in Agriculture Sector: Empirical Study on the Egyptian Export Companies
Authors: Ahmed Barakat, Nourhan Ahmed Saad, Mahmoud Hammad
Abstract:
Tracking and monitoring agricultural products, cold chain activities, and transportation in real time can effectively ensure both the quality and safety of agricultural products and reduce overall logistics costs. Effective supply chain practices are one of the main requirements for enhancing agricultural business in Egypt. The cold chain is among the best practices for the storage and transportation of perishable goods and has potential within the agricultural sector in Egypt. This practice can reduce food wastage and increase profitability by lowering costs. Even though it poses several implementation challenges for the farmers, traders, and other people involved in the supply chain, it offers benefits for all parties and supports the export of goods, contributing to Egypt's economic progress. The aim of this paper is to explore cold supply chain practices for the agriculture sector in Egypt in order to enhance the export performance of fresh goods. In this context, the study explores those aspects of cold supply chain practices that can enhance the functioning of the agriculture sector in Egypt from the perspective of export companies (traders) and farmers. Based on empirical results obtained from data collected from farmers and traders, the study argues that there is a significant association between cold supply chain practices and enhancement of the agriculture value chain. The paper closes with final conclusions, the study's contribution, its limitations, and the scope for future research.
Keywords: agriculture sector, cold chain management, export companies, non-traded goods, supply chain management
Procedia PDF Downloads 162
1796 Sustainability Assessment of Food Delivery with Last-Mile Delivery Droids, A Case Study at the European Commission's JRC Ispra Site
Authors: Ada Garus
Abstract:
This paper presents the outcomes of a sustainability assessment of food delivery with a last-mile delivery service introduced in a real-world case study. The methodology used in the sustainability assessment integrates multi-criteria decision-making analysis, the sustainability pillars, and scenario analysis to best reflect the conflicting needs of the stakeholders involved in the last-mile delivery system. The case study applies the framework to the food delivery system of the Joint Research Centre of the European Commission, where three alternative solutions were analyzed: (I) the existing state, in which individuals frequent the local canteen or pick up their food using their preferred mode of transport; (II) a hypothetical scenario in which individuals can only order their food using the delivery droid system; and (III) a scenario in which the droid-based food delivery system is introduced as a supplement to the current system. The environmental indices are calculated in a simulation study in which the food delivery decision is predicted using a multinomial logit model. A vehicle dynamics model is used to predict the fuel consumption of the combustion-engine vehicles used by the canteen goers and the electricity consumption of the droid. The sustainability assessment allows for the evaluation of the economic, environmental, and social aspects of food delivery, making it an apt input for policymakers. Moreover, the assessment is one of the first studies to investigate automated delivery droids, which could become a frequent addition to the urban landscape in the near future.
Keywords: innovations in transportation technologies, behavioural change and mobility, urban freight logistics, innovative transportation systems
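The multinomial logit choice model used in such simulations assigns each alternative a probability proportional to the exponential of its systematic utility. A minimal sketch follows; the option names and utility values are invented placeholders, not the study's estimated coefficients:

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities.values())  # subtract the max for numerical stability
    expv = {k: math.exp(v - m) for k, v in utilities.items()}
    z = sum(expv.values())
    return {k: v / z for k, v in expv.items()}

# Hypothetical systematic utilities for three lunch options
V = {"canteen_walk": -0.2, "pickup_car": -0.8, "droid_delivery": -0.5}
P = mnl_probabilities(V)
```

In the simulation, each individual's choice would be drawn from these probabilities, and the chosen mode then feeds the fuel or electricity consumption model.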
Procedia PDF Downloads 194
1795 Color Image Compression/Encryption/Contour Extraction using 3L-DWT and SSPCE Method
Authors: Ali A. Ukasha, Majdi F. Elbireki, Mohammad F. Abdullah
Abstract:
Data security is needed in data transmission, storage, and communication. This paper is divided into two parts. The work deals with color images, which are decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red-channel pixels as an image scrambling process. All three channels are then encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. X-OR and modulo operations are performed between the encrypted channel images in order to change the pixel values. The contours extracted from the recovered color images can be obtained with an accepted level of distortion using the single step parallel contour extraction (SSPCE) method. Experiments have demonstrated that the proposed algorithm can fully encrypt 2D color images and reconstruct them completely without any distortion. It is also shown that the algorithm offers very strong security against attacks such as salt-and-pepper noise and JPEG compression, proving that color images can be protected at a high security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
Keywords: SSPCE method, image compression and salt and peppers attacks, bitplanes decomposition, Arnold transform, color image, wavelet transform, lossless image encryption
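The scrambling and per-pixel encryption steps can be sketched generically. This is a minimal illustration of the Arnold cat map and an X-OR key image on list-of-lists channels, not the authors' exact implementation:

```python
def arnold_cat(channel):
    """One iteration of the Arnold cat map on an N x N channel:
    pixel at (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    n = len(channel)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = channel[x][y]
    return out

def xor_encrypt(channel, key):
    """Per-pixel X-OR with a key image of the same size (its own inverse)."""
    return [[p ^ k for p, k in zip(prow, krow)]
            for prow, krow in zip(channel, key)]
```

The cat map is a bijection on the pixel grid (so scrambling loses no information and is invertible), and applying `xor_encrypt` twice with the same key recovers the original channel.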
Procedia PDF Downloads 520
1794 Pavement Quality Evaluation Using Intelligent Compaction Technology: Overview of Some Case Studies in Oklahoma
Authors: Sagar Ghos, Andrew E. Elaryan, Syed Ashik Ali, Musharraf Zaman, Mohammed Ashiqur Rahman
Abstract:
Achieving the desired density during construction is an important indicator of pavement quality. Insufficient compaction often compromises pavement performance and service life. Intelligent compaction (IC) is an emerging technology for monitoring compaction quality during the construction of asphalt pavements. This paper provides an overview of findings from four case studies in Oklahoma involving the compaction quality of asphalt pavements: the SE 44th St project (Project 1), the EOC Turnpike project (Project 2), the Highway 92 project (Project 3), and the 108th Avenue project (Project 4). For this purpose, an IC technology, the intelligent compaction analyzer (ICA) developed at the University of Oklahoma, was used to evaluate compaction quality. The collected data include GPS locations, roller vibrations, roller speed, direction of movement, and temperature of the asphalt mat, and were analyzed using the widely used software VETA. The average densities for Projects 1, 2, 3, and 4 were found to be 89.8%, 91.5%, 90.7%, and 87.5%, respectively; the maximum densities were 94.6%, 95.8%, 95.9%, and 89.7%, respectively. The ICA-estimated densities correlated well with the field core densities. The ICA results indicated that at least 90% of the asphalt mats were subjected to at least two roller passes. However, the number of passes required to achieve the desired density (94% to 97%) differed from project to project depending on the underlying layer. The results of these case studies show both opportunities and challenges in using IC for monitoring compaction quality in real time during construction.
Keywords: asphalt pavement construction, density, intelligent compaction, intelligent compaction analyzer, intelligent compaction measure value
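Summary statistics like those reported above (average and maximum density, fraction of the mat with at least two roller passes, fraction meeting the target density) can be computed from gridded IC data. A minimal sketch, assuming a hypothetical list-of-cells representation where each cell carries a (pass count, estimated density %) pair:

```python
def coverage_stats(cells, min_passes=2, target=94.0):
    """Summarize gridded IC data: each cell is (pass_count, density_pct)."""
    n = len(cells)
    covered = sum(1 for passes, _ in cells if passes >= min_passes)
    densities = [d for _, d in cells]
    return {
        "coverage_pct": 100.0 * covered / n,       # share with >= min_passes
        "mean_density": sum(densities) / n,
        "max_density": max(densities),
        "met_target_pct": 100.0 * sum(1 for d in densities if d >= target) / n,
    }
```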
Procedia PDF Downloads 158
1793 Air Quality Assessment for a Hot-Spot Station by Neural Network Modelling of the near-Traffic Emission-Immission Interaction
Authors: Tim Steinhaus, Christian Beidl
Abstract:
Urban air quality and climate protection are two major challenges for future mobility systems. Despite the steady reduction of pollutant emissions from vehicles over past decades, local immission load within cities still partially reaches levels considered hazardous to human health. Although traffic-related emissions account for a major part of the overall urban pollution, modeling the exact interaction remains challenging. In this paper, a novel approach for determining the emission-immission interaction, based on neural network modeling of the traffic-induced NO2 immission load in a near-traffic hot-spot scenario, is presented. In a detailed sensitivity analysis, the significance of the relevant influencing variables on the prevailing NO2 concentration is first analyzed. Based on this, the generation process of the model is described, in which not only environmental influences but also the vehicle fleet composition, including its associated segment- and certification-specific real driving emission factors, are derived and used as input quantities. The validity of this approach, which has been presented in the past, is re-examined in this paper using updated data on vehicle emissions and recent immission measurements. In a final scenario analysis, the future development of the immission load is forecast for different developments of the vehicle fleet composition. It is shown that immission levels of less than half of today's yearly average limit values are technically feasible in hot-spot situations.
Keywords: air quality, emission, emission-immission-interaction, immission, NO2, zero impact
Procedia PDF Downloads 127
1792 Adaptable Buildings for More Sustainable Housing: Energy Life Cycle Analysis
Authors: Rafael Santos Fischer, Aloísio Leoni Schmid, Amanda Dalla-Bonna
Abstract:
Life cycle analysis and energy life cycle analysis are useful design support tools when sustainability becomes imperative. The final phase of a building's life cycle is probably the least known, the one on which the least knowledge is available. In the Brazilian building industry, the lifespan of a building design is rarely treated as a definite design parameter. There is rather a common-sense attitude of taking any building demand as permanent and taking for granted that building solutions are durable and solid. Housing, a permanent issue in any society, presents a real challenge to the choice of a design lifespan. In Brazilian history, the native solutions of collective, non-durable houses built by several nomadic tribes contrasted with the stone and masonry buildings introduced by the sedentary Portuguese conquerors. Durable buildings are commonly associated with welfare. However, social dynamics has made the traditional family of two parents and children just one of several possible arrangements. In addition, a more liberal attitude towards the family is leading to an increase in the number of people living in alternative arrangements. Japan is an example of a country where houses have been made intentionally ephemeral since the mid-20th century. The present article presents the development of a flexible housing design solution on the basis of the Design Science Research approach. A comparison in terms of energy life cycle shows how flexibility and dematerialization may point to a feasible future for housing policies in Brazil.
Keywords: adaptability, adaptable building, embodied energy, life cycle analysis, social housing
Procedia PDF Downloads 589
1791 The Effect of Fetal Movement Counting on Maternal Antenatal Attachment
Authors: Esra Güney, Tuba Uçar
Abstract:
Aim: This study was conducted to determine the effects of fetal movement counting on antenatal maternal attachment. Material and Method: The research was conducted as a true experiment with pre-test/post-test control groups. The study population consists of pregnant women registered at six Family Health Centers located in the central Malatya districts of Yeşilyurt and Battalgazi. Power analysis indicated a sample size of at least 55 pregnant women per group (55 experimental, 55 control). The data were collected using a Personal Information Form and the Maternal Antenatal Attachment Scale (MAAS) between July 2015 and June 2016. After the pre-test data collection, fetal movement counting training was given by the researchers to the pregnant women in the experimental group; no intervention was applied to the control group. Post-test data for both groups were collected after four weeks. Data were evaluated with percentages, arithmetic means, the chi-square test, and dependent and independent groups t-tests. Result: The pre-test mean total MAAS score was 70.78±6.78 in the experimental group and 71.58±7.54 in the control group, with no significant difference between the two groups (p>0.05). The post-test mean total MAAS score was 78.41±6.65 in the experimental group and 72.25±7.16 in the control group, a statistically significant difference (p<0.05). Conclusion: Fetal movement counting was found to increase maternal antenatal attachment.
Keywords: antenatal maternal attachment, fetal movement counting, pregnancy, midwifery
Procedia PDF Downloads 273
1790 Society and Cinema in Iran
Authors: Seyedeh Rozhano Azimi Hashemi
Abstract:
There is no doubt that art is a social phenomenon, and cinema is the most social kind of art. Hence, the relations between cinema and society can be analyzed from different aspects. This paper investigates the sociology of cinema, a subdivision of the sociology of art, through two main approaches. The first focuses on the effects of cinema on society and is known as "effects theory"; the second, which deals with the reflection of social issues in cinema, is called "reflection theory". The reflection-theory approach, unlike effects theory, considers movies as documents in which social life is reflected, and by analyzing them the changes and tendencies of a society are understood. Criticizing these approaches to cinema and society does not mean that they are not real. On the contrary, it shows that a better understanding of the relation between cinema and society requires more complex models, which should consider two aspects. First, they should be bilinear, providing a dynamic and active relation between cinema and society: social life and cinema affect each other, fitting into a dialectic and dynamic process. Second, they should pay attention to the role of mediating elements such as small social institutions, marketing, advertising, cultural patterns, art genres, and popular cinema in society. The current study analyzes the image of the middle class in Iranian cinema and the changing role of women in cinema and society, two prominent issues that cinema and society have faced from the 1979 revolution through the 1980s. Films as artworks are, on one hand, reflections of social changes and, on the other hand, through their effects on society, attempts to speed up the trends of those changes.
By illustrating changes in ideologies and approaches, sometimes in exaggerated ways, and through its normalizing function, cinema prepares audiences and public opinion for the acceptance of these changes. The audience, in turn, is affected by this bilinear, interactive process.
Keywords: Iranian Cinema, Cinema and Society, Middle Class, Woman's Role
Procedia PDF Downloads 342
1789 Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule
Authors: Ming-Jong Yao, Chin-Sum Shui, Chih-Han Wang
Abstract:
This paper is developed from a real-world decision scenario in which an industrial gas company applies the Vendor Managed Inventory model and supplies liquid oxygen, with a self-operated heterogeneous vehicle fleet, to hospitals in nearby cities. We name the problem the Joint Replenishment and Heterogeneous Vehicle Routing Problem with Cyclical Schedule and formulate it as a non-linear mixed-integer programming problem that simultaneously determines the length of the planning cycle (PC), the length of the replenishment cycle, the dates of replenishment for each customer, and the vehicle routes of each day within the PC, such that the average daily operating cost within the PC, including inventory holding cost, setup cost, transportation cost, and overtime labor cost, is minimized. A solution method based on a genetic algorithm, embedded with an encoding and decoding mechanism and local search operators, is then proposed, and a hash function is adopted to avoid repetitive fitness evaluation of identical solutions. Numerical experiments demonstrate that the proposed solution method can effectively solve the problem under different lengths of PC and numbers of customers. The method is also shown to be effective in determining whether the company should expand the storage capacity of a customer whose demand increases. A sensitivity analysis of the vehicle fleet composition shows that deploying a mixed fleet can reduce the daily operating cost.
Keywords: cyclic inventory routing problem, joint replenishment, heterogeneous vehicle, genetic algorithm
Procedia PDF Downloads 88
1788 Emulsified Oil Removal in Produced Water by Graphite-Based Adsorbents Using Adsorption Coupled with Electrochemical Regeneration
Authors: Zohreh Fallah, Edward P. L. Roberts
Abstract:
One of the big challenges in produced water treatment is removing oil in the form of emulsified droplets, which are not easily separated. An attractive approach is adsorption, as it is a simple and effective process. However, the adsorbent must be regenerated to make the process cost-effective. Several sorbents have been tested for treating oily wastewater, but issues such as the high energy consumption of thermal regeneration of activated carbon have been reported. Due to their significant electrical conductivity, graphite intercalation compounds (GICs) were found to be suitable for electrochemical regeneration. They are non-porous materials with low surface area and fast adsorption kinetics, useful for the removal of low concentrations of organics. An innovative adsorption/regeneration process has been developed at the University of Manchester in which organics are adsorbed by a patented GIC adsorbent with subsequent electrochemical regeneration. The oxidation of the adsorbed organics enables 100% regeneration, so that the adsorbent can be reused over multiple adsorption cycles. GIC adsorbents are capable of removing a wide range of organics and pollutants; however, no comparable report is available on the removal of emulsified oil from produced water using the above-mentioned process. In this study, the performance of this technology for the removal of emulsified oil from wastewater was evaluated. Batch experiments were carried out to determine the adsorption kinetics and equilibrium isotherms for both real produced water and model emulsions. The amount of oil in the wastewater was measured using toluene extraction/fluorescence analysis before and after the adsorption and electrochemical regeneration cycles. It was found that oil-in-water emulsions could be successfully treated by this process, with more than 70% of the oil removed.
Keywords: adsorption, electrochemical regeneration, emulsified oil, produced water
Procedia PDF Downloads 582
1787 Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition
Authors: Aisultan Shoiynbek, Darkhan Kuanyshbay, Paulo Menezes, Akbayan Bekarystankyzy, Assylbek Mukhametzhanov, Temirlan Shoiynbek
Abstract:
Speech emotion recognition (SER) has received increasing research interest in recent years. It is common practice to utilize emotional speech collected under controlled conditions, recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues with that approach: the emotions are not natural, meaning that machines learn to recognize fake emotions; the emotions are limited in quantity and poor in speaking variety; there is some language dependency in SER; and consequently, each time researchers want to start working on SER, they need to find a good emotional database in their language. This paper proposes an approach for creating an automatic tool for speech emotion extraction based on facial emotion recognition and describes the sequence of actions involved. One of the first objectives in this sequence is the speech detection issue. The paper provides a detailed description of a speech detection model based on a fully connected deep neural network for Kazakh and Russian. Despite the high speech detection results for Kazakh and Russian, the described process is suitable for any language. To investigate the working capacity of the developed model, an analysis of speech detection and extraction on real tasks has been performed.
Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset
Procedia PDF Downloads 27
1786 The Effect Analysis of Monetary Instruments through Islamic Banking Financing Channel toward Economic Growth in Indonesia, Period January 2008-December 2015
Authors: Sobar M. Johari, Ida Putri Anjarsari
Abstract:
In the transmission of monetary instruments to the real sector of the economy, Bank Indonesia, as the monetary authority, has developed the Islamic Bank Indonesia Certificate (abbreviated as SBIS) as an instrument of Islamic open market operations. One of the monetary transmission channels is the financing channel, in which the funds are used as a source of bank financing. This study aims to analyze the impact of Islamic monetary instruments on output, or economic growth. The data used in this research are taken from Bank Indonesia and the Central Board of Statistics for the period January 2008 to December 2015. The study employs the Granger causality test, a Vector Error Correction Model (VECM), Impulse Response Functions (IRF), and Forecast Error Variance Decomposition (FEVD) as its analytical methods. The results show, first, that the transmission mechanism of the bank financing channel is not linked to output. Second, the VECM estimates show that SBIS, PUAS, and FIN have a significant long-term impact on output; when there is a monetary shock, output, or economic growth, can recover and stabilize in the short term. The FEVD results show that Islamic bank financing contributes 1.33 percent to economic growth.
Keywords: Islamic monetary instrument, Islamic banking financing channel, economic growth, Vector Error Correction Model (VECM)
Procedia PDF Downloads 283
1785 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes
Authors: Ritwik Dutta, Marylin Wolf
Abstract:
This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on the GitHub repository system, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver
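The daily-snapshot logic of such a back-end can be illustrated without a live Mongo instance. The following is a minimal stand-in that compares the latest metric readings against a user's target goals, using plain dicts in place of PyMongo query results; all field names here are assumptions, not the project's actual schema:

```python
def daily_snapshot(profile, readings):
    """Format today's metrics against the user's target goals.

    `profile["goals"]` maps metric name -> target value;
    `readings` maps metric name -> latest recorded value.
    """
    lines = [f"Daily snapshot for {profile['name']}"]
    for metric, goal in profile["goals"].items():
        value = readings.get(metric)
        if value is None:
            lines.append(f"  {metric}: no data (goal {goal})")
        else:
            status = "met" if value >= goal else "below goal"
            lines.append(f"  {metric}: {value} / {goal} ({status})")
    return "\n".join(lines)
```

With PyMongo, `profile` and the latest `readings` would instead come from `collection.find_one(...)` queries; the formatting step is unchanged.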
Procedia PDF Downloads 391
1784 A Programming Assessment Software Artefact Enhanced with the Help of Learners
Authors: Romeo A. Botes, Imelda Smit
Abstract:
The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback; at the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment of programming subjects lacks meaningful real-life testing, while feedback lacks promptness, consistency, comprehensiveness, and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact used to assist the lecturer in the assessment of, and feedback on, practical programming activities in a senior database programming class. The artefact was developed over three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles to implementing this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner, allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.
Keywords: programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method
Procedia PDF Downloads 296
1783 Design of New Sustainable Pavement Concrete: An Experimental Road
Authors: Manuel Rosales, Francisco Agrela, Julia Rosales
Abstract:
The development of concrete pavements that include recycled waste with active and predictive safety features is a possible approach to mitigating the harmful impacts of the construction industry, such as CO2 emissions and the consumption of energy and natural resources during the construction and maintenance of road infrastructure. This study establishes the basis for formulating new smart materials for concrete pavements and carrying out the in-situ implementation of an experimental road section. To this end, a comprehensive recycled pavement solution is developed that combines an eco-hybrid cement made with 25% mixed recycled aggregate powder (pMRA) and biomass bottom ash powder (pBBA), and a 30% substitution of natural aggregate by MRA and BBA. The work is grouped into three lines: 1) construction materials with high rates of recycled material use, 2) production processes with efficient consumption of natural resources and use of cleaner energies, and 3) implementation and monitoring of a road section with sustainable concrete made from waste. The objective of this study is to ensure satisfactory rheology, mechanical strength, durability, and CO2 capture of pavement concrete manufactured from waste, its subsequent application in a real road section, and its monitoring to establish the optimal range of recycled material. The concretes developed during this study are aimed at the reuse of waste, promoting the circular economy. For this purpose, and after carrying out different tests in the laboratory, three mixtures were established to be applied on the experimental road. Keywords: biomass bottom ash, construction and demolition waste, recycled concrete pavements, full-scale experimental road, monitoring
Procedia PDF Downloads 68
1782 Influence of Low and Extreme Heat Fluxes on Thermal Degradation of Carbon Fibre-Reinforced Polymers
Authors: Johannes Bibinger, Sebastian Eibl, Hans-Joachim Gudladt
Abstract:
This study considers the influence of different irradiation scenarios on the thermal degradation of carbon fiber-reinforced polymers (CFRP). Real threats are simulated, such as fires with long-lasting low heat fluxes and nuclear heat flashes with short-lasting high heat fluxes. For this purpose, coated and uncoated quasi-isotropic samples of the commercially available CFRP HexPly® 8552/IM7 are thermally irradiated from one side by a cone calorimeter and a xenon short-arc lamp with heat fluxes between 5 and 175 W/cm² over varying time intervals. The specimen temperature is recorded on the front and back sides as well as at different laminate depths. The CFRP is examined non-destructively with ultrasonic testing, infrared spectroscopy (ATR-FTIR), scanning electron microscopy (SEM), and micro-focus X-ray computed tomography (μCT). Destructive tests are performed to evaluate the mechanical properties in terms of interlaminar shear strength (ILSS), compressive strength, and tensile strength. The irradiation scenarios vary significantly in heat flux and exposure time; thus, different heating rates, radiation effects, and temperature distributions occur. This leads to unequal decomposition processes, which affect the sensitivity of each strength type and the damage behaviour of the specimens. However, with the use of surface coatings, the thermal degradation of composite materials can be delayed. Keywords: CFRP, one-sided thermal damage, high heat flux, heating rate, non-destructive and destructive testing
Procedia PDF Downloads 113
1781 Spatial Rank-Based High-Dimensional Monitoring through Random Projection
Authors: Chen Zhang, Nan Chen
Abstract:
High-dimensional process monitoring is becoming increasingly important in many application domains, where the process distribution is usually unknown, much more complicated than the normal distribution, and the between-stream correlation cannot be neglected. However, since the process dimension is generally much larger than the reference sample size, most traditional nonparametric multivariate control charts fail in high-dimensional cases due to the curse of dimensionality. Furthermore, when the process goes out of control, the influenced variables are quite sparse compared with the whole dimension, which increases the detection difficulty. To address these issues, this paper proposes a new nonparametric monitoring scheme for high-dimensional processes. The scheme first projects the high-dimensional process into several subprocesses using random projections for dimension reduction. Then, for every subprocess, whose dimension is much smaller than the reference sample size, a local nonparametric control chart is constructed based on the spatial rank test to detect changes in that subprocess. Finally, the results of all the local charts are fused together for a decision. Furthermore, after an out-of-control (OC) alarm is triggered, a diagnostic framework based on the square-root LASSO is proposed. Numerical studies demonstrate that the chart has satisfactory detection power for sparse OC changes and robust performance for non-normally distributed data, and that the diagnostic framework is effective in identifying the truly changed variables. Finally, a real-data example is presented to demonstrate the application of the proposed method. Keywords: random projection, high-dimensional process control, spatial rank, sequential change detection
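The two core ingredients of the scheme, random projection for dimension reduction and a spatial-rank statistic against a reference sample, can be sketched as below. This is a simplified illustration (one projection, one statistic), not the authors' exact chart; dimensions, the shift size, and the statistic's form are illustrative assumptions.

```python
import math
import random

random.seed(0)
p, k = 50, 5   # original dimension, projected dimension (toy sizes)

# Gaussian random projection matrix (the dimension-reduction step)
R = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(p)] for _ in range(k)]

def project(x):
    return [sum(r[j] * x[j] for j in range(p)) for r in R]

def spatial_rank_norm(z, ref):
    """Length of the mean spatial sign of (z - y) over the reference
    sample: near 0 for central points, near 1 for outlying ones."""
    acc = [0.0] * len(z)
    for y in ref:
        d = [z[j] - y[j] for j in range(len(z))]
        n = math.sqrt(sum(c * c for c in d)) or 1.0
        for j in range(len(z)):
            acc[j] += d[j] / n
    acc = [a / len(ref) for a in acc]
    return math.sqrt(sum(a * a for a in acc))

# In-control reference sample, one in-control point, one sparse OC point
reference = [[random.gauss(0, 1) for _ in range(p)] for _ in range(100)]
in_control = [random.gauss(0, 1) for _ in range(p)]
out_of_control = in_control[:]
for j in range(3):                  # sparse mean shift in 3 of 50 streams
    out_of_control[j] += 4.0

proj_ref = [project(x) for x in reference]
stat_ic = spatial_rank_norm(project(in_control), proj_ref)
stat_oc = spatial_rank_norm(project(out_of_control), proj_ref)
```

Even though only 3 of the 50 variables shift, the projected spatial-rank statistic of the out-of-control point is markedly larger than that of the in-control point, which is the sparse-change sensitivity the paper exploits.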
Procedia PDF Downloads 299
1780 TimeTune: Personalized Study Plans Generation with Google Calendar Integration
Authors: Chevon Fernando, Banuka Athuraliya
Abstract:
The purpose of this research is to provide a solution to students’ time management, which often becomes an issue because students must balance their studies with personal commitments. "TimeTune," an AI-based study planner that builds study timetables by combining modern machine learning algorithms with calendar applications, is presented as a solution. The research focuses on the development of LSTM models that connect to the Google Calendar API to generate learning schedules that fit a student's unique daily life and study history. A key finding of this research is the success in building an LSTM model to predict optimal study times, which, integrated with real-time Google Calendar data, generates personalized and customized timetables automatically. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the tension associated with poor study habits and time management. In conclusion, "TimeTune" represents an advance in personalized education technology: its combination of ML algorithms and calendar integration aims to help students maintain a balanced academic and personal life and to reduce the stress of managing their studies. Keywords: personalized learning, study planner, time management, calendar integration
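The calendar-integration step, turning busy blocks from a calendar into open study slots, can be sketched without the ML component. In the actual system the busy intervals would come from the Google Calendar API (e.g. its free/busy query) and the slots would then be ranked by the LSTM's predictions; the intervals below are made-up hours of a day.

```python
def free_slots(busy, day_start, day_end):
    """Return open [start, end) gaps (in hours) between busy intervals."""
    slots, cursor = [], day_start
    for start, end in sorted(busy):
        if start > cursor:
            slots.append((cursor, min(start, day_end)))
        cursor = max(cursor, end)
    if cursor < day_end:
        slots.append((cursor, day_end))
    return slots

# Busy blocks as they might be pulled from the Google Calendar API
busy = [(9, 10.5), (13, 14), (16, 18)]
print(free_slots(busy, 8, 22))
# → [(8, 9), (10.5, 13), (14, 16), (18, 22)]
```

Study sessions would then be scheduled into the returned gaps, ordered by the model's predicted suitability of each time window for the individual student.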
Procedia PDF Downloads 49
1779 Effect of Different Muscle Contraction Mode on the Expression of Myostatin, IGF-1, and PGC-1 Alpha Family Members in Human Vastus Lateralis Muscle
Authors: Pejman Taghibeikzadehbadr
Abstract:
Muscle contraction stimulates a transient change in myogenic factors, partly related to the mode of contraction. Here, we assessed the response of insulin-like growth factor 1Ea (IGF-1Ea), insulin-like growth factor 1Eb (IGF-1Eb), insulin-like growth factor 1Ec (IGF-1Ec), peroxisome proliferator-activated receptor gamma coactivator 1-alpha (PGC1α-1), peroxisome proliferator-activated receptor gamma coactivator 4-alpha (PGC1α-4), and myostatin to eccentric vs. concentric contraction in human skeletal muscle. Ten healthy males performed an acute eccentric or concentric exercise bout (n = 5 per group). For each contraction type, participants performed 12 sets of 10 repetitions of knee extension with the dominant leg. Baseline and post-exercise muscle biopsies were taken from the vastus lateralis muscle 4 weeks before and immediately after the experimental sessions. Gene expression was measured by the real-time PCR technique. There was a significant increase in PGC1α-1, PGC1α-4, IGF-1Ea, and IGF-1Eb mRNA after concentric contraction (p ≤ 0.05), while PGC1α-4 and IGF-1Ec significantly increased after eccentric contraction (p ≤ 0.05). Notably, no significant differences between groups were evident for changes in any variable following the exercise bouts (p ≥ 0.05). Our results show that concentric and eccentric contractions elicited different responses in PGC1α-1, IGF-1Ea, IGF-1Eb, and IGF-1Ec mRNA, whereas a similar significant increase in mRNA content was observed for PGC1α-4. Further, no apparent differences could be found between the responses of these genes to eccentric and concentric contraction. Keywords: eccentric contraction, concentric contraction, gene expression, PGC-1 alpha, IGF-1, myostatin
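Relative expression changes from real-time PCR are commonly quantified with the Livak 2^-ΔΔCt method; the abstract does not state the authors' exact pipeline, so this is a generic sketch, and the Ct values below are hypothetical, not study data.

```python
def fold_change(ct_target_post, ct_ref_post, ct_target_base, ct_ref_base):
    """Livak 2^-ΔΔCt relative quantification: expression of a target gene
    after exercise vs. baseline, normalised to a reference gene."""
    d_ct_post = ct_target_post - ct_ref_post    # ΔCt after contraction
    d_ct_base = ct_target_base - ct_ref_base    # ΔCt at baseline
    return 2 ** -(d_ct_post - d_ct_base)        # 2^-ΔΔCt

# Hypothetical Ct values: a target gene (e.g. PGC1α-4) vs. a housekeeping gene
print(fold_change(22.0, 18.0, 24.0, 18.0))  # → 4.0
```

A fold change above 1 corresponds to the post-exercise increases in mRNA reported above; values below 1 would indicate down-regulation.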
Procedia PDF Downloads 160
1778 Extended Intuitionistic Fuzzy VIKOR Method in Group Decision Making: The Case of Vendor Selection Decision
Authors: Nastaran Hajiheydari, Mohammad Soltani Delgosha
Abstract:
Vendor (supplier) selection is a group decision-making (GDM) process in which, based on some predetermined criteria, the experts’ preferences are elicited in order to rank and choose the most desirable suppliers. In the real business environment, attitudes or choices made in uncertain and indecisive situations cannot be expressed in a crisp framework. Intuitionistic fuzzy sets (IFSs) can handle such situations well. The VIKOR method was developed to solve multi-criteria decision-making (MCDM) problems. This method, which is used to determine a compromise feasible solution with respect to conflicting criteria, introduces a multi-criteria ranking index based on a particular measure of 'closeness' to the 'ideal solution'. Until now, there has been little investigation of VIKOR with IFSs; therefore, we extend intuitionistic fuzzy (IF) VIKOR to solve the vendor selection problem under an IF GDM environment. The present study develops an IF VIKOR method for a GDM situation. A model is presented to calculate the criterion weights based on an entropy measure. Then, the interval-valued intuitionistic fuzzy weighted geometric (IFWG) operator is utilized to obtain the total decision matrix. In the next stage, an approach based on the positive ideal intuitionistic fuzzy number (PIIFN) and the negative ideal intuitionistic fuzzy number (NIIFN) is developed. Finally, the application of the proposed method to a vendor selection problem is illustrated. Keywords: group decision making, intuitionistic fuzzy set, intuitionistic fuzzy entropy measure, vendor selection, VIKOR
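The underlying VIKOR machinery, group utility S, individual regret R, and the compromise index Q, can be sketched in its crisp form; the paper's contribution is extending each step to intuitionistic fuzzy values, which is not reproduced here. The vendor scores and weights below are illustrative assumptions, and all criteria are treated as benefit criteria.

```python
def vikor(matrix, weights, v=0.5):
    """Minimal crisp VIKOR: returns the compromise index Q per alternative
    (lower is better). A simplified, non-fuzzy sketch of the method the
    paper extends with intuitionistic fuzzy sets."""
    n = len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(n)]
    worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (best[j] - row[j]) / ((best[j] - worst[j]) or 1)
                 for j in range(n)]
        S.append(sum(terms))      # group utility
        R.append(max(terms))      # individual regret
    sm, sM, rm, rM = min(S), max(S), min(R), max(R)
    return [v * (S[i] - sm) / ((sM - sm) or 1)
            + (1 - v) * (R[i] - rm) / ((rM - rm) or 1)
            for i in range(len(matrix))]

# Three hypothetical vendors scored on two benefit criteria
# (quality, delivery reliability), with criterion weights 0.6 / 0.4
q = vikor([[7, 9], [9, 6], [8, 8]], [0.6, 0.4])
```

Here the third vendor, strong on both criteria without extremes, obtains the lowest Q and is the compromise choice; in the paper the crisp scores are replaced by intuitionistic fuzzy numbers and entropy-based weights.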
Procedia PDF Downloads 157
1777 ACTN3 Genotype Association with Motoric Performance of Roma Children
Authors: J. Bernasovska, I. Boronova, J. Poracova, M. Mydlarova Blascakova, V. Szabadosova, P. Ruzbarsky, E. Petrejcikova, I. Bernasovsky
Abstract:
The paper presents the results of molecular genetic analysis in sports research, with special emphasis on using genetic information in diagnosing motoric predispositions in Roma boys from East Slovakia. The ability to move is a basic characteristic of all living organisms. Phenotypes are influenced by a combination of genetic and environmental factors. Genetic tests differ in principle from traditional motoric tests, because the DNA of an individual does not change during life. The aim of the presented study was to examine motion abilities and to determine the frequency of the ACTN3 (R577X) gene in Roma children. Genotype data were obtained from 138 Roma and 155 Slovak boys aged 7 to 15 years. The children's physical performance level was investigated in association with their genotype. Biological material for genetic analyses comprised samples of buccal swabs. Genotypes were determined using the real-time high-resolution melting PCR method (Rotor-Gene 6000 Corbett and LightCycler 480 Roche). The software allows creating reports of any analysis, showing details of the specific analysis, normalized and differential graphs, and information about the samples. The Roma children in the analyzed group lagged behind non-Roma children of the same age in all the compared tests. The distribution of R and X alleles in Roma children differed from controls: the frequency of the XX genotype was 9.26%, RX 46.33%, and RR 44.41%. This XX genotype frequency is comparable to that of an Indian population. Data were analyzed with the ANOVA test. Keywords: ACTN3 gene, R577X polymorphism, Roma children, sport performance, Slovakia
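Allele frequencies follow directly from the reported genotype distribution by gene counting: each RR boy carries two R alleles and each RX boy one. The counts below are approximate, back-calculated from the reported percentages (RR 44.41%, RX 46.33%, XX 9.26% of n = 138), so this is an illustrative reconstruction rather than the study's raw data.

```python
def allele_frequencies(rr, rx, xx):
    """R and X allele frequencies from ACTN3 R577X genotype counts
    (gene-counting method: each person carries two alleles)."""
    n = rr + rx + xx
    p_r = (2 * rr + rx) / (2 * n)
    return p_r, 1 - p_r

# Approximate counts for the 138 Roma boys, derived from the
# reported percentages: RR ~61, RX ~64, XX ~13
p_r, p_x = allele_frequencies(61, 64, 13)
```

This yields an R allele frequency of roughly 0.67 in the Roma group, which is the kind of population-level comparison (e.g. against the Indian population mentioned above) the paper draws.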
Procedia PDF Downloads 335
1776 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining
Authors: Mohsen Farhadloo, Majid Farhadloo
Abstract:
Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products and services from customers' points of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both aspects and sentiments simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model called focus-LDA to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data. Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis
Procedia PDF Downloads 95
1775 Using Data Mining in Automotive Safety
Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler
Abstract:
Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on data emerging from sled tests conducted according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, with the aim of reducing the number of tests. Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact
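The missing-data step of a KDD pipeline can be illustrated with the simplest common strategy, column-mean imputation; the abstract does not detail the authors' actual procedure, so this is an illustrative stand-in, and the parameter values (e.g. a dummy injury reading and a restraint setting) are invented.

```python
def impute_means(rows):
    """Replace None entries by their column mean (a basic illustrative
    missing-data strategy, not the paper's validated procedure)."""
    cols = list(zip(*rows))
    means = [sum(v for v in col if v is not None)
             / max(1, sum(v is not None for v in col)) for col in cols]
    return [[v if v is not None else means[j] for j, v in enumerate(row)]
            for row in rows]

# Hypothetical sled-test records with gaps in two measured parameters
data = [[35.0, 1.2], [None, 1.4], [41.0, None]]
filled = impute_means(data)
```

Each gap is filled with the mean of the observed values in its column, keeping all ~30 parameters usable for the subsequent estimation step; more refined schemes (regression or model-based imputation) follow the same interface.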
Procedia PDF Downloads 382
1774 Membrane Distillation Process Modeling: Dynamical Approach
Authors: Fadi Eleiwi, Taous Meriem Laleg-Kirati
Abstract:
This paper presents a complete dynamic model of a membrane distillation process. The model contains two consistent dynamic sub-models: a 2D advection-diffusion equation for modeling the whole process and a modified heat equation for modeling the membrane itself. The complete model describes the temperature diffusion phenomenon across the feed container, the membrane with its boundary layers, and the permeate container, giving an online and complete temperature profile for each point in the domain. It explains the heat conduction and convection mechanisms that take place inside the process in terms of mathematical parameters and justifies the process behavior during the transient and steady-state phases. The process can be monitored for any sudden change in performance at any instant in time. In addition, the model assists in maintaining production rates as desired and provides recommendations during the membrane fabrication stages. System performance and parameters can be optimized and controlled using this complete dynamic model. The evolution of the membrane boundary temperature with time, the vapor mass transfer along the process, and the temperature difference between the membrane boundary layers are depicted and discussed. Simulations were performed on the complete model with real membrane specifications. The plots show consistency between the 2D advection-diffusion model and the expected behavior of such systems, as well as the literature. The evolution of heat inside the membrane, from the transient response until the steady-state response, is illustrated for fixed and varying times. Keywords: membrane distillation, dynamical modeling, advection-diffusion equation, thermal equilibrium, heat equation
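The advection-diffusion dynamics can be sketched in a reduced 1D setting: explicit finite differences for u_t + c·u_x = D·u_xx between a hot feed boundary and a cold permeate boundary. The coefficients, grid, and boundary temperatures below are illustrative values (chosen to satisfy the explicit scheme's stability limits), not real membrane specifications, and the paper's full model is 2D with a separate membrane heat equation.

```python
# 1D advection-diffusion sketch: u_t + c*u_x = D*u_xx, hot feed side
# fixed at 60 °C, cold permeate side fixed at 20 °C (illustrative values).
c, D = 0.01, 1e-3
nx = 50
dx, dt = 1.0 / (nx - 1), 0.01   # dt below the diffusive stability limit

u = [60.0 if i < nx // 2 else 20.0 for i in range(nx)]  # initial jump

def step(u):
    new = u[:]                   # boundary temperatures stay fixed
    for i in range(1, nx - 1):
        adv = -c * (u[i + 1] - u[i - 1]) / (2 * dx)            # advection
        diff = D * (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2  # diffusion
        new[i] = u[i] + dt * (adv + diff)
    return new

for _ in range(2000):            # march toward the steady temperature profile
    u = step(u)
```

Marching the scheme smooths the initial temperature jump into the transient profile and, eventually, the steady state, which is exactly the transient-to-steady-state evolution the paper plots for the full 2D model.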
Procedia PDF Downloads 272
1773 Simulation Aided Life Cycle Sustainability Assessment Framework for Manufacturing Design and Management
Authors: Mijoh A. Gbededo, Kapila Liyanage, Ilias Oraifige
Abstract:
Decision making for sustainable manufacturing design and management requires careful consideration due to the complexity and partly conflicting nature of economic, social, and environmental factors. Although there are tools capable of assessing a combination of one or two of the sustainability factors, existing frameworks have not adequately integrated all three. A case study and a review of existing simulation applications also show that current approaches lack integration of the sustainability factors. In this paper, we discuss the development of a simulation-based framework to support a holistic assessment of sustainable manufacturing design and management. To achieve this, a strategic approach is introduced to investigate the strengths and weaknesses of existing decision-support tools. The investigation reveals that Discrete Event Simulation (DES) can serve as a foundation for other life cycle analysis frameworks. The Simio DES application optimizes systems for both economic and competitive advantage; Granta CES EduPack and SimaPro collate data for material flow analysis and environmental life cycle assessment; and social and stakeholder analysis is supported by the Analytic Hierarchy Process, a multi-criteria decision analysis method. Such a common and integrated framework creates a platform for companies to build a computer simulation model of a real system and assess the impact of alternative solutions before implementing a chosen one. Keywords: discrete event simulation, life cycle sustainability analysis, manufacturing, sustainability
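The stakeholder-analysis step via the Analytic Hierarchy Process can be sketched as deriving priority weights from a pairwise comparison matrix (here via power iteration toward the principal eigenvector, a standard AHP computation). The comparison values below, weighing economic vs. environmental vs. social factors, are illustrative, not from the paper's case study.

```python
def ahp_weights(pairwise, iters=100):
    """Priority weights from an AHP pairwise comparison matrix, computed
    as the principal eigenvector via power iteration."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]   # renormalise so weights sum to 1
    return w

# Hypothetical stakeholder judgements on a 1-9 Saaty scale:
# economic vs. environmental vs. social
pairwise = [[1, 3, 5],
            [1 / 3, 1, 2],
            [1 / 5, 1 / 2, 1]]
weights = ahp_weights(pairwise)
```

The resulting weights (economic highest, social lowest for these judgements) would feed the social pillar into the combined DES and life-cycle assessment framework alongside the economic and environmental results.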
Procedia PDF Downloads 279
1772 Cryptographic Resource Allocation Algorithm Based on Deep Reinforcement Learning
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with ever-changing cyber threats and environments. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective job flow scheduling problem and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay, and network load balancing, and effectively solves the problem of complex resource scheduling in cryptographic services. Keywords: cloud computing, cryptography on-demand service, reinforcement learning, workflow scheduling
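The core idea, learning an allocation policy from a load-balancing reward rather than a fixed rule, can be sketched with tabular Q-learning in a deliberately tiny setting: route each incoming job to one of three servers, penalising the busiest server. This single-agent toy stands in for the paper's multi-agent, multi-objective scheduler; all sizes and hyperparameters are illustrative.

```python
import random

random.seed(1)

# Toy tabular Q-learning: assign jobs to servers to keep load balanced.
n_servers, alpha, gamma, eps = 3, 0.1, 0.9, 0.2
Q = {}   # state (tuple of per-server loads) -> list of action values

def choose(state):
    q = Q.setdefault(state, [0.0] * n_servers)
    if random.random() < eps:                 # epsilon-greedy exploration
        return random.randrange(n_servers)
    return q.index(max(q))

for _ in range(2000):                         # training episodes
    loads = [0] * n_servers
    for _ in range(6):                        # six jobs arrive per episode
        state = tuple(loads)
        a = choose(state)
        loads[a] += 1
        reward = -max(loads)                  # penalise the busiest server
        nxt = Q.setdefault(tuple(loads), [0.0] * n_servers)
        Q[state][a] += alpha * (reward + gamma * max(nxt) - Q[state][a])

# Greedy rollout: the learned policy should spread the six jobs evenly
loads = [0] * n_servers
for _ in range(6):
    q = Q.get(tuple(loads), [0.0] * n_servers)
    loads[q.index(max(q))] += 1
```

The learned greedy policy spreads jobs across servers instead of piling them onto one, which is the real-time load-balancing behavior the paper reports; the full algorithm additionally weighs energy, migration cost, and security fitness in the reward.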
Procedia PDF Downloads 18
1771 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management
Authors: Kenneth Harper
Abstract:
The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks built with the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and zero-knowledge proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency. Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups
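The "cryptographic hashing for robust data lineage" idea can be sketched as a minimal hash chain: each record commits to its payload and to the previous record's hash, so any tampering breaks verification. This is an illustration of the principle only; the described platform uses full blockchain networks with consensus, IPFS/Filecoin storage, and ZK-Rollups, and the record fields below are invented.

```python
import hashlib
import json

def add_record(chain, payload):
    """Append a record whose hash commits to the payload and to the
    previous record's hash (genesis uses an all-zero predecessor)."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev}, sort_keys=True)
    chain.append({"payload": payload, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash and link; False if anything was altered."""
    prev = "0" * 64
    for rec in chain:
        body = json.dumps({"payload": rec["payload"], "prev": prev},
                          sort_keys=True)
        if rec["prev"] != prev or \
           rec["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = rec["hash"]
    return True

chain = []
add_record(chain, {"patient": "p01", "metric": "hr", "value": 72})
add_record(chain, {"patient": "p01", "metric": "hr", "value": 75})
```

`verify(chain)` holds for the untouched chain and fails as soon as any stored payload is modified, which is the tamper-evidence property a distributed ledger provides at scale.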
Procedia PDF Downloads 29