Search results for: input processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5579

2309 Comparison of Two Neural Networks To Model Margarine Age And Predict Shelf-Life Using Matlab

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study aimed at developing and comparing two neural-network-based predictive models to predict the shelf-life/product age of South African margarine using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV) and total oxidation (totox) value as input variables to the model. Brick margarine products with ages ranging from fresh (week 0) to week 47 were sourced. These products had been stored at 10 and 25 °C and were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed and their performances were compared. The key performance indicators used to evaluate the models were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed better performance in all three indicators: its correlation coefficient was 99.86% versus 99.74% for the JMP model, the RMSE was 0.720 compared to 1.005, and the MAPE was 7.4% compared to 8.571%. The MATLAB model was therefore selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB model with 10 hidden neurons performed better than the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers, etc., to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products, and proactively troubleshoot problems related to changes that affect the shelf-life of margarine without conducting expensive trials.
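
To illustrate the three performance indicators used to compare the models, a minimal Python sketch is given below; the product ages and predictions are made-up placeholders, not the study's data:

```python
import numpy as np

def evaluate(actual, predicted):
    """Return the three indicators used to compare shelf-life models:
    correlation coefficient (CC, %), RMSE and MAPE (%)."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    cc = np.corrcoef(actual, predicted)[0, 1] * 100          # expressed as a percentage
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    mape = np.mean(np.abs((actual - predicted) / actual)) * 100
    return cc, rmse, mape

# Hypothetical example: product age in weeks predicted by two models
actual  = [4, 6, 12, 24, 36, 47]
model_a = [4.4, 5.7, 12.6, 23.1, 36.8, 46.5]   # e.g. a neural network model
model_b = [5.0, 7.2, 10.8, 25.9, 34.0, 48.6]   # e.g. a second model
for name, pred in [("model A", model_a), ("model B", model_b)]:
    cc, rmse, mape = evaluate(actual, pred)
    print(f"{name}: CC={cc:.2f}%  RMSE={rmse:.3f}  MAPE={mape:.1f}%")
```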

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 189
2308 A Hybrid Digital Watermarking Scheme

Authors: Nazish Saleem Abbas, Muhammad Haris Jamil, Hamid Sharif

Abstract:

Digital watermarking is a technique that allows an individual to add and hide secret information, a copyright notice, or another verification message inside a digital audio, video, or image file. Today, with the advancement of technology, modern healthcare systems in many countries manage patients’ diagnostic information digitally. When transmitted between hospitals through the internet, medical data become vulnerable to attacks and require security and confidentiality. Digital watermarking techniques are used in order to ensure the authenticity, security and management of medical images and related information. This paper proposes a watermarking technique that embeds a watermark in medical images imperceptibly and securely. In this work, digital watermarking of medical images is carried out using the Least Significant Bit (LSB) with the Discrete Cosine Transform (DCT). The proposed embedding and extraction of the watermark in a watermarked image are performed in the frequency domain using LSB with an XOR operation. The quality of the watermarked medical image is measured by the peak signal-to-noise ratio (PSNR). It was observed that the watermarked medical image, obtained by performing the XOR operation between DCT and LSB, survived a compression attack with a PSNR of up to 38.98.
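
A minimal sketch of the general idea (XOR-encrypted watermark bits placed in the LSBs of DCT coefficients, plus the PSNR measure) is given below. It uses a full-image DCT and a random stand-in image, so it is an illustration of the technique only, not the authors' exact scheme:

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(a):  return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')
def idct2(a): return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def embed(image, watermark_bits, key_bits):
    """Embed XOR-encrypted watermark bits into the LSBs of rounded DCT coefficients."""
    coeffs = dct2(image.astype(float))
    q = np.rint(coeffs).astype(np.int64)
    flat = q.ravel()
    enc = np.bitwise_xor(watermark_bits, key_bits)            # XOR encryption step
    flat[:enc.size] = (flat[:enc.size] & ~1) | enc            # overwrite coefficient LSBs
    return idct2(flat.reshape(q.shape).astype(float))

def psnr(original, distorted, peak=255.0):
    mse = np.mean((original.astype(float) - distorted.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64)).astype(np.uint8)         # stand-in for a medical image
wm  = rng.integers(0, 2, 128)
key = rng.integers(0, 2, 128)
marked = embed(img, wm, key)
print("PSNR of watermarked image:", round(psnr(img, marked), 2), "dB")
```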

Keywords: watermarking, image processing, DCT, LSB, PSNR

Procedia PDF Downloads 41
2307 Analysis of Total Acid in Arabica Coffee Beans after Fermentation with Ohmic Technology

Authors: Reta

Abstract:

Coffee is widely consumed not only because of its typical taste but also because it has antioxidant properties due to its polyphenols and stimulates the brain's performance. The main problem with the consumption of coffee is its caffeine content. Caffeine, when consumed in excess, can increase muscle tension, stimulate the heart, and increase the secretion of gastric acid. In this research, we applied ohmic-based fermentation technology, which is specially designed to mimic the stomach. We used Arabica coffee, which, although cheaper than Luwak coffee, has high acidity, which needs to be reduced. Hence, we applied the ohmic technology, varied the time and temperature of the process, and measured the total acidity of the coffee to determine the optimum fermentation conditions. Results revealed that the total acidity of the coffee varied with fermentation conditions: 0.32% at 40 °C and 12 hr, and 0.52% at 40 °C and 6 hr. The longer the fermentation, the lower the acidity. The acidity of the mongoose-fermented (natural fermentation) beans was 2.34%, which is substantially higher than the acidity of the ohmic samples. Ohmic-based fermentation technology, therefore, offers improvements in coffee quality, and this is discussed to highlight the potential of ohmic technology in coffee processing.

Keywords: ohmic technology, fermentation, coffee quality, Arabica coffee

Procedia PDF Downloads 334
2306 The Carbon Footprint Model as a Plea for Cities towards Energy Transition: The Case of Algiers Algeria

Authors: Hachaichi Mohamed Nour El-Islem, Baouni Tahar

Abstract:

Environmental sustainability, more than a trans-disciplinary and scientific issue, is the main problem that characterizes all modern cities nowadays. In developing countries, this concern is expressed in a plethora of critical urban ills: traffic congestion, air pollution, noise, urban decay, and increases in energy consumption and CO2 emissions, which blemish cities’ landscapes and may threaten citizens’ health and welfare. As in other developing world cities, the rapid growth of Algiers’ population and the increase in the city's scale lead to increases in daily trips, energy consumption and CO2 emissions. In addition, the lack of proper and sustainable planning of the city’s infrastructure is one of the most relevant issues from which Algiers suffers. The aim of this contribution is to estimate the carbon deficit of the city of Algiers, Algeria, using the Ecological Footprint Model (carbon footprint). In order to achieve this goal, the amount of CO2 from fuel combustion has been calculated and aggregated into five sectors (agriculture, industry, residential, tertiary and transportation); likewise, Algiers’ biocapacity (CO2 uptake land) has been calculated to determine the ecological overshoot. This study shows that Algiers’ transport system is not sustainable and generates more than 50% of Algiers' total carbon footprint, which cannot be sequestered by the local forest land. A further aim of this research is to show that the Carbon Footprint Assessment might be a relevant indicator to design sustainable strategies/policies striving to reduce CO2 by acting on energy consumption in the transportation sector and reducing the use of fossil fuels as the main energy input.
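
The arithmetic behind the carbon-footprint overshoot estimate can be sketched as follows. The sectoral emissions, forest area, sequestration rate and ocean-uptake fraction below are illustrative assumptions in the spirit of the ecological footprint literature, not figures from this study:

```python
# Minimal carbon-footprint sketch (illustrative values, not the Algiers data).
SEQUESTRATION_T_CO2_PER_HA = 2.68   # assumed average forest sequestration (t CO2/ha/yr)
OCEAN_UPTAKE_FRACTION = 0.28        # assumed share of emissions absorbed by oceans

emissions_tco2 = {                  # hypothetical sectoral CO2 from fuel combustion
    "agriculture": 0.4e6, "industry": 1.1e6, "residential": 2.0e6,
    "tertiary": 0.9e6, "transportation": 4.6e6,
}
forest_area_ha = 30_000             # hypothetical local CO2-uptake (forest) land

total = sum(emissions_tco2.values())
carbon_footprint_ha = total * (1 - OCEAN_UPTAKE_FRACTION) / SEQUESTRATION_T_CO2_PER_HA
overshoot_ha = carbon_footprint_ha - forest_area_ha

print(f"transport share of emissions: {emissions_tco2['transportation'] / total:.0%}")
print(f"carbon footprint: {carbon_footprint_ha:,.0f} ha of uptake land required")
print(f"ecological overshoot: {overshoot_ha:,.0f} ha beyond local biocapacity")
```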

Keywords: biocapacity, carbon footprint, ecological footprint assessment, energy consumption

Procedia PDF Downloads 143
2305 The Effect of Metal Transfer Modes on Mechanical Properties of 3CR12 Stainless Steel

Authors: Abdullah Kaymakci, Daniel M. Madyira, Ntokozo Nkwanyana

Abstract:

The effect of metal transfer modes on the mechanical properties of welded 3CR12 stainless steel was investigated. This was achieved by butt welding 10 mm thick plates of 3CR12 while varying the welding positions for different metal transfer modes. The ASME IX: 2010 (Welding and Brazing Qualifications) code was used as a basis for the welding variables. The material and thickness of the base metal were kept constant, together with the filler metal, shielding gas and joint types. The effect of the metal transfer modes on the microstructure and the mechanical properties of the 3CR12 steel was then investigated, as it was hypothesized that the change in welding positions would affect the transfer modes, partly due to the effect of gravity. The microscopic examination revealed that the substrate was characterized by a dual phase microstructure, that is, alpha phase and beta phase grain structures. The spectroscopic examination results and the ferritic factor calculation showed that the microstructure was expected to be ferritic-martensitic during the air cooling process. The measured tensile strength and Charpy impact energy were 498 MPa and 102 J, which were in line with the mechanical properties given in the material certificate. The heat input in the material was observed to be greater than 1 kJ/mm, which is the limiting factor for grain growth during the welding process. Grain growth was observed in the heat affected zone of the welded materials, and a ferritic-martensitic microstructure was observed during the microscopic examination. The grain growth altered the mechanical properties of the test material. Globular down-hand welds had higher mechanical properties than spray down-hand welds, and globular vertical-up welds had better mechanical properties than globular vertical-down welds.
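
The heat-input figure quoted above follows from the standard arc-welding relation; a quick check with hypothetical GMAW parameters (not values reported in the paper) looks like this:

```python
def heat_input_kj_per_mm(voltage_v, current_a, travel_speed_mm_per_min, efficiency=0.8):
    """Standard arc-welding heat input: HI = k * V * I * 60 / (1000 * travel speed)."""
    return efficiency * voltage_v * current_a * 60.0 / (1000.0 * travel_speed_mm_per_min)

# Hypothetical globular-transfer parameters for a 10 mm butt weld
hi = heat_input_kj_per_mm(voltage_v=28, current_a=230, travel_speed_mm_per_min=300)
print(f"heat input = {hi:.2f} kJ/mm")   # above 1 kJ/mm, the limit cited for grain growth
```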

Keywords: welding, metal transfer modes, stainless steel, microstructure, hardness, tensile strength

Procedia PDF Downloads 247
2304 Amplitude and Latency of P300 Component from Auditory Stimulus in Different Types of Personality: An Event Related Potential Study

Authors: Nasir Yusoff, Ahmad Adamu Adamu, Tahamina Begum, Faruque Reza

Abstract:

The P300 component of the event related potential (ERP) explains psycho-physiological phenomena in the human body. The present study aims to identify differences in the amplitude and latency of the P300 component from auditory stimuli between the ambiversion and extraversion personality types. Ambiverts (N=20) and extraverts (N=20) underwent ERP recording at the Hospital Universiti Sains Malaysia (HUSM) laboratory. Electroencephalogram data were recorded with an oddball paradigm, counting auditory standard and target tones, from nine electrode sites (Fz, Cz, Pz, T3, T4, T5, T6, P3 and P4) using the 128-channel HydroCel Geodesic Sensor Net. Differences in P300 latency for the target tones were insignificant at all electrodes. Similarly, differences in P300 latency for the standard tones were insignificant except at the Fz and T3 electrodes. Likewise, differences in P300 amplitude for the target and standard tones were insignificant at all electrode sites. Extraverts and ambiverts thus show similar characteristics in the cognitive processing of auditory tasks.
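
As a rough illustration of how P300 amplitude and latency are typically read off an averaged ERP waveform (peak search in a post-stimulus window), a generic sketch is given below; it uses synthetic data and is not the HUSM analysis pipeline:

```python
import numpy as np

def p300_peak(erp_uv, fs_hz, window_s=(0.25, 0.5)):
    """Return (amplitude in uV, latency in ms) of the largest positive deflection
    in the P300 search window of a stimulus-locked average (t=0 at stimulus onset)."""
    t = np.arange(erp_uv.size) / fs_hz
    mask = (t >= window_s[0]) & (t <= window_s[1])
    idx = np.argmax(erp_uv[mask])
    return erp_uv[mask][idx], t[mask][idx] * 1000.0

# Synthetic averaged ERP at one electrode (e.g. Pz), 1 s epoch sampled at 250 Hz
fs = 250
t = np.arange(fs) / fs
erp = 8.0 * np.exp(-((t - 0.35) / 0.05) ** 2) + np.random.default_rng(1).normal(0, 0.3, fs)
amp, lat = p300_peak(erp, fs)
print(f"P300 amplitude = {amp:.1f} uV, latency = {lat:.0f} ms")
```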

Keywords: amplitude, event related potential, p300 component, latency

Procedia PDF Downloads 368
2303 Study on Monitoring Techniques Developed for a City Railway Construction

Authors: Myoung-Jin Lee, Sung-Jin Lee, Young-Kon Park, Jin-Wook Kim, Bo-Kyoung Kim, Song-Hun Chong, Sun-Il Kim

Abstract:

Sinkholes may occur due to natural or unknown causes. Because a sinkhole is an instantaneous phenomenon, most accidents involve significant damage. Thus, monitoring methods are being actively researched so that the impact of such accidents can be mitigated. A sinkhole can severely affect and wreak havoc on community-based facilities such as a city railway construction. Therefore, the development of a laser- and image-based tunnel scanning system is one method of pre-monitoring that can prevent such accidents. Laser scanning is already being used, but it has shortcomings, as it requires the development of expensive equipment. A laser/video-based tunnel scanning system is being developed at the Korea Railroad Research Institute. It is designed to operate automatically along the railway. The purpose of the scanning is to obtain images of city railway structures (stations, tunnels). For these railway structures, 3D laser scanning has been developed that can find micro-cracks that cannot be distinguished by eye. An additional aim is to develop software that monitors the status of railway structures without the need for expensive post-processing of 3D laser scanning equipment.

Keywords: 3D laser scanning, sinkhole, tunnel, city railway construction

Procedia PDF Downloads 426
2302 How Does the Interaction between Environmental and Intellectual Property Rights Affect Environmental Innovation? A Study of Seven OECD Countries

Authors: Aneeq Sarwar

Abstract:

This study assesses the interaction between environmental and intellectual property policy on the rate of invention of environmental inventions and specifically tests whether there is a synergy between stricter IP regimes and stronger environmental policies. The empirical analysis uses firm- and industry-level data from seven OECD countries from 2009 to 2015. We also introduce a new measure of environmental inventions using a Natural Language Processing topic modelling technique. We find that intellectual property policy strictness demonstrates greater effectiveness in encouraging inventiveness in environmental inventions when combined with stronger environmental policies. This study contributes to the existing literature in two ways. First, it devises a method for better identification of environmental technologies; we demonstrate that our method is more comprehensive than existing methods, as we are better able to identify not only environmental inventions but also major components of said inventions. Second, we test how various policy regimes affect the development of environmental technologies; we are the first study to examine the interaction of environmental and intellectual property policy on firm-level innovation.
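
The NLP step can be illustrated with a small topic-modelling sketch (LDA over patent-style abstracts via scikit-learn). The corpus and the choice of LDA are illustrative assumptions; the paper does not specify its exact model:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny illustrative corpus of patent-style abstracts (hypothetical)
abstracts = [
    "solar photovoltaic module with improved energy conversion efficiency",
    "wind turbine blade design reducing noise and material waste",
    "combustion engine fuel injection timing control",
    "battery electrode recycling process for lithium recovery",
    "gearbox lubrication system for internal combustion engines",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top)}")

# Abstracts whose dominant topic loads on environment-related terms would be
# flagged as environmental inventions; components of those inventions can be
# traced through the same topic loadings.
print(lda.transform(X).argmax(axis=1))   # dominant topic per abstract
```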

Keywords: environmental economics, economics of innovation, environmental policy, firm level

Procedia PDF Downloads 147
2301 Spelling Errors in Persian Children with Developmental Dyslexia

Authors: Mohammad Haghighi, Amineh Akhondi, Leila Jahangard, Mohammad Ahmadpanah, Masoud Ansari

Abstract:

Background: According to recent estimates, approximately 4%-12% of Iranians have difficulty in learning to read and spell, possibly as a result of developmental dyslexia. The study was planned to investigate spelling error patterns among Persian children with developmental dyslexia and compare them with the errors exhibited by control groups. Participants: 90 students participated in this study: 30 students from grade five diagnosed as dyslexic by professionals, 30 normal fifth-grade readers, and 30 younger normal readers. There were 15 boys and 15 girls in each of the groups. Qualitative and quantitative methods were used for the analysis of errors. Results and conclusion: The results of this study indicate similar spelling error profiles among dyslexics and the reading-level-matched group, and these profiles were different from those of the age-matched group. However, the performances of the dyslexic group and the reading-level-matched group were different and inconsistent in some cases.

Keywords: spelling, error types, developmental dyslexia, Persian, writing system, learning disabilities, processing

Procedia PDF Downloads 421
2300 Achieving Environmentally Sustainable Supply Chain in Textile and Apparel Industries

Authors: Faisal Bin Alam

Abstract:

Most manufacturing entities leave a negative footprint on nature that demands due attention. Textile industries have one of the longest supply chains and bear liability for a significant environmental impact on our planet. Issues of environmental safety, scarcity of energy and resources, and demand for eco-friendly products have driven research to search for safe and suitable alternatives in apparel processing. Consumer awareness, increased pressure from fashion brands and actions by local legislative authorities have somewhat been able to improve practices. The objective of this paper is to reveal the best selection of raw materials and methods of production, taking environmental sustainability into account. The methodology used in this study is exploratory in nature, based on personal experience, field visits to factories in Bangladesh and secondary sources. The findings are limited to exploring better alternatives to the conventional operations of readymade garment manufacturing, from fibre selection to final product delivery, thereby showing some ways of achieving a greener environment in the supply chain of the clothing industry.

Keywords: textile and apparel, environmental sustainability, supply chain, production, clothing

Procedia PDF Downloads 133
2299 Graphic User Interface Design Principles for Designing Augmented Reality Applications

Authors: Afshan Ejaz, Syed Asim Ali

Abstract:

Reality is a combination of perception, reconstruction, and interaction. Augmented reality is an advancement layered over everyday existence that includes content-based, voice-based, and guide- or gesture-based interfaces, so designing augmented reality application interfaces is a difficult task for the maker. The goal is to design a user interface that is not only easy to use and easy to learn but also interactive and self-explanatory, with high perceived affordance, perceived usefulness, consistency and high discoverability, so that the user can easily recognize and understand the design. For this purpose, many interface design principles such as learnability, affordance, simplicity, memorability, feedback, visibility, flexibility and others have been introduced, but there are no principles that explain the most appropriate interface design principles for designing augmented reality application interfaces. Therefore, the basic goal of introducing design principles for augmented reality application interfaces is to match the user's efforts to the computer display ('plot user input onto computer output') using appropriate interface action symbols ('metaphors'), making the application easy to use, easy to understand and easy to discover. In this study, by observing augmented reality systems and interfaces, a few well-known design principles related to GUIs ('user-centered design') are identified, and through them, a few issues are shown that can be addressed through the design principles. With the help of multiple studies, our study suggests different interface design principles that make designing augmented reality application interfaces easier and more helpful for the maker, as these principles make the interface more interactive, learnable and usable. To validate our findings, Pokémon Go, an augmented reality game, was selected, and all the suggested principles were implemented and tested on its interface. From the results, our study concludes that the identified principles are the most important principles when developing and testing any augmented reality application interface.

Keywords: GUI, augmented reality, metaphors, affordance, perception, satisfaction, cognitive burden

Procedia PDF Downloads 164
2298 Computer-Aided Exudate Diagnosis for the Screening of Diabetic Retinopathy

Authors: Shu-Min Tsao, Chung-Ming Lo, Shao-Chun Chen

Abstract:

Most diabetes patients tend to suffer from the complication of retinal diseases. Therefore, early detection and early treatment are important. In clinical examinations, the color fundus image is the most convenient and available examination method. According to the exudates that appear in the retinal image, the status of the retina can be confirmed. However, the routine screening of diabetic retinopathy from color fundus images is a time-consuming task for physicians. This study thus proposed a computer-aided exudate diagnosis system for the screening of diabetic retinopathy. After removing the vessels and optic disc in the retinal image, six quantitative features, including region number, region area, and gray-scale values, were extracted from the remaining regions for classification. As a result, all six features were evaluated to be statistically significant (p-value < 0.001). The accuracy of classifying the retinal images into normal and diabetic retinopathy reached 82%. Based on this system, the clinical workload could be reduced and the examination procedure made more efficient.
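
A hedged sketch of the quantitative-feature step (region counting and measurement on a binarized fundus image after vessel and optic-disc removal, followed by a simple classifier) could look like the code below; the specific feature subset, training data and logistic-regression classifier are illustrative assumptions, since the paper does not list its classifier:

```python
import numpy as np
from skimage.measure import label, regionprops
from sklearn.linear_model import LogisticRegression

def exudate_features(mask, gray):
    """Quantitative features of candidate exudate regions:
    region number, total area, and mean gray level (a subset of the six used)."""
    regs = regionprops(label(mask), intensity_image=gray)
    n = len(regs)
    area = sum(r.area for r in regs)
    mean_gray = np.mean([r.mean_intensity for r in regs]) if regs else 0.0
    return [n, area, mean_gray]

# Hypothetical training data: feature vectors for normal / retinopathy images
X = np.array([[1, 40, 120], [0, 0, 0], [12, 900, 210], [9, 640, 198]])
y = np.array([0, 0, 1, 1])            # 0 = normal, 1 = diabetic retinopathy
clf = LogisticRegression().fit(X, y)
print(clf.predict([[10, 700, 205]]))  # classify a new fundus image
```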

Keywords: computer-aided diagnosis, diabetic retinopathy, exudate, image processing

Procedia PDF Downloads 261
2297 Some Codes for Variants in Graphs

Authors: Sofia Ait Bouazza

Abstract:

We consider the problem of finding a minimum identifying code in a graph. This problem was introduced in 1998 and has since been fundamentally connected to a wide range of applications (fault diagnosis, location detection, etc.). Suppose we have a building into which we need to place fire alarms, and suppose each alarm is designed so that it can detect any fire that starts either in the room in which it is located or in any room that shares a doorway with that room. We want to use the alarms that are sounding not only to detect any fire but also to tell exactly where the fire is located in the building. For reasons of cost, we want to use as few alarms as necessary. The first problem involves finding a minimum dominating set of a graph. If the alarms are three-state alarms capable of distinguishing between a fire in the same room as the alarm and a fire in an adjacent room, we are trying to find a minimum locating dominating set. If the alarms are two-state alarms that can only sound if there is a fire somewhere nearby, we are looking for a differentiating dominating set of the graph. These three areas are the subject of much active research; we primarily focus on the third problem. An identifying code of a graph G is a dominating set C such that every vertex x of G is distinguished from other vertices by the set of vertices in C that are at distance at most r≥1 from x. When only vertices outside the code are asked to be identified, we get the related concept of a locating dominating set. The problem of finding an identifying code (resp. a locating dominating code) of minimum size is an NP-hard problem, even when the input graph belongs to a number of specific graph classes. Therefore, we study this problem in some restricted classes of undirected graphs, such as split graphs, line graphs, and paths in directed graphs. Then we present some results on identifying codes by giving an exact value of the upper total locating domination and a total 2-identifying code in directed and undirected graphs. Moreover, we determine exact values of the locating dominating code and edge identifying code of the thin headless spider and the locating dominating code of complete suns.
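
As a concrete illustration of the definition for r = 1, the brute-force check below verifies whether a candidate vertex set C is an identifying code of a small undirected graph and searches for a smallest one; the path graph used here is made up for the example and is not taken from the paper:

```python
from itertools import combinations

def is_identifying_code(adj, C):
    """C is an identifying code (r=1) iff every vertex v has a distinct, non-empty
    identifying set N[v] ∩ C, where N[v] is the closed neighborhood of v."""
    ids = {}
    for v in adj:
        closed = set(adj[v]) | {v}
        sig = frozenset(closed & C)
        if not sig or sig in ids.values():
            return False
        ids[v] = sig
    return True

# Path graph P4: 0-1-2-3 (adjacency lists of an undirected graph)
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}

# Smallest identifying code found by exhaustive search
for size in range(1, len(adj) + 1):
    found = next((set(c) for c in combinations(adj, size)
                  if is_identifying_code(adj, set(c))), None)
    if found:
        print("minimum identifying code of P4:", found)
        break
```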

Keywords: identifying codes, locating dominating set, split graphs, thin headless spider

Procedia PDF Downloads 474
2296 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all stakeholders in the educational sector. Efficient data collection, flow, processing, storage and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted ‘Education 2020’, a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor students’ data and the information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools. The paper clarifies the data integrity issues and indicates the challenges that face private schools in the UAE.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), knowledge and human development authority (KHDA), Abu Dhabi education council (ADEC)

Procedia PDF Downloads 217
2295 Analysis of Two-Echelon Supply Chain with Perishable Items under Stochastic Demand

Authors: Saeed Poormoaied

Abstract:

Perishability and developing an intelligent control policy for perishable items are major concerns of marketing managers in a supply chain. In this study, we address a two-echelon supply chain problem for perishable items with a single vendor and a single buyer. The buyer adopts an age-based continuous review policy which takes both the stock level and the aging process of items into account. The vendor works under the warehouse framework, where its lot size is determined with respect to the batch size of the buyer. The model holds for a positive and fixed lead time for the buyer and zero lead time for the vendor. The demand follows a Poisson process and any unmet demand is lost. We provide exact analytic expressions for the operational characteristics of the system by using the renewal reward theorem. Items have a fixed lifetime after which they become unusable and are disposed of from the buyer's system. The age of items starts when they are unpacked and ready for consumption at the buyer. When items are held by the vendor, there is no aging process, which results in no perishing at the vendor's site. The model is developed under the centralized framework, which takes the expected profit of both vendor and buyer into consideration. The goal is to determine the optimal policy parameters under the service level constraint at the retailer's site. A sensitivity analysis is performed to investigate the effect of the key input parameters on the expected profit and order quantity in the supply chain. The efficiency of the proposed age-based policy is also evaluated through a numerical study. Our results show that when the unit perishing cost is negligible, a significant cost saving is achieved.
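
A rough Monte-Carlo counterpart of the policy evaluation (Poisson demand, lost sales, disposal of items older than the fixed lifetime) can be sketched as follows. It is a daily-step simplification of the continuous-review model with made-up parameters, not the renewal-reward analysis itself:

```python
import numpy as np

def simulate_days(days=20_000, lam=4, Q=40, r=10, lifetime=7, lead_time=2, seed=0):
    """Daily-step sketch of an age-based (r, Q) policy with Poisson demand,
    lost sales, and disposal of items older than a fixed lifetime."""
    rng = np.random.default_rng(seed)
    ages, orders = [], []              # ages of on-hand items; (arrival_day, qty) pending
    lost = perished = 0
    for day in range(days):
        ages += [0] * sum(q for d, q in orders if d == day)   # receive due orders
        orders = [(d, q) for d, q in orders if d != day]
        demand = rng.poisson(lam)
        ages.sort(reverse=True)        # oldest items first
        served = min(demand, len(ages))
        lost += demand - served        # unmet demand is lost
        ages = ages[served:]           # issue oldest stock (FIFO)
        ages = [a + 1 for a in ages]   # one day of aging
        perished += sum(a > lifetime for a in ages)
        ages = [a for a in ages if a <= lifetime]
        if len(ages) + sum(q for _, q in orders) <= r:        # reorder-point check
            orders.append((day + lead_time, Q))
    return lost / days, perished / days

print(simulate_days())                 # (avg lost demand per day, avg perished units per day)
```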

Keywords: two-echelon supply chain, perishable items, age-based policy, renewal reward theorem

Procedia PDF Downloads 139
2294 Effective Work Roll Cooling toward Stand Reduction in Hot Strip Process

Authors: Temsiri Sapsaman, Anocha Bhocarattanahkul

Abstract:

The maintenance of work rolls in hot strip processing is a lengthy and difficult task for hot strip manufacturers because the heavy work rolls have to be taken out of the production line, which can take hours. One way to increase the time between maintenance is to improve the effectiveness of the work roll cooling system such that wear and tear occurs more slowly, while the operation cost is kept low. Therefore, this study aims to improve the work roll cooling system by providing the manufacturer with the relationship between the work-roll temperature reduction achieved by cooling and the water flow, which can help the manufacturer determine a more effective water flow for the cooling system. The relationship is found using simulation with a systematic process adjustment so that satisfactory product quality is achieved. Results suggest that the manufacturer could reduce the water flow by 9% with roughly the same performance. With the same process adjustment, the feasibility of finishing-mill-stand reduction is also investigated, and the results suggest that it is possible.

Keywords: work-roll cooling system, hot strip process adjustment, feasibility study, stand reduction

Procedia PDF Downloads 366
2293 [Keynote]: No-Trust-Zone Architecture for Securing Supervisory Control and Data Acquisition

Authors: Michael Okeke, Andrew Blyth

Abstract:

Supervisory Control and Data Acquisition (SCADA) systems, as state-of-the-art Industrial Control Systems (ICS), are used in many different critical infrastructures, from smart homes to energy systems and from locomotive train systems to planes. Security of SCADA systems is vital since many lives depend on them for daily activities, and deviation from normal operation could be disastrous to the environment as well as to lives. This paper describes how a No-Trust-Zone (NTZ) architecture could be incorporated into SCADA systems in order to reduce the chances of malicious intent. The architecture is made up of two distinctive parts. The first part comprises the field devices, such as sensors, PLCs, pumps, and actuators. The second part of the architecture is designed following the lambda architecture, which is made up of a detection algorithm based on Particle Swarm Optimization (PSO) and the Hadoop framework for data processing and storage. Apache Spark will be a part of the lambda architecture for the real-time analysis of packets for anomaly detection.
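
The PSO component of the detection layer can be illustrated with a bare-bones particle swarm optimiser, here minimising a toy objective. The anomaly-scoring objective, the Hadoop/Spark plumbing and the parameter values are outside this sketch and are not specified by the paper:

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimisation core (global-best topology)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))        # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    g = pbest[pbest_val.argmin()].copy()              # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy objective standing in for an anomaly-detection fitness function
best, val = pso(lambda p: np.sum(p ** 2), dim=3)
print(best, val)
```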

Keywords: industrial control system (ICS), no-trust-zone (NTZ), particle swarm optimisation (PSO), supervisory control and data acquisition (SCADA), swarm intelligence (SI)

Procedia PDF Downloads 338
2292 Energy Conversion from Waste Paper Industry Using Fluidized Bed Combustion

Authors: M. Dyah Ayu Yuli, S. Faisal Dhio, P. Johandi, P. Muhammad Sofyan

Abstract:

Pulp and paper mills generate various quantities of energy-rich biomass as waste, depending on the technological level, pulp and paper grades and wood quality. These wastes are produced at all stages of the process: wood preparation, pulp and paper manufacture, chemical recovery, recycled paper processing and waste water treatment. Energy recovery from wastes of different origins has become a generally accepted alternative to their disposal. The pulp and paper industry has expressed an interest in adapting and integrating advanced biomass energy conversion technologies, such as fluidized bed combustion, into its mill operations. Industrial adoption of these new technologies has the potential for higher efficiency, lower capital cost, and safer operation than conventional operations that burn fossil fuels for energy. Incineration with energy recovery has the advantages of hygienic disposal, volume reduction, and the recovery of thermal energy by means of steam or superheated water that can be used for heating and power generation.

Keywords: biomass, fluidized bed combustion, pulp and paper mills, waste

Procedia PDF Downloads 469
2291 Comparative Study on Daily Discharge Estimation of Soolegan River

Authors: Redvan Ghasemlounia, Elham Ansari, Hikmet Kerem Cigizoglu

Abstract:

Hydrological modeling in arid and semi-arid regions is very important. Iran has many regions with such climate conditions, such as Chaharmahal and Bakhtiari province, that need attention and appropriate management. Forecasting hydrological parameters and estimating hydrological events of catchments provide important information that is widely used for the design, management and operation of water resources such as river systems and dams. Discharge in rivers is one of these parameters. This study presents the application and comparison of several estimation methods, namely the Feed-Forward Back Propagation Neural Network (FFBPNN), Multi Linear Regression (MLR), Gene Expression Programming (GEP) and Bayesian Network (BN), to predict the daily flow discharge of the Soolegan River, located in Chaharmahal and Bakhtiari province, Iran. The Soolegan station was considered in this study. This station is located on the Soolegan River at longitude 51° 14′ and latitude 31° 38′ in the North Karoon basin, 2086 meters above sea level. The data used in this study are the daily discharge and daily precipitation at the Soolegan station. The FFBPNN, MLR, GEP and BN models were developed using the same input parameters for estimating Soolegan's daily discharge. The results of the estimation models were compared with observed discharge values to evaluate the performance of the developed models. The results of all methods were compared and are shown in tables and charts.
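
A minimal sketch of the model-comparison idea is given below: a multilayer perceptron standing in for the FFBPNN against multiple linear regression, evaluated by RMSE on synthetic precipitation/discharge data (the GEP and Bayesian network models, and the real Soolegan data, are omitted):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Synthetic daily inputs: precipitation today and the previous day (mm)
X = rng.gamma(2.0, 3.0, size=(2000, 2))
# Synthetic discharge with a mild nonlinearity plus noise (m3/s)
y = 0.8 * X[:, 0] + 0.3 * X[:, 1] ** 1.2 + rng.normal(0, 0.5, 2000)

X_tr, X_te, y_tr, y_te = X[:1500], X[1500:], y[:1500], y[1500:]
models = {
    "MLR": LinearRegression(),
    "FFBPNN-like MLP": MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, m.predict(X_te)) ** 0.5
    print(f"{name}: RMSE = {rmse:.3f}")
```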

Keywords: ANN, multi linear regression, Bayesian network, forecasting, discharge, gene expression programming

Procedia PDF Downloads 556
2290 MHD Non-Newtonian Nanofluid Flow over a Permeable Stretching Sheet with Heat Generation and Velocity Slip

Authors: Rama Bhargava, Mania Goyal

Abstract:

The problem of magnetohydrodynamic boundary layer flow and heat transfer on a permeable stretching surface in a second grade nanofluid under the effect of heat generation and partial slip is studied theoretically. The Brownian motion and thermophoresis effects are also considered. The boundary layer equations, governed by PDEs, are transformed into a set of ODEs with the help of local similarity transformations. The differential equations are solved by the variational finite element method. The effects of different controlling parameters on the flow field and heat transfer characteristics are examined. The numerical results for the dimensionless velocity, temperature and nanoparticle volume fraction, as well as the reduced Nusselt and Sherwood numbers, are presented graphically. The comparison confirmed excellent agreement. The present study is of great interest for applications in coatings and suspensions, cooling of metallic plates, oils and grease, paper production, coal-water or coal-oil slurries, heat exchanger technology, and materials processing.

Keywords: viscoelastic nanofluid, partial slip, stretching sheet, heat generation/absorption, MHD flow, FEM

Procedia PDF Downloads 308
2289 Review of Research on Waste Plastic Modified Asphalt

Authors: Song Xinze, Cai Kejian

Abstract:

To further explore the application of waste plastics in asphalt pavement, this paper begins with the classification and characteristics of waste plastics. It then provides a state-of-the-art review of the preparation methods and processes of waste plastic modifiers, waste plastic-modified asphalt, and waste plastic-modified asphalt mixtures. The paper also analyzes the factors influencing the compatibility between waste plastics and asphalt and summarizes the performance evaluation indicators for waste plastic-modified asphalt and its mixtures. It explores the research approaches and findings of domestic and international scholars and presents examples of waste plastics applications in pavement engineering. The author believes that there is a basic consensus that waste plastics can improve the high-temperature performance of asphalt. The use of cracking processes to solve the storage stability problem of waste plastic polymer-modified asphalt is the key to promoting its application. Additionally, the author anticipates that future research will concentrate on optimizing the recycling, processing, screening, and preparation of waste plastics, along with developing composite plastic modifiers to improve their compatibility and long-term performance in asphalt pavements.

Keywords: waste plastics, asphalt pavement, asphalt performance, asphalt modification

Procedia PDF Downloads 31
2288 Coordinated Community Response to Intimate Partner Violence on College Campuses

Authors: Robert D. Hanser, Gina M. Hanser

Abstract:

This paper provides an overview of Coordinated Community Response Teams (CCRT) for Intimate Partner Violence (IPV). The CCRT, as a partnership and collaborative effort between multiple agencies, is highlighted. This paper is a legal analysis showcasing how new legislation and legal requirements in the United States for investigating, processing, and reporting acts of victimization have transformed the role of the university’s CCRT on campus, making its mission all the more important, both internal and external to the campus. As a specific example, the CCRT in Northeast Louisiana at the University of Louisiana at Monroe is discussed as an example of involvement in this initiative, where federal grant funding has allowed a micro version of the region’s CCRT to be implemented on that campus. Simultaneously, university personnel also work with external agencies throughout the community on intimate partner violence response. Amid this, the result is a genuine partnership between practitioners and researchers who work together to provide public awareness, prevention, first-responder, and intervention services in a comprehensive manner throughout Northeast Louisiana.

Keywords: interpersonal violence, sexual assault, dating violence, campus violence

Procedia PDF Downloads 303
2287 A Data-Driven Agent Based Model for the Italian Economy

Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio

Abstract:

We develop a data-driven agent based model (ABM) for the Italian economy. We calibrate the model for the initial condition and parameters. As a preliminary step, we replicate the Monte-Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of disequilibrium patterns arising in the search and matching process for final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial condition approach. We perform a robustness analysis perturbing the system for different parameter setups. We explore the empirical properties of the model using a rolling window forecast exercise from 2010 to 2022 to observe the model’s forecasting ability in the wake of the COVID-19 pandemic. We perform an analysis of the properties of the model with a different number of agents, that is, with different scales of the model compared to the real economy. The model generally displays transient dynamics that properly fit macroeconomic data regarding forecasting ability. We stress the model with a large set of shocks, namely interest policy, fiscal policy, and exogenous factors, such as external foreign demand for export. In this way, we can explore the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence to stress the economy and observe the long-run projections. In this way, we can include in the model the generation of endogenous crisis due to the implied structural change, technological unemployment, and potential lack of aggregate demand creating the condition for cyclical endogenous crises reproduced in this artificial economy.

Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data

Procedia PDF Downloads 66
2286 Study on Novel Reburning Process for NOx Reduction by Oscillating Injection of Reburn Fuel

Authors: Changyeop Lee, Sewon Kim, Jongho Lee

Abstract:

Reburning technology has been developed for adoption in various commercial combustion systems. Fuel lean reburning is an advanced reburning method that reduces NOx economically without using burnout air; however, it is not easy to achieve a high NOx reduction efficiency. In the fuel lean reburning system, localized fuel rich eddies are used to establish partial fuel rich regions so that the NOx can react with hydrocarbon radicals within these regions. In this paper, a new advanced reburning method which supplies the reburn fuel with an oscillatory motion is introduced to increase the NOx reduction rate effectively. To clarify whether forced oscillating injection of reburn fuel can effectively reduce NOx emission, experimental tests were conducted in a vertical combustion furnace. Experiments were performed in flames stabilized by a gas burner, which was mounted at the bottom of the furnace. Natural gas is used as both the main and reburn fuel, and the total thermal input is about 40 kW. The forced oscillating injection of reburn fuel is realized by an electronic solenoid valve, so that fuel rich and fuel lean regions are established alternately. In the fuel rich region, NOx is converted to N2 by the reburning reaction, while unburned hydrocarbons and CO are oxidized in the fuel lean zone and in the mixing zone downstream, where a slightly fuel lean region is formed by the mixing of the two regions. This paper reports data on flue gas emissions and temperature distribution in the furnace for a wide range of experimental conditions. All experimental data were measured at steady state. The NOx reduction rate increases up to 41% with the forced oscillating reburn motion, while CO emissions were kept at a very low level. This paper also makes clear that, in order to decrease the NOx concentration in the exhaust when the oscillating reburn fuel injection system is adopted, the control of factors such as frequency and duty ratio is very important.

Keywords: NOx, CO, reburning, pollutant

Procedia PDF Downloads 287
2285 Operating System Support for Mobile Device Thermal Management and Performance Optimization in Augmented Reality Applications

Authors: Yasith Mindula Saipath Wickramasinghe

Abstract:

Augmented reality applications require high processing power to load, render and live-stream high-definition AR models and virtual scenes; they also require device sensors to work intensively to coordinate with the internal hardware and OS and to deliver the expected outcome in advanced features like object detection, real-time tracking, and voice and text recognition. Excessive heat generation due to these advanced functionalities has become a major research problem, as it is difficult for smaller mobile devices to manage such heat increases and battery drainage, which cause physical harm to the devices in the long term. Therefore, effective thermal management is one of the major requirements in augmented reality application development. This paper discusses the major causes of this issue and provides possible solutions in the form of operating system adaptations, as well as further research on best coding practices to optimize application performance and reduce excessive thermal generation.

Keywords: augmented reality, device thermal management, GPU, operating systems, device I/O, overheating

Procedia PDF Downloads 114
2284 Instructional Consequences of the Transiency of Spoken Words

Authors: Slava Kalyuga, Sujanya Sombatteera

Abstract:

In multimedia learning, written text is often transformed into spoken (narrated) text. This transient information may overwhelm limited processing capacity of working memory and inhibit learning instead of improving it. The paper reviews recent empirical studies in modality and verbal redundancy effects within a cognitive load framework and outlines conditions under which negative effects of transiency may occur. According to the modality effect, textual information accompanying pictures should be presented in an auditory rather than visual form in order to engage two available channels of working memory – auditory and visual - instead of only one of them. However, some studies failed to replicate the modality effect and found differences opposite to those expected. Also, according to the multimedia redundancy effect, the same information should not be presented simultaneously in different modalities to avoid unnecessary cognitive load imposed by the integration of redundant sources of information. However, a few studies failed to replicate the multimedia redundancy effect too. Transiency of information is used to explain these controversial results.

Keywords: cognitive load, transient information, modality effect, verbal redundancy effect

Procedia PDF Downloads 375
2283 Improvement of Brain Tumors Detection Using Markers and Boundaries Transform

Authors: Yousif Mohamed Y. Abdallah, Mommen A. Alkhir, Amel S. Algaddal

Abstract:

This was an experimental study conducted to investigate the segmentation of the brain in MRI images using edge detection and morphology filters. Each brain MRI film was scanned using a digitizer scanner and then processed with an image processing program (MATLAB), in which the segmentation was studied. The scanned image was saved in a TIFF file format to preserve the quality of the image. Brain tissue can be easily detected in an MRI image if the object has sufficient contrast from the background. We use edge detection and basic morphology tools to detect the brain. The steps of segmenting the MRI images using detection and morphology filters were: image reading, detecting the entire brain, dilating the image, filling interior gaps inside the image, removing connected objects on the borders, and smoothing the object (the brain). The results of this study showed that an alternative method for displaying the segmented object is to place an outline around the segmented brain. These filter approaches can help remove unwanted background information and increase the diagnostic information of brain MRI.
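
The segmentation steps listed above map directly onto standard morphology operations; a hedged Python/scikit-image equivalent of the MATLAB workflow is sketched below. The thresholds and structuring-element sizes are illustrative, not the study's parameters, and the function is not called here because it needs an input file:

```python
from skimage import io, filters, morphology
from skimage.segmentation import clear_border
from scipy import ndimage as ndi

def segment_brain(path):
    img = io.imread(path, as_gray=True)                            # 1) image reading
    edges = filters.sobel(img)                                     # 2) detect the brain via edges
    mask = edges > filters.threshold_otsu(edges)
    mask = morphology.binary_dilation(mask, morphology.disk(2))    # 3) dilation of the image
    mask = ndi.binary_fill_holes(mask)                             # 4) fill interior gaps
    mask = clear_border(mask)                                      # 5) remove objects on borders
    mask = morphology.binary_opening(mask, morphology.disk(3))     # 6) smooth the object (brain)
    return mask

# Outlining the segmented brain (the display option noted in the results)
# can then be done by plotting the mask boundary over the original slice.
```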

Keywords: improvement, brain, matlab, markers, boundaries

Procedia PDF Downloads 511
2282 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach

Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar

Abstract:

Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; early diagnosis, detection and prediction reduce the need for treatment options that carry the risk of invasive surgery and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) for image enhancement gives the best results without requiring further opinions. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid and eccentricity) are taken for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used for determining the patient condition as normal or abnormal, while an Artificial Neural Network (ANN) is used for identifying the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technology shows encouraging results for real-time information and online detection, with scope for future research.
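
The texture-plus-KNN stage can be sketched as follows: GLCM texture features computed from a segmented ROI are fed to a K-Nearest Neighbor classifier. The training patches below are invented placeholders, and the DWT feature-extraction and ANN staging steps are omitted:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier

def glcm_features(roi_uint8):
    """Texture features of a segmented ROI via the gray-level co-occurrence matrix."""
    glcm = graycomatrix(roi_uint8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy", "correlation")]

rng = np.random.default_rng(0)
# Hypothetical ROIs: smoother patches labelled normal (0), noisier patches abnormal (1)
rois = [rng.integers(100, 120, (32, 32), dtype=np.uint8) for _ in range(5)] + \
       [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(5)]
labels = [0] * 5 + [1] * 5

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit([glcm_features(r) for r in rois], labels)
test = rng.integers(0, 256, (32, 32), dtype=np.uint8)
print("predicted condition:", knn.predict([glcm_features(test)])[0])  # 0=normal, 1=abnormal
```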

Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI

Procedia PDF Downloads 148
2281 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods

Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun

Abstract:

Traditionally, the three important manufacturing functions of process planning, scheduling and due-date assignment are performed separately and sequentially. Over the past couple of decades, hundreds of studies have been conducted on integrated process planning and scheduling problems, and numerous studies have been performed on scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three important functions is studied by using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid and random search techniques. As well, the importance of integrating these three functions and the power of meta-heuristics and of hybrid heuristics are studied.
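
To make the integration concrete, the sketch below applies simulated annealing to a toy single-machine version in which the job sequence is optimised while due dates are assigned as a weighted (number of operations + processing time) allowance, as the title suggests. The job data, weights, neighbourhood move and cost terms are illustrative, not the paper's exact formulation:

```python
import math, random

jobs = [  # (processing_time, n_operations, weight) - illustrative data
    (5, 2, 1.0), (3, 4, 2.0), (8, 1, 1.5), (6, 3, 1.0), (4, 2, 2.5),
]
K_OP, K_PT = 2.0, 1.5      # due date = K_OP*n_operations + K_PT*processing_time

def cost(seq):
    """Weighted deviation from the assigned due dates for a given job sequence."""
    t, total = 0, 0.0
    for j in seq:
        p, n_ops, w = jobs[j]
        t += p
        due = K_OP * n_ops + K_PT * p
        total += w * abs(t - due)            # weighted earliness/tardiness
    return total

def anneal(iters=5000, T0=50.0, alpha=0.999, seed=0):
    random.seed(seed)
    seq = list(range(len(jobs)))
    best, best_c, c, T = seq[:], cost(seq), cost(seq), T0
    for _ in range(iters):
        i, k = random.sample(range(len(seq)), 2)
        cand = seq[:]
        cand[i], cand[k] = cand[k], cand[i]   # swap-neighbourhood move
        cc = cost(cand)
        if cc < c or random.random() < math.exp((c - cc) / T):
            seq, c = cand, cc
            if c < best_c:
                best, best_c = seq[:], c
        T *= alpha
    return best, best_c

print(anneal())   # (best job sequence, its weighted due-date deviation cost)
```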

Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics

Procedia PDF Downloads 465
2280 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes and atmospheric effects that affect people's lives and properties. Generally, the severity of a fire is calculated using the Normalized Burn Ratio (NBR) index. This is performed manually by comparing pre-fire and post-fire images. Then, using the bitemporal difference of the preprocessed satellite images, the dNBR is calculated. The burnt area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using classification levels proposed by the USGS and comprises seven classes. This procedure generates a burn severity report for the area chosen manually by the user. This study is carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI is chosen for regular burnt-area severity mapping using a medium spatial resolution sensor (15 m). This tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool, which features a Graphical User Interface (GUI) to make it user-friendly. The advantage of this tool is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of this tool. The Blue Mountains national park forest affected by the Australian fire season between 2019 and 2020 is used to describe the workflow of the WWSAT. At this site, more than 7809 km2 of burnt area was detected using Sentinel-2 data, giving an error below 6.5% when compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forest, Colorado, USA, which was affected by the Cameron Peak fire in 2020, is chosen for the second case study. It was found that around 983 km2 had burned out, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These areas can also be detected through the visual inspection made possible by the cloud-free images generated by WWSAT. This tool is cost-effective in calculating the burnt area since satellite images are free and the cost of field surveys is avoided.
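
The core dNBR computation can be sketched with the Earth Engine Python API as shown below. The area of interest, date ranges and cloud filter are placeholders, the severity thresholds follow commonly cited USGS dNBR ranges rather than the tool's exact class breaks, and the cloud-free compositing and GUI of WWSAT are omitted:

```python
import ee
ee.Initialize()

aoi = ee.Geometry.Rectangle([150.2, -33.8, 150.6, -33.4])   # placeholder extent

def nbr_composite(start, end):
    """Median Sentinel-2 composite and its Normalized Burn Ratio (NIR vs SWIR)."""
    col = (ee.ImageCollection('COPERNICUS/S2_SR')
           .filterBounds(aoi)
           .filterDate(start, end)
           .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20)))
    return col.median().normalizedDifference(['B8', 'B12']).rename('NBR')

pre_fire  = nbr_composite('2019-10-01', '2019-11-01')
post_fire = nbr_composite('2020-02-01', '2020-03-01')
dnbr = pre_fire.subtract(post_fire).rename('dNBR')

# Burnt vs unburnt using the 0.1 threshold, then USGS-style severity classes
burnt = dnbr.gte(0.1)
severity = (dnbr.where(dnbr.lt(0.1), 0)        # unburnt
                 .where(dnbr.gte(0.1), 1)      # low severity
                 .where(dnbr.gte(0.27), 2)     # moderate-low severity
                 .where(dnbr.gte(0.44), 3)     # moderate-high severity
                 .where(dnbr.gte(0.66), 4))    # high severity
```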

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 228