Search results for: data driven decision making
26387 Agricultural Land Suitability Analysis of Kampe-Omi Irrigation Scheme Using Remote Sensing and Geographic Information System
Authors: Olalekan Sunday Alabi, Titus Adeyemi Alonge, Olumuyiwa Idowu Ojo
Abstract:
Agricultural land suitability analysis and mapping play an important role in the sustainable utilization of scarce physical land resources. The objectives of this study were to prepare a spatial database of physical land resources for irrigated agriculture, to assess land suitability for irrigation, and to develop a suitability map of the study area. The study was conducted at the Kampe-Omi irrigation scheme, located in the Yagba West Local Government Area of Kogi State, Nigeria. Temperature and rainfall data for the study area were collected for 10 consecutive years (2005-2014). Geographic Information System (GIS) techniques were used to develop an irrigation land suitability map of the study area. Attribute parameters such as slope, soil properties, and topography were used for the analysis. The available data were arranged and a proximity analysis was performed in ArcGIS, which resulted in five mapping units. The final agricultural land suitability map of the study area was derived after overlay analysis. Based on soil composition, slope, soil properties, and topography, it was concluded that Kampe-Omi has a rich sandy loam soil, viable for agricultural purposes; the soil composition is 60% sand and 40% loam. The land-use pattern map of Kampe-Omi has vegetal areas and water bodies covering 55.6% and 19.3% of the total assessed area, respectively. The landform of Kampe-Omi comprises 41.2% lowlands, 37.5% normal lands, and 21.3% highlands. Kampe-Omi is adequately suitable for agriculture, with a further 20.2% of the area highly suitable (72.6% in total), while 18.7% of the area is only slightly suitable.
Keywords: remote sensing, GIS, Kampe-Omi, land suitability, mapping
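The overlay analysis described above can be sketched as a weighted overlay of ranked factor rasters. The rankings, weights, and class thresholds below are illustrative assumptions, not the study's actual values:

```python
import numpy as np

# Hypothetical ranked factor rasters (1 = least, 5 = most suitable);
# the abstract names the factors but not the ranking scheme or weights.
slope = np.array([[5, 4], [3, 2]])
soil = np.array([[4, 4], [5, 3]])
topography = np.array([[3, 5], [4, 2]])

# Weighted overlay: combine the factors into one suitability score per pixel.
weights = {"slope": 0.4, "soil": 0.35, "topography": 0.25}
suitability = (weights["slope"] * slope
               + weights["soil"] * soil
               + weights["topography"] * topography)

# Classify the continuous score into suitability classes
# (0 = not suitable .. 3 = highly suitable; thresholds are illustrative).
classes = np.digitize(suitability, bins=[2.0, 3.0, 4.0])
print(suitability)
print(classes)
```

In a GIS workflow the same per-pixel arithmetic runs over full rasters; the toy 2x2 arrays only make the combination step visible.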
Procedia PDF Downloads 220
26386 Cultural Tourism, The Gateway to Socioeconomic Development in Nigeria: Case Study on Osun State Nigeria
Authors: Osinubi Olufemi Bankole
Abstract:
Cultural tourism is an industry committed to making a low impact on the environment and on local culture, festivals, etc., while helping to generate income and employment opportunities for local people. Tourists who engage in cultural tourism are sensitive to cultural beliefs and norms that are gradually going extinct, and to the rich cultural resources that abound in Nigeria. The paper focuses on culture as the unique way of life of a particular people that differentiates them from their neighbors. It examined the socioeconomic roles of cultural tourism in the development of Nigeria, using Osun State as a case study. The data collected were analyzed using the simple percentage method; results show that 35 respondents, representing 87.5%, agreed that cultural tourism has a significant role to play in the socioeconomic development of Nigeria. The study concluded that cultural tourism is an important part of the nation's economic sector that should be given adequate consideration for economic sustainability. The researcher recommended that the nation's cultural resources hold various investment opportunities if well developed and maintained.
Keywords: culture, development, industry, tourism
Procedia PDF Downloads 381
26385 Possibilities to Evaluate the Climatic and Meteorological Potential for Viticulture in Poland: The Case Study of the Jagiellonian University Vineyard
Authors: Oskar Sekowski
Abstract:
Current global warming is causing changes in the traditional zones of viticulture worldwide. During the 20th century, the average global air temperature increased by 0.89˚C. Climate change models indicate that viticulture, currently concentrated in narrow geographic niches, may move towards the poles, to higher geographic latitudes. There is therefore a need to estimate the climatic conditions and climate change in areas that are not traditionally associated with viticulture, e.g., Poland. The primary objective of this paper is to prepare a methodology to evaluate the climatic and meteorological potential for viticulture in Poland, based on a case study. An additional aim is to evaluate the climatic potential of the mesoregion where a university vineyard is located. Daily data on temperature, precipitation, insolation, and wind speed (1988-2018) from the meteorological station located in Łazy, southern Poland, were used to evaluate 15 climatological parameters and indices connected with viticulture. The next steps of the methodology are based on Geographic Information System methods. Topographical factors such as slope gradient and slope exposure were derived from Digital Elevation Models. The spatial distribution of climatological elements was interpolated by ordinary kriging. The values of each factor and index were also ranked and classified. The viticultural potential was determined by integrating two suitability maps, i.e., the topographical and the climatic one, and calculating the average for each pixel. Data analysis shows significant changes in heat accumulation indices, driven by increases in maximum temperature, mostly an increasing number of days with Tmax > 30˚C. The climatic conditions of this mesoregion are sufficient for Vitis vinifera viticulture. The values of the indicators and of insolation are similar to those in known wine regions located at similar geographical latitudes in Europe. The smallest threat to viticulture in the study area is the occurrence of hail; the greatest is the occurrence of frost in winter. This research provides a basis for evaluating the general suitability and climatological potential for viticulture in Poland. To characterize the climatic potential for viticulture, it is necessary to assess the suitability of all climatological and topographical factors that can influence viticulture. The methodology used in this case study shows places where vineyards could be created. It may also help wine-makers select grape varieties.
Keywords: climatologic potential, climatic classification, Poland, viticulture
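One of the heat-accumulation indices commonly used for viticulture is growing degree days (base 10 °C). The abstract does not say which of its 15 indices were computed, so this is an illustrative sketch with made-up daily temperatures:

```python
# Growing degree days (GDD, base 10 degrees C): the sum of daily mean
# temperature excess over the base temperature across the growing season.
def growing_degree_days(daily_mean_temps, base=10.0):
    """Sum of positive (T_mean - base) contributions."""
    return sum(max(t - base, 0.0) for t in daily_mean_temps)

# Hypothetical daily mean temperatures for a short spell (degrees C).
temps = [8.0, 12.5, 15.0, 9.5, 18.0]
print(growing_degree_days(temps))  # 2.5 + 5.0 + 8.0 = 15.5
```

Summed over April-October and interpolated spatially (e.g. by the kriging step the abstract describes), such an index becomes one layer of the climatic suitability map.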
Procedia PDF Downloads 109
26384 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experiments with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While the cost is, at its core, based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well these data representations reduce the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
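A common baseline matching cost of the kind the paper feeds with different image representations is the window-based sum of absolute differences (SAD). The sketch below is a minimal illustration on a synthetic grayscale pair, not the paper's implementation:

```python
import numpy as np

# Sum-of-absolute-differences (SAD) cost: compare a window around a pixel
# in the left image with the disparity-shifted window in the right image.
def sad_cost(left, right, row, col, disparity, half_window=1):
    """SAD over a (2*half_window+1)^2 window at the given disparity."""
    lw = left[row - half_window:row + half_window + 1,
              col - half_window:col + half_window + 1]
    rw = right[row - half_window:row + half_window + 1,
               col - disparity - half_window:col - disparity + half_window + 1]
    return int(np.abs(lw.astype(int) - rw.astype(int)).sum())

# Tiny synthetic pair: the right image is the left shifted by one pixel,
# so the true disparity everywhere is 1.
left = np.array([[10, 20, 30, 40],
                 [10, 20, 30, 40],
                 [10, 20, 30, 40]], dtype=np.uint8)
right = np.roll(left, -1, axis=1)

print(sad_cost(left, right, 1, 2, 1))  # correct disparity: low cost
print(sad_cost(left, right, 1, 2, 0))  # wrong disparity: higher cost
```

Comparing the cost at the true disparity against the cost at competing disparities, across representations (RGB, grayscale, other colour spaces), is exactly the kind of discriminability question the paper evaluates.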
Procedia PDF Downloads 374
26383 A Study on Accident Result Contribution of Individual Major Variables Using Multi-Body System of Accident Reconstruction Program
Authors: Donghun Jeong, Somyoung Shin, Yeoil Yun
Abstract:
A large-scale traffic accident refers to an accident in which more than three people die or more than thirty people are killed or injured. In order to prevent a large-scale traffic accident from causing a great loss of life, and to establish effective improvement measures, it is important to analyze accident situations in depth and understand the effects of major accident variables. This study analyzes the contribution of individual accident variables to accident results, based on accurate reconstruction of traffic accidents using the Multi-Body (MB) system of PC-Crash, an accident reconstruction program, and simulation of each scenario. The MB system enables multi-body accident reconstruction showing motions in diverse directions that were not captured by previous approaches: it designs and reproduces a body form that exhibits realistic motions using several linked bodies. Targeting the freight truck cargo drop accident near the Changwon Tunnel in November 2017, this study simulated the accident and analyzed the contribution of the individual major accident variables. Six scenarios were devised on the basis of driving speed, cargo load, and stacking method. The simulation showed that the freight truck was driven at 118 km/h (speed limit: 70 km/h) right before the accident, carried 196 oil containers weighing 7,880 kg (maximum load: 4,600 kg), and was not fully equipped with anchoring equipment that could prevent a drop of cargo. The vehicle speed, cargo load, and cargo anchoring equipment were thus the major accident variables, and the contribution analysis results for the individual variables are as follows. When the truck obeyed only the speed limit, the scattering distance of the oil containers decreased by 15%, and the number of dropped oil containers decreased by 39%. When the truck obeyed only the cargo load limit, the scattering distance decreased by 5%, and the number of dropped containers decreased by 34%. When the truck obeyed both the speed limit and the cargo load limit, the scattering distance fell by 38%, and the number of dropped containers fell by 64%. The scenario analysis revealed that the truck's overspeed and excessive cargo load contributed to the dispersion of accident damage; even a truck whose cargo could not fall produced a different type of accident when driven too fast with an excessive load, and the truck that obeyed both the speed limit and the cargo load limit had the lowest likelihood of causing an accident.
Keywords: accident reconstruction, large-scale traffic accident, PC-Crash, MB system
Procedia PDF Downloads 204
26382 Evidence of Behavioural Thermoregulation by Dugongs (Dugong dugon) at the High Latitude Limit to Their Range in Eastern Australia
Authors: Daniel R. Zeh, Michelle R. Heupel, Mark Hamann, Rhondda Jones, Colin J. Limpus, Helene Marsh
Abstract:
Marine mammals live in an environment with water temperatures nearly always lower than the mammalian core body temperature of 35-38°C. Marine mammals can lose heat at high rates and have evolved a range of adaptations to minimise heat loss. Our project tracked dugongs to examine whether there was a discoverable relationship between the animals' movements and the temperature of their environment that might suggest behavioural thermoregulation. Twenty-nine dugongs were fitted with acoustic and satellite/GPS transmitters on 30 occasions (one animal was tagged twice) in 2012, 2013 and 2014 in Moreton Bay, Queensland, at the high-latitude limit of the species' winter range in eastern Australia. All 22 animals that stayed in the area and had functional transmitters made at least one (and up to 66) return trips to the warmer oceanic waters outside the bay, where seagrass is unavailable. Individual dugongs went in and out of the bay in synchrony with the tides and typically spent about six hours in oceanic water. There was a diel pattern in the movements: 85% of outgoing trips occurred between midnight and noon. There were significant individual differences, but the likelihood of a dugong leaving the bay was independent of body length and sex. In Quarter 2 (April-June), the odds of a dugong making a trip increased by about 40% for each 1°C increase in the temperature difference between the bay and the warmer adjacent oceanic waters. In Quarter 3, the odds of making a trip were lower when the outside-inside temperature differences were small or negative, but increased by a factor of up to 2.12 for each 1°C difference in outside-inside temperatures. In Quarter 4, the odds of making a trip were higher when it was cooler outside the bay, and decreased by a factor of nearly 0.5 for each 1°C difference in outside-inside bay temperatures. The activity spaces of the dugongs generally declined as winter progressed, suggesting a change in the cost-effectiveness of moving outside the bay. Our analysis suggests that dugongs can regulate their core temperature behaviourally, by moving to water with more favourable temperatures.
Keywords: acoustic, behavioral thermoregulation, dugongs, movements, satellite, telemetry, quick fix GPS
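The odds changes reported above are per 1°C of outside-inside temperature difference. Assuming they come from a logistic-style model (the abstract does not name the model), per-degree odds ratios compound multiplicatively with the temperature difference, as this small sketch illustrates:

```python
# Per-degree odds ratios compound multiplicatively: a ratio r per 1 degree C
# implies r ** d for a difference of d degrees C. This is an assumption
# about the underlying (likely logistic) model, not a stated result.
def compounded_odds_ratio(or_per_degree, delta_celsius):
    """Odds multiplier for a temperature difference of delta_celsius."""
    return or_per_degree ** delta_celsius

# Quarter 2: odds of leaving the bay rise ~40% per 1 degree C (ratio 1.4),
# so a 3 degree C outside-inside difference multiplies the odds ~2.74x.
print(round(compounded_odds_ratio(1.4, 3), 3))
```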
Procedia PDF Downloads 176
26381 Active Abdominal Compression Device for Treatment of Orthostatic Hypotension
Authors: Vishnu Emani, Andreas Escher, Ellen Roche
Abstract:
Background: Orthostatic hypotension (OH) is an autonomic disorder marked by a sudden drop in blood pressure upon standing, resulting from autonomic dysfunction. OH is especially prevalent in elderly populations, affecting more than 30% of Americans over the age of 70, and is one of the most significant risk factors for accidental falls in the elderly, making it a crucial focus for medical and device therapies. Pharmacologic therapy with midodrine and fludrocortisone may alleviate hypotension but has significant adverse side effects. Passive abdominal compression devices (binders) are more effective than lower-extremity compression stockings at mitigating postural hypotension, by improving venous return to the heart. However, abdominal binders are difficult to don and uncomfortable to wear, leading to poor compliance, and a disadvantage of passive compression is its inability to compress selectively during the crucial moment of standing. We have recently developed an active compression device that applies external pressure to the abdomen during the transition from a supine to an upright position, and conducted initial prototype testing. Methods: The active abdominal compression device uses a simple, servo-driven strap-tightening mechanism to apply tension to a foam fabric, which in turn applies pressure to the abdomen. Healthy volunteers (n=5) were recruited for prototype testing and subjected to three conditions: no compression, passive compression (a standard abdominal binder), and active compression (the device prototype). The abdominal pressure applied during device activation was measured by a strain-gauge manometer placed between the skin and the binder. Systolic (SBP) and mean (MAP) arterial blood pressure were measured with a standard blood pressure cuff in the supine position, followed by repeat measurements at 1-minute intervals for 5 minutes in the upright position. A survey tool was administered to score (1-10) the comfort and ease of donning of the devices. Results: Abdominal pressure increased from 0 to 15±3 mmHg upon activation for both the passive and the active compression device. During the transition from supine to upright, both active and passive compression demonstrated significantly higher MAP than the no-compression condition (67±4, 68±5, and 62±5 mmHg respectively, P<0.05), but there was no statistically significant difference in SBP or MAP between active and passive compression. Active compression had significantly higher comfort scores (8.3±1) than passive compression (3.2±2), though lower than no compression (10). Subjects universally reported that the active compression device was easier to don than the passive device. Conclusions: Active or passive abdominal compression prevents the hypotension associated with postural changes. Active compression is associated with greater comfort and ease of donning than passive compression devices. Future trials are warranted to investigate the efficacy of our device in patients with OH.
Keywords: orthostatic hypotension, compression binder, abdominal binder, active abdominal compression
Procedia PDF Downloads 34
26380 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of high volumes and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. The resulting decentralized data management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge; the aim of such techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business-intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 436
26379 The Impact of Quality Cost on Revenue Sharing in Supply Chain Management
Authors: Fayza M. Obied-Allah
Abstract:
Meeting customers' needs, quality, and value creation while reducing costs through supply chain management presents challenges and opportunities for companies and researchers, and modern ideas must help counter these challenges and exploit the opportunities. This paper aims to be one such contribution. It discusses the impact of quality cost on revenue sharing, one of the most important incentives for configuring business networks. Costs directly affect the income generated by a business network, so this paper investigates the impact of quality costs on business network revenue and on the decision to share that revenue among the companies in the supply chain. The paper develops the quality cost approach to align with the modern era: the developed model comprises five categories, the four well-known ones (prevention costs, appraisal costs, internal failure costs, and external failure costs) plus a new category developed in this research as a new vision of the relationship between quality costs and industrial innovation, Recycle Cost. The paper is organized into six sections. Section I gives an overview of quality costs in the supply chain. Section II discusses revenue sharing between the parties in the supply chain. Section III investigates the impact of quality costs on the revenue sharing decision between partners in the supply chain. Section IV presents a survey study and its statistical results. Section V discusses the results and shows future opportunities for research. Finally, Section VI summarizes the theoretical and practical results of the paper.
Keywords: quality cost, recycle cost, revenue sharing, supply chain management
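The extended cost-of-quality model described above can be sketched as a five-category sum per supply-chain partner. The cost figures below are hypothetical; the paper defines the categories, not these values:

```python
# The four classic PAF quality-cost categories plus the paper's proposed
# fifth category, Recycle Cost. Values are illustrative, not from the paper.
QUALITY_COST_CATEGORIES = (
    "prevention", "appraisal", "internal_failure",
    "external_failure", "recycle",
)

def total_quality_cost(costs):
    """Sum the five quality-cost categories for one supply-chain partner."""
    return sum(costs[category] for category in QUALITY_COST_CATEGORIES)

partner = {
    "prevention": 120.0, "appraisal": 80.0, "internal_failure": 45.0,
    "external_failure": 30.0, "recycle": 25.0,
}
print(total_quality_cost(partner))  # 300.0
```

Comparing these per-partner totals is the kind of input a revenue-sharing rule across the network would consume.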
Procedia PDF Downloads 451
26378 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations with timing and noise analysis tools. This paper presents our work using data-mining techniques, combined with HTML tables, to extract and represent critical timing and noise data. When this data-mining tool is applied in real applications, running speed is important; based on performance testing results, the software employs table look-up techniques in its implementation to achieve reasonable running speed. We added several advanced features for the application in one industrial chip design.
Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
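The extract-and-look-up idea can be sketched as follows. The row format and path names are hypothetical, and a regex pass is only adequate for such simple, machine-generated tables; the paper's actual parser and report schema are not described in the abstract:

```python
import re

# A toy HTML timing table of (path name, slack in ns); the format is
# hypothetical, standing in for a tool-generated timing report.
html = """
<table>
<tr><td>clk_path_1</td><td>-0.12</td></tr>
<tr><td>clk_path_2</td><td>0.45</td></tr>
<tr><td>clk_path_3</td><td>-0.30</td></tr>
</table>
"""

rows = re.findall(r"<tr><td>(\w+)</td><td>(-?\d+\.\d+)</td></tr>", html)

# Table look-up: a dict keeps per-path queries O(1) on average, which is
# the kind of choice that matters for running speed on large reports.
slack = {path: float(value) for path, value in rows}
worst = min(slack, key=slack.get)
print(worst, slack[worst])  # the most critical (most negative slack) path
```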
Procedia PDF Downloads 255
26377 Changing Pattern and Trend of Head of Household in India: Evidence from Various Rounds of National Family Health Survey
Authors: Moslem Hossain, Mukesh Kumar, K. C. Das
Abstract:
Background: The household head is the crucial decision-maker as well as the economic provider of the household. In Indian society, household headship has been held by men since the pre-colonial period. This study attempts to examine changes in household headship in India. Methods: The study used univariate and multivariate analysis to examine the trends and patterns of different characteristics of household heads, using data from the various rounds of the National Family Health Survey. Results: The share of female household heads is gradually increasing, while male dominance is decreasing, across the four National Family Health Surveys. The mean age of household heads is higher in rural areas than in urban India. Only ten percent of household heads have higher education, and 83 percent of male household heads have a low standard of living. The mean family size of the household shows a decreasing trend in both urban and rural areas during the study period. Conclusions: The results indicate that women's autonomy is increasing and leading to inclusive growth, a goal introduced in the Eleventh Five Year Plan, which focuses especially on women and young people in the country.
Keywords: household head, national family health survey, mean age, mean family size
Procedia PDF Downloads 138
26376 Physical Model Testing of Storm-Driven Wave Impact Loads and Scour at a Beach Seawall
Authors: Sylvain Perrin, Thomas Saillour
Abstract:
The Grande-Motte port and seafront development project on the French Mediterranean coastline entailed evaluating wave impact loads (pressures and forces) on the new beach seawall and comparing the resulting scour potential at the base of the existing and the new seawall. A physical model was built at ARTELIA's hydraulics laboratory in Grenoble (France) to provide insight into the evolution of scour over time at the front of the wall, the intensity and distribution of quasi-static and impulsive wave forces on the wall, and water and sand overtopping discharges over the wall. The beach consisted of fine sand and was approximately 50 m wide above mean sea level (MSL). Seabed slopes ranged from 0.5% offshore to 1.5% closer to the beach. A smooth concrete structure with an elevated curved crown wall will replace the existing concrete seawall. Prior to the start of breaking (at the -7 m MSL contour), storm-driven maximum spectral significant wave heights of 2.8 m and 3.2 m were estimated for the benchmark historical storm event of 1997 and the 50-year return period storm respectively, resulting in 1 m high waves at the beach. For the wave load assessment, a tensor scale measured wave forces and moments, and five piezo / piezo-resistive pressure sensors were placed on the wall. The light-weight sediment physical model and the pressure and force measurements were performed at a scale of 1:18. The polyvinyl chloride light-weight particles used to model the prototype silty sand had a density of approximately 1,400 kg/m3 and a median diameter (d50) of 0.3 mm. Quantitative assessments of seabed evolution were made using a measuring rod and a laser scan survey. Testing demonstrated numerous impulsive wave impacts on the reflector (22%), induced not by direct wave breaking but mostly by wave run-up slamming on the top curved part of the wall. Wave forces of up to 264 kilonewtons and impulsive pressure spikes of up to 127 kilonewtons were measured. Maximum scour of -0.9 m was measured for the new seawall versus -0.6 m for the existing seawall, attributable to increased wave reflection (the reflection coefficient was 25.7-30.4% vs 23.4-28.6%). This paper presents a methodology for the setup and operation of a physical model to assess the hydrodynamic and morphodynamic processes at a beach seawall during storm events. It discusses the pros and cons of this methodology versus others, notably regarding structural peculiarities and model effects.
Keywords: beach, impacts, scour, seawall, waves
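Wave-structure model tests like this one are usually scaled by Froude similarity. Assuming that basis (the light-weight sediment adds separate similitude requirements this sketch does not cover), the 1:18 conversion factors and an illustrative back-calculation of a prototype force are:

```python
# Froude-similarity scale factors for a 1:18 model (same fluid density
# assumed). These are standard relations, not values from the paper.
LENGTH_SCALE = 18.0

time_scale = LENGTH_SCALE ** 0.5      # prototype time / model time
velocity_scale = LENGTH_SCALE ** 0.5  # prototype velocity / model velocity
pressure_scale = LENGTH_SCALE         # prototype pressure / model pressure
force_scale = LENGTH_SCALE ** 3       # prototype force / model force

# Illustrative example (the 45.3 N model reading is hypothetical):
# a model force of ~45.3 N scales to ~264 kN at prototype, matching
# the order of the forces reported in the abstract.
model_force_newtons = 45.3
prototype_force_kn = model_force_newtons * force_scale / 1000.0
print(round(prototype_force_kn, 1))
```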
Procedia PDF Downloads 157
26375 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In its simplest form, data quality can be defined as 'fitness for use'; it is a concept with multiple dimensions. Emergency Departments (EDs) require information to treat patients, and at the same time the ED is the primary source of information regarding accidents, injuries, emergencies, etc. It is also the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility, and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions.
Keywords: electronic health records, electronic emergency department information system, emergency department, data quality
Procedia PDF Downloads 278
26374 Comparative Comparison (Cost-Benefit Analysis) of the Costs Caused by the Earthquake and Costs of Retrofitting Buildings in Iran
Authors: Iman Shabanzadeh
Abstract:
Earthquakes are among the most frequent natural hazards in Iran. Policy making to improve the strengthening of structures is therefore a requirement of any approach to preventing and reducing the risk of the destructive effects of earthquakes. In order to choose the optimal policy in the face of earthquakes, this article examines the cost of the financial damage caused by earthquakes in the building sector and compares it with the costs of retrofitting. The study collects, calculates, and analyzes the results of two scenarios: 'action after the earthquake' and 'strengthening structures before the earthquake'. Methodologically, data received from governorates and building retrofitting engineering companies have been used. The scope of the study is earthquakes that occurred in the geographical area of Iran; among them, eight earthquakes have been studied specifically: Miane, Ahar and Haris, Qator, Momor, Khorasan, Damghan and Shahroud, Gohran, Hormozgan and Ezgole. The calculations rest mainly on data obtained from retrofitting companies on the cost per square meter of building retrofitting, and on governorate data on the destructive power of each earthquake and the realized costs of reconstructing residential units. The estimated costs have been converted to 2021 values using the time-value-of-money method to enable comparison and aggregation. The cost-benefit comparison of the two policies across the eight earthquakes investigated shows that the country suffered five thousand billion Tomans in losses due to the lack of retrofitting of buildings against earthquakes. Based on the Budget Law of Iran, this figure was approximately twice the 2021 budget of the Ministry of Roads and Urban Development and five times that of the Islamic Revolution Housing Foundation. The results show that the policy of retrofitting structures before an earthquake is significantly more optimal than the competing scenario: it prevents huge losses and, by increasing the number of earthquake-resistant houses, reduces the amount of destruction an earthquake causes, in addition to other positive effects of retrofitting such as reduced mortality and reduced economic and social impacts. Together, these results support the cost-effectiveness of the 'strengthening structures before earthquakes' policy scenario in Iran.
Keywords: disaster economy, earthquake economy, cost-benefit analysis, resilience
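The time-value-of-money conversion used to bring historical losses to 2021 values can be sketched with compound interest. The discount rate and loss figure below are illustrative assumptions, not the study's actual inputs:

```python
# Convert a cost incurred in a past year to its 2021 equivalent by
# compounding at an annual rate; the study's actual rate is not stated,
# so the 20% used here is purely illustrative.
def value_in_2021(cost, year, annual_rate):
    """Future-value a cost incurred in `year` forward to 2021."""
    return cost * (1.0 + annual_rate) ** (2021 - year)

# e.g. a hypothetical loss of 100 (billion Tomans) in 2012 at 20%/year:
print(round(value_in_2021(100.0, 2012, 0.20), 1))
```

Applying the same conversion to every earthquake's realized costs is what makes the eight events, spread across different years, comparable and summable.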
Procedia PDF Downloads 67
26373 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset
Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba
Abstract:
We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R to process these data. We analyze the involvement and relationship of the different variables for the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the manoeuvre undertaken (overtaking or falling back).
Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process
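Two of the analysed quantities, the time gap and the manoeuvre type, can be sketched from HighD-style trajectory fields. The field names and values below are illustrative, and the speed-difference rule for labelling a manoeuvre is an assumption, not necessarily the paper's criterion (the study itself uses R; Python is used here only for consistency with the other sketches):

```python
# Time gap to the preceding vehicle: bumper-to-bumper distance divided
# by the ego vehicle's speed.
def time_gap(distance_m, ego_speed_ms):
    """Time gap in seconds."""
    return distance_m / ego_speed_ms

# Illustrative labelling rule: call it overtaking when the ego vehicle
# is faster than its leader, falling back otherwise.
def classify_manoeuvre(ego_speed_ms, lead_speed_ms):
    return "overtaking" if ego_speed_ms > lead_speed_ms else "falling back"

record = {"distance": 25.0, "ego_speed": 31.5, "lead_speed": 24.0}
print(round(time_gap(record["distance"], record["ego_speed"]), 2))
print(classify_manoeuvre(record["ego_speed"], record["lead_speed"]))
```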
Procedia PDF Downloads 265
26372 Solving a Micromouse Maze Using an Ant-Inspired Algorithm
Authors: Rolando Barradas, Salviano Soares, António Valente, José Alberto Lencastre, Paulo Oliveira
Abstract:
This article reviews Ant Colony Optimization, a nature-inspired algorithm, and its implementation in the Scratch/mBlock programming environment. Ant Colony Optimization belongs to the family of Swarm Intelligence algorithms, a subset of biologically inspired algorithms. The starting problem is a maze in which one must find the path to the center and return to the starting position, much like an ant looking for a path to a food source and returning to its nest. Starting from the implementation of a simple wall-follower simulator, the proposed solution uses a dynamic graphical interface that allows young students to observe the ants' movement while the algorithm optimizes the routes to the maze's center. Details such as interface usability, data structures, and the conversion of algorithmic language to Scratch syntax were addressed during this implementation. This gives young students an easier way to understand the computational concepts of sequences, loops, parallelism, data, events, and conditionals, as these are used throughout the implemented algorithms. Future work includes simulations with real contest mazes and two different pheromone update methods, comparison with the optimized results of the winners of each edition of the contest, and the creation of a Digital Twin relating the virtual simulator to a real micromouse in a full-size maze. The first test results show that the algorithm found the same optimized solutions as the winners of each edition of the Micromouse contest, making this a good solution for maze pathfinding.
Keywords: nature inspired algorithms, scratch, micromouse, problem-solving, computational thinking
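The core of an Ant Colony Optimization maze solver is the pheromone update: all edges evaporate, then the edges of the best tour found so far receive a deposit inversely proportional to the tour length. This deterministic sketch shows one such update; the parameter values are illustrative, and the article's own Scratch implementation may use a different update variant:

```python
# Classic ACO pheromone update: tau <- (1 - rho) * tau on every edge,
# then tau <- tau + Q / L on the edges of the best tour of length L.
# rho and Q are illustrative parameter choices.
EVAPORATION = 0.5   # rho
DEPOSIT = 100.0     # Q

def update_pheromones(pheromone, best_tour_edges, best_tour_length):
    """Evaporate all edges, then reinforce the best tour's edges."""
    for edge in pheromone:
        pheromone[edge] *= (1.0 - EVAPORATION)
    for edge in best_tour_edges:
        pheromone[edge] += DEPOSIT / best_tour_length
    return pheromone

# Three edges, all starting at tau = 1.0; the best tour (length 20)
# uses edges a and b, so they are reinforced while c decays.
tau = {"a": 1.0, "b": 1.0, "c": 1.0}
tau = update_pheromones(tau, ["a", "b"], 20.0)
print(tau)  # a, b -> 5.5 ; c -> 0.5
```

Iterating this update while ants choose edges with probability proportional to pheromone is what concentrates the colony on the shortest route to the maze's center.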
Procedia PDF Downloads 128
26371 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator
Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain
Abstract:
Objective: The main purpose of this study is to compare the percent depth dose (PDD) curves and the in-plane and cross-plane profiles of the Varian golden beam data with measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the monitor units (MUs) and dose distribution. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ionization chamber and MEPHYSTO software. The freely available Varian golden beam data were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was within 2%. For the PDDs, the deviation increases at deeper depths compared with shallower depths. Similarly, the profiles show the same trend of increasing deviation at larger field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use. Keywords: percent depth dose, flatness, symmetry, golden beam data
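The core acceptance check, comparing measured curves against the golden beam data point by point against a 2% tolerance, can be sketched as follows. The PDD numbers are hypothetical illustrations, not the study's measurements:

```python
# Hypothetical PDD values (% of maximum dose) at selected depths for a
# 6 MV, 10x10 cm field: vendor "golden" data vs. locally measured data.
depths_cm = [1.5, 5.0, 10.0, 20.0]
golden_pdd = [100.0, 86.1, 66.7, 38.5]
measured_pdd = [100.0, 85.6, 66.0, 37.9]

def percent_deviation(measured, reference):
    """Point-by-point deviation of measured from reference, in percent."""
    return [round(100.0 * (m - r) / r, 2) for m, r in zip(measured, reference)]

dev = percent_deviation(measured_pdd, golden_pdd)
max_abs_dev = max(abs(d) for d in dev)
within_tolerance = max_abs_dev <= 2.0   # the study's 2% acceptance criterion
```

Note how the largest deviation occurs at the deepest point, matching the trend the abstract reports.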
Procedia PDF Downloads 493
26370 Improving Power Quality in Wind Power Generation System
Authors: A. Omeiri, A. Djellad, P. O. Logerais, O. Riou, J. F. Durastanti
Abstract:
With growing demand for electrical energy, wind power capacity has experienced tremendous growth in the past decade, thanks to wind power's environmental benefits. A direct-driven permanent magnet synchronous generator (PMSG) with a full-size back-to-back converter set is one of the promising technologies employed in wind power generation. Grid integration of wind power brings problems of voltage fluctuation and harmonic pollution. In the present study, a filter is placed between the wind system and the network to reduce the total harmonic distortion (THD) and enhance power quality during disturbances. The models of the wind turbine, PMSG, power electronic converters, and the filter are implemented in the MATLAB/Simulink environment. Keywords: wind energy conversion system, PMSG, PWM, THD, power quality, passive filter
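THD, the quantity the filter is meant to reduce, is the RMS of the harmonic components relative to the fundamental. A minimal sketch, with hypothetical harmonic amplitudes rather than simulation output:

```python
import math

def thd(harmonic_amplitudes):
    """Total harmonic distortion: RMS of harmonics 2..n over the fundamental."""
    fundamental, *harmonics = harmonic_amplitudes
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Hypothetical current-harmonic amplitudes (fundamental first) before and
# after a passive filter is inserted between the converter and the grid.
before = [100.0, 8.0, 6.0, 3.0]
after = [100.0, 2.0, 1.5, 0.5]

thd_before = thd(before)   # about 10.4 %
thd_after = thd(after)     # about 2.5 %
```

The same formula is what a MATLAB/Simulink THD block evaluates from the FFT of the line current.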
Procedia PDF Downloads 651
26369 Variable-Fidelity Surrogate Modelling with Kriging
Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans
Abstract:
Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile incorporating gradient data of varying degrees of accuracy. Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients
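The scaling-plus-discrepancy idea behind recursive CoKriging can be illustrated in a heavily simplified form, with the Kriging discrepancy model replaced by a constant offset and hypothetical sample values (this is an illustration of the autoregressive coupling, not the paper's gradient-enhanced method):

```python
def fit_linear_correction(y_lo, y_hi):
    """Least-squares fit of y_hi ~ rho * y_lo + delta: the scaling-plus-
    discrepancy coupling used in recursive CoKriging, with the discrepancy
    reduced to a constant for illustration."""
    n = len(y_lo)
    mx = sum(y_lo) / n
    my = sum(y_hi) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(y_lo, y_hi))
    sxx = sum((x - mx) ** 2 for x in y_lo)
    rho = sxy / sxx
    delta = my - rho * mx
    return rho, delta

# Hypothetical data: the cheap model is off by a scale and an offset.
y_lo = [1.0, 2.0, 3.0, 4.0]    # low-fidelity values at sample points
y_hi = [2.1, 4.1, 6.1, 8.1]    # high-fidelity values at the same points
rho, delta = fit_linear_correction(y_lo, y_hi)

def predict(y_lo_value):
    """Corrected surrogate prediction at a new point."""
    return rho * y_lo_value + delta
```

In full CoKriging, `delta` would itself be a Kriging model of the residuals rather than a constant.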
Procedia PDF Downloads 560
26368 Robust Barcode Detection with Synthetic-to-Real Data Augmentation
Authors: Xiaoyan Dai, Hsieh Yisan
Abstract:
Barcode processing of captured images is a considerable challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep-learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment images containing the barcodes to generate a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations on our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode “scan” and is applicable to real-time applications. Keywords: barcode detection, data augmentation, deep learning, image-based processing
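Photometric augmentation of the kind described, sketched here as brightness shift, additive noise, and blur on a toy stripe pattern, might look like the following. It is an illustrative stand-in, not the paper's pipeline, and the parameter values are assumptions:

```python
import random

random.seed(0)

def augment(image, brightness=0.0, noise_sigma=0.0, blur=False):
    """Augmentation sketch: brightness shift, additive Gaussian noise,
    and a 1x3 horizontal box blur, with pixel values clamped to [0, 255]."""
    w = len(image[0])
    out = [row[:] for row in image]
    if blur:
        out = [[sum(row[max(0, x - 1):x + 2]) / len(row[max(0, x - 1):x + 2])
                for x in range(w)] for row in out]
    out = [[min(255.0, max(0.0, p + brightness + random.gauss(0, noise_sigma)))
            for p in row] for row in out]
    return out

# A tiny synthetic "barcode" stripe pattern (0 = black bar, 255 = background).
bars = [[0, 255, 0, 255, 0, 255]] * 4
darker = augment(bars, brightness=-40)   # simulate under-exposure
noisy = augment(bars, noise_sigma=10)    # simulate sensor noise
```

A real pipeline would add geometric warps (rotation, perspective) and paste the augmented barcodes into background scenes before training the detector.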
Procedia PDF Downloads 176
26367 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms
Authors: Hai L. Tran
Abstract:
When covering events and issues, the news media often employ both personal accounts and facts and figures. However, the process of using numbers and narratives in the newsroom mostly operates through trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner. Existing research tends to treat pertinent effects as though the use of one form precludes the other and as if a tradeoff is required. Meanwhile, narratives and statistical facts are often combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and the collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system processes logical evidence through effortful, analytical cognitions, which are affect-free. Meanwhile, the experiential system is intuitive, rapid, automatic, and holistic, thereby demanding minimal cognitive resources and relating to the experience of affect. In certain situations, one system might dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts impact audience response differently, and a combination of data and narratives is more effective than either form of evidence alone.
In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) affect selective exposure, which in turn activates pertinent modes of processing and thereby induces corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand these specific effects and the conditions under which the use of anecdotal and/or statistical evidence enhances or undermines information processing. In addition to its theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers. Keywords: data, narrative, number, anecdote, storytelling, news
Procedia PDF Downloads 83
26366 Remaining Useful Life Estimation of Bearings Based on Nonlinear Dimensional Reduction Combined with Timing Signals
Authors: Zhongmin Wang, Wudong Fan, Hengshan Zhang, Yimin Zhou
Abstract:
In data-driven prognostic methods, the accuracy of remaining-useful-life estimation for bearings mainly depends on the performance of the health indicators, which are usually fused from statistical features extracted from vibration signals. However, existing health indicators have two drawbacks: (1) the different ranges of the statistical features contribute differently to the health indicator, and expert knowledge is required to extract the features; (2) when convolutional neural networks are used to handle the time-frequency features of signals, the time-series nature of the signals is not considered. To overcome these drawbacks, this study proposes a method combining a convolutional neural network with a gated recurrent unit to extract time-frequency image features. The extracted features are used to construct a health indicator and predict the remaining useful life of bearings. First, the original signals are converted into time-frequency images using the continuous wavelet transform, forming the original feature sets. Second, with the convolutional and pooling layers of a convolutional neural network, the most sensitive features of the time-frequency images are selected from the original feature sets. Finally, these selected features are fed into the gated recurrent unit to construct the health indicator. The results show that the proposed method outperforms related studies that used the same bearing dataset provided by PRONOSTIA. Keywords: continuous wavelet transform, convolutional neural network, gated recurrent unit, health indicators, remaining useful life
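For context on what a health indicator and an RUL estimate are, the following baseline sketch builds an RMS-style indicator and extrapolates it linearly to a failure threshold. This is the kind of hand-crafted statistical baseline the paper improves on, not the proposed CNN-GRU model, and all numbers are hypothetical:

```python
import math

def rms(window):
    """Root-mean-square of one window of vibration samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def estimate_rul(indicator, threshold):
    """Fit a line t -> indicator by least squares and extrapolate to the
    failure threshold; RUL is the remaining time until the crossing."""
    n = len(indicator)
    t = list(range(n))
    mt, mi = sum(t) / n, sum(indicator) / n
    slope = (sum((a - mt) * (b - mi) for a, b in zip(t, indicator))
             / sum((a - mt) ** 2 for a in t))
    intercept = mi - slope * mt
    t_fail = (threshold - intercept) / slope
    return t_fail - (n - 1)

# Hypothetical degradation: the RMS health indicator grows with time,
# one value per monitoring window.
indicator = [1.0, 1.2, 1.4, 1.6, 1.8]
rul = estimate_rul(indicator, threshold=3.0)   # windows remaining
```

Real bearing indicators are noisier and nonlinear, which is precisely why learned indicators such as the CNN-GRU construction are attractive.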
Procedia PDF Downloads 139
26365 Characteristics of Sorghum (Sorghum bicolor L. Moench) Flour on the Soaking Time of Peeled Grains and Particle Size Treatment
Authors: Sri Satya Antarlina, Elok Zubaidah, Teti Istiana, Harijono
Abstract:
Sorghum (Sorghum bicolor L. Moench) has potential as a flour for gluten-free food products. Sorghum flour production requires a grain-soaking treatment. Soaking can reduce the tannin content, an anti-nutrient, and can thereby increase protein digestibility. A fine particle size decreases the yield of flour, so various particle sizes need to be studied to increase the yield. This study aims to determine the characteristics of sorghum flour under different soaking times of peeled grains and different particle-size treatments. The material was the white sorghum variety KD-4 from farmers in East Java, Indonesia. A randomized factorial design (two factors) was used, repeated three times: factor I was the grain soaking time (five levels: 0, 12, 24, 36, and 48 hours); factor II was the particle size of the flour, sifted to fineness levels of 40, 60, 80, and 100 mesh. The method of making sorghum flour comprised grain peeling, soaking of the peeled grain, drying in an oven at 60ᵒC, milling, and sieving, followed by physicochemical analysis of the flour. The results show an interaction between the grain soaking time and the particle size of the sorghum flour, affecting the yield of flour, color L* (brightness level), whiteness index, paste properties, amylose content, protein content, bulk density, and protein digestibility. Soaking of the peeled grain and the choice of particle size play an important role in producing the specific physicochemical properties of the flour. Based on the characteristics of the flour produced, the recommended method is soaking the sorghum grain for 24 hours with a flour particle size of 80 mesh.
The resulting sorghum flour had the following characteristics: 24.88% yield of flour, 88.60 color L* (brightness level), 69.95 whiteness index, 3615 cP viscosity, 584.10 g/l bulk density, 24.27% db protein digestibility, 90.02% db starch content, 23.4% db amylose content, 67.45% db amylopectin content, 0.22% db crude fiber content, 0.037% db tannin content, 5.30% db protein content, 0.18% db ash content, 92.88% db carbohydrate content, and 1.94% db fat content. This sorghum flour is recommended for cookie products. Keywords: characteristics, sorghum (Sorghum bicolor L. Moench) flour, grain soaking, particle size, physicochemical properties
Procedia PDF Downloads 165
26364 Exploitation behind the Development of Home Batik Industry in Lawean, Solo, Central Java
Authors: Mukhammad Fatkhullah, Ayla Karina Budita, Cut Rizka Al Usrah, Kanita Khoirun Nisa, Muhammad Alhada Fuadilah Habib, Siti Muslihatul Mukaromah
Abstract:
The batik industry has become one of the leading industries in the economy of Indonesia. Since the recognition of batik by UNESCO as part of the cultural wealth and national identity of Indonesia, batik production has kept increasing as a result of increasing demand for batik, both domestically and abroad. One of the most rapidly developing batik industries in Indonesia is in Lawean Village, Solo, Central Java, Indonesia. This batik industry generally uses a putting-out system, in which batik workers work in their own houses. With this system, employers do not have to prepare an Environmental Impact Analysis (EIA), social security for workers, overtime payment, working space, or working equipment. The implementation of the putting-out system causes many problems, from environmental pollution to the loss of workers' social rights and even the exploitation of workers by batik entrepreneurs. The data used to describe this reality are primary data from qualitative research using in-depth interviews. Informants were determined purposively. The theory used for data interpretation is the phenomenology of Alfred Schutz. Both qualitative methods and phenomenology are used in this study to describe the exploitation of batik workers under the putting-out system in the home batik industry in Lawean. The research results showed that workers in the batik industry sector in Lawean were exploited through the putting-out system. The workers were strictly bound to the entrepreneurs, so their jobs can no longer be called 'part-time' jobs. In terms of labor and time, the workers often work more than 12 hours per day, frequently working overtime without receiving any overtime payment.
In terms of work safety, the workers often come into contact with the chemical substances contained in batik-making materials without any protection, such as work clothes; this is worsened by the lack of work standards or procedures, which can cause physical damage such as burnt and peeling skin. Moreover, exposure to and contamination by chemical materials make the workers and their families vulnerable to various diseases. Meanwhile, the batik entrepreneurs did not provide any social security (including health cost aid). Besides this, the researchers found that the batik home industry is not environmentally friendly and even damages the ecosystem, because industrial waste is disposed of without an EIA. Keywords: exploitation, home batik industry, occupational health and safety, putting-out system
Procedia PDF Downloads 324
26363 Applying Integrated QFD-MCDM Approach to Strengthen Supply Chain Agility for Mitigating Sustainable Risks
Authors: Enes Caliskan, Hatice Camgoz Akdag
Abstract:
There is no doubt that humanity needs to recognize the sustainability problems in the world and take serious action regarding them. All members of the United Nations adopted the 2030 Agenda for Sustainable Development, the most comprehensive international study on sustainability, in 2015. The study is summarized in 17 Sustainable Development Goals, covering every aspect of sustainability, such as environment, society, and governance. The use of Information and Communication Technology (ICT), such as the Internet, mobile phones, and satellites, is essential for tackling the main issues facing sustainable development. Hence, the contributions of three major ICT companies to the Sustainable Development Goals are assessed in this study. Quality Function Deployment (QFD) is utilized as the methodology, since QFD is an excellent instrument for comparing businesses on relevant subjects. To complete the QFD application, a House of Quality must be established, which first requires determining the demanded qualities (voice of the customer) and the quality characteristics (technical requirements). The UN SDGs are used as the demanded qualities; the quality characteristics are derived from the annual sustainability and corporate social responsibility reports of the ICT companies. The companies' efforts, as indicated by the QFD results, are concentrated on the use of recycled raw materials and recycling, reducing GHG emissions through energy saving and improved connectivity, decarbonizing the value chain, protecting the environment and water resources by collaborating with businesses that have completed CDP water assessments and paying attention to reducing water consumption, ethical business practices, and reducing inequality. When the three businesses are compared, their evaluations are found to be very similar; the small differences are usually related to the region they serve.
The companies' efforts mostly concentrate on the responsible consumption and production, life below water, climate action, and sustainable cities and communities goals. These efforts include improving connectivity in underserved areas to provide access to information, education, and healthcare. Keywords: multi-criteria decision-making, sustainable supply chain risk, supply chain agility, quality function deployment, sustainable development goals
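The House of Quality computation at the heart of QFD, weighting each relationship cell by the importance of the demanded quality and summing per technical characteristic, can be sketched as follows. The weights, characteristics, and relationship strengths are illustrative assumptions, not the study's data:

```python
# Hypothetical House of Quality fragment: rows are demanded qualities
# (selected SDGs) with importance weights; columns are technical
# characteristics drawn from sustainability reports. Cells use the
# conventional QFD relationship scale (0, 1, 3, 9).
weights = {"climate_action": 5, "responsible_consumption": 4, "life_below_water": 3}
relationships = {
    # (demanded quality, technical characteristic): relationship strength
    ("climate_action", "renewable_energy_share"): 9,
    ("climate_action", "recycled_materials"): 3,
    ("responsible_consumption", "recycled_materials"): 9,
    ("life_below_water", "water_stewardship"): 9,
}

def technical_importance(weights, relationships):
    """Column scores: sum of (row weight x relationship strength)."""
    scores = {}
    for (quality, characteristic), strength in relationships.items():
        scores[characteristic] = (scores.get(characteristic, 0)
                                  + weights[quality] * strength)
    return scores

scores = technical_importance(weights, relationships)
ranked = sorted(scores, key=scores.get, reverse=True)
```

Ranking the column scores is what lets the QFD results point to where the companies' efforts are concentrated.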
Procedia PDF Downloads 53
26362 Design Thinking Activities: A Tool in Overcoming Student Reticence
Authors: Marinel Dayawon
Abstract:
Student participation in classroom activities is vital in the teaching-learning process, as it develops students' self-confidence, social relationships, and academic performance. It takes a teacher's empathy and creativity to create solutions that encourage teamwork and mutual support while reducing the academic competition within the class that hinders every shy student from walking with courage and talking with conviction, because they consider their ideas weak compared with those of the bright students. This study aimed to explore the different design thinking strategies that can change the mindset of shy students in classroom activities, maximizing their participation in all given tasks while sharing their views through ideation, and providing them a wider world through compromise agreements within the group and sensitivity to one another's ideas, thus arriving at a collective decision in the development of a prototype that indicates improvement in their classroom involvement. The study used a qualitative research design. Triangulation was done through participant observation, focus group discussion, and interviews, documented through photos and videos. The respondents were the second-year Bachelor of Secondary Education students of the Institute of Teacher Education at Isabela State University, Cauayan City Campus. The results revealed that reticent students, when involved in game activities through a slap-and-tap method, writing their clustered ideas on sticky notes, are excited to share ideas, as this does not require oral communication. It was also observed, after three weeks of using the design thinking strategies, that shy students volunteer as secretary, rapporteur, or group leader in the team-building activities, as these roles represent the ideas of the heterogeneous group, removing the individual identity of the ideas.
Superior students learned to listen to the ideas of the reticent students and involved them in prototyping a remediation program for high school students showing reticence in the classroom, using their own experience as a benchmark. The strategies brought about a complete turnaround in the shy students, who produced journal logs of their journey to openness. Thus, faculty members are now adopting the design thinking approach. Keywords: design thinking activities, qualitative, reticent students, Isabela, Philippines
Procedia PDF Downloads 228
26361 Analysis of Delivery of Quad Play Services
Authors: Rahul Malhotra, Anurag Sharma
Abstract:
Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the technologies that has emerged in recent years is the Passive Optical Network (PON). This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate. Keywords: FTTH, quad play, play service, access networks, data rate
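The link between data rate and the number of supportable users runs through the bit error rate; a common Gaussian-noise approximation derives the BER from the receiver's Q factor, which drops as higher rates leave less energy per bit. The Q values per line rate below are hypothetical illustrations, not measurements from this work:

```python
import math

def ber_from_q(q):
    """Gaussian-noise bit-error-rate approximation for a given Q factor."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Illustrative only: hypothetical received Q factors per PON line rate.
# Higher rates (or more splitter ports) lower Q and hence raise BER,
# capping the number of users that can share the feeder at acceptable BER.
q_by_rate_gbps = {1.25: 7.0, 2.5: 6.0, 10.0: 4.5}
ber_by_rate = {rate: ber_from_q(q) for rate, q in q_by_rate_gbps.items()}
```

For reference, Q = 6 corresponds to a BER near 1e-9, a classic acceptance threshold for optical links.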
Procedia PDF Downloads 420
26360 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by using a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we use the wavelet transform to further suppress noise while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data. Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
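The signal-averaging step can be illustrated with synthetic transients: stacking N repeated decay curves reduces uncorrelated noise by roughly a factor of sqrt(N). The decay model and noise level below are illustrative assumptions, not FASTSNAP data:

```python
import math
import random

random.seed(1)

def clean_transient(n=50, tau=10.0):
    """Idealised TEM response: an exponentially decaying voltage."""
    return [math.exp(-t / tau) for t in range(n)]

def noisy_trace(clean, sigma=0.05):
    """One recorded transient: the clean decay plus Gaussian noise."""
    return [v + random.gauss(0.0, sigma) for v in clean]

def stack(traces):
    """Signal averaging: sample-wise mean of repeated transients."""
    return [sum(tr[i] for tr in traces) / len(traces)
            for i in range(len(traces[0]))]

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

clean = clean_transient()
traces = [noisy_trace(clean) for _ in range(64)]
stacked = stack(traces)
single_error = rmse(traces[0], clean)   # noise level of one recording
stacked_error = rmse(stacked, clean)    # roughly sigma / sqrt(64)
```

In the full workflow, a wavelet-threshold pass would follow the stacking to suppress the residual late-time noise that averaging alone leaves behind.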
Procedia PDF Downloads 93
26359 Optimum Design of Photovoltaic Water Pumping System Application
Authors: Sarah Abdourraziq, Rachid El Bachtiri
Abstract:
The use of solar power for pumping water is one of the most promising areas of photovoltaic applications. The implementation of these systems helps protect the environment and reduces CO2 emissions compared with systems driven by diesel generators. This paper presents a comparative study between photovoltaic pumping systems driven by a DC motor and by an AC motor, to define the optimum design for this application. The studied system consists of a PV array, a DC-DC boost converter, an inverter, a motor-pump set, and a storage tank. The comparison was carried out to define the characteristics and performance of each system. Each subsystem is modeled in order to simulate the whole system in MATLAB/Simulink. The results show the efficiency of the proposed technique. Keywords: photovoltaic water pumping system, DC motor-pump, AC motor-pump, DC-DC boost converter
Procedia PDF Downloads 332
26358 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization
Authors: Hironori Karachi, Haruka Yamashita
Abstract:
Recently, quick response (QR) code payment systems have become popular. Many companies have introduced new QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in the demographic information, usage information, and value of users between services. In this study, we analyze real-world data provided by Nomura Research Institute, including the demographic data of users and information on users' usage of two services, LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used for analyzing such data and interpreting its features; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and thereby understand the features of the data presented in matrix form. Moreover, for comparing the results of the NMF analysis of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF to analyze the target data. As an interpretation, we show the differences in the features of users between LINE Pay and PayPay. Keywords: data science, non-negative matrix factorization, missing data, quality of services
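An EM-style NMF that tolerates missing entries, in the spirit of the EMNMF described above, can be sketched as follows: missing cells are filled with the current reconstruction, then standard multiplicative updates are applied. The matrix, mask, and update scheme are a minimal illustration, not the study's implementation or data:

```python
import random

random.seed(3)

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def emnmf(X, mask, rank=2, n_iter=300, eps=1e-9):
    """EM-style NMF: cells with mask 0 are imputed from the current
    reconstruction W @ H before each multiplicative update."""
    n, m = len(X), len(X[0])
    W = [[random.random() for _ in range(rank)] for _ in range(n)]
    H = [[random.random() for _ in range(m)] for _ in range(rank)]
    for _ in range(n_iter):
        R = matmul(W, H)
        Xc = [[X[i][j] if mask[i][j] else R[i][j] for j in range(m)]
              for i in range(n)]
        # multiplicative update for H: H *= (W^T Xc) / (W^T W H)
        num, den = matmul(transpose(W), Xc), matmul(matmul(transpose(W), W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(rank)]
        # multiplicative update for W: W *= (Xc H^T) / (W H H^T)
        num, den = matmul(Xc, transpose(H)), matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(rank)]
             for i in range(n)]
    return W, H

# Hypothetical user-attribute matrix (rows: users, columns: usage features),
# with one unobserved cell marked by mask = 0.
X = [[5.0, 3.0, 1.0], [4.0, 2.0, 1.0], [1.0, 1.0, 5.0], [1.0, 0.5, 4.0]]
mask = [[1, 1, 1], [1, 0, 1], [1, 1, 1], [1, 1, 1]]
W, H = emnmf(X, mask)
R = matmul(W, H)
obs_err = sum((X[i][j] - R[i][j]) ** 2
              for i in range(4) for j in range(3) if mask[i][j])
```

The factor rows of `H` then play the role of interpretable user-feature profiles; the discriminant (DNMF) step for contrasting two services is a separate extension not shown here.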
Procedia PDF Downloads 134