Search results for: secure data aggregation
23520 Measuring Flood Risk concerning with the Flood Protection Embankment in Big Flooding Events of Dhaka Metropolitan Zone
Authors: Marju Ben Sayed, Shigeko Haruyama
Abstract:
Among all kinds of natural disaster, flooding is a common feature of the rapidly urbanizing city of Dhaka. In this research, the flood risk of the Dhaka metropolitan area was assessed using an integrated approach combining GIS, remote sensing and socio-economic data. The purpose of the study is to measure the flood risk associated with the flood protection embankment during the major flood events (1988, 1998 and 2004) and the urbanization of the Dhaka metropolitan zone. We divided Dhaka city into two parts: East Dhaka (outside the flood protection embankment) and West Dhaka (inside the flood protection embankment). Using statistical data, we explored the socio-economic status of the study area population by comparing population density, land price and income level. We drew cross-section profiles of the flood protection embankment at three different points to assess the flood risk in the study area, especially in the major flood years (1988, 1998 and 2004). According to the physical conditions of the study area, the land use/land cover map was classified into five classes. The flood risk was evaluated by comparing each land cover unit with historical weather station data and the socio-economic data. Moreover, we compared DEM data with each land cover unit to find out its relationship with flooding. It is expected that this study could contribute to effective flood forecasting, relief and emergency management for future flood events in Dhaka city.
Keywords: land use, land cover change, socio-economic, Dhaka city, GIS, flood
Procedia PDF Downloads 300
23519 Iterative Method for Lung Tumor Localization in 4D CT
Authors: Sarah K. Hagi, Majdi Alnowaimi
Abstract:
In the last decade, there have been immense advancements in medical imaging modalities. These modalities can scan the whole volume of the lung in high-resolution images within a short time, allowing physicians to clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore open large opportunities for all types of lung cancer treatment and should increase the survival rate. However, lung cancer is still one of the major causes of death, accounting for around 19% of all cancer patients. Several factors may affect the survival rate. One serious factor is the breathing process, which can reduce the accuracy of diagnosis and of the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize the 3D lung tumor position across all respiratory phases during respiratory motion. The algorithm can be divided into two stages. First, the lung tumor is segmented in the first phase of the 4D computed tomography (CT) using an active contours method. Then, the tumor's 3D position is localized across all subsequent phases using an affine transformation with 12 degrees of freedom. Two data sets were used in this study: a simulated 4D CT data set generated with the extended cardiac-torso (XCAT) phantom, and clinical 4D CT data sets. Errors are reported as root mean square error (RMSE); the average error across the data sets is 0.94 mm ± 0.36. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.
Keywords: automated algorithm, computed tomography, lung tumor, tumor localization
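As an illustration of the error measure reported above, the per-phase localization error can be summarised as an RMSE over the Euclidean distances between predicted and reference tumor centroids across respiratory phases. This is a generic sketch of the metric, not the authors' code; the point format (x, y, z coordinates in mm) is an assumption.

```python
import math

def localization_rmse(pred, ref):
    # Euclidean localization error for each respiratory phase (mm),
    # then the root mean square across all phases.
    dists = [math.dist(p, r) for p, r in zip(pred, ref)]
    return math.sqrt(sum(d * d for d in dists) / len(dists))
```

With a perfect prediction the RMSE is zero; a single 5 mm miss over two phases gives sqrt((25 + 0) / 2) ≈ 3.54 mm.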
Procedia PDF Downloads 607
23518 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression
Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras
Abstract:
In Industry 4.0 it is common to have a great deal of sensor data, and in this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources, so any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach: Isolation Forest, One-Class SVM and an auto-encoder were explored, and the PySR library was used for symbolic regression. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method, since after training the only requirement is the evaluation of a polynomial, a useful feature in the digital twin context.
Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression
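The "low resource" property mentioned above can be sketched as follows: once symbolic regression has produced a polynomial surrogate for the healthy signal, anomaly detection reduces to evaluating that polynomial and thresholding the residual. The function and the coefficient format below are illustrative assumptions, not the paper's PySR output.

```python
def flag_anomalies(xs, ys, coeffs, tol):
    # Evaluate the learned polynomial surrogate (coefficients ordered
    # from low to high degree) and flag any sensor reading whose
    # residual against the surrogate exceeds the tolerance.
    def poly(x):
        return sum(c * x ** i for i, c in enumerate(coeffs))
    return [abs(y - poly(x)) > tol for x, y in zip(xs, ys)]
```

After training, this is all that has to run on the twin side, which is why the method is cheap compared with keeping a full anomaly-detection model in the loop.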
Procedia PDF Downloads 125
23517 Re-Constructing the Research Design: Dealing with Problems and Re-Establishing the Method in User-Centered Research
Authors: Kerem Rızvanoğlu, Serhat Güney, Emre Kızılkaya, Betül Aydoğan, Ayşegül Boyalı, Onurcan Güden
Abstract:
This study addresses the re-construction and implementation of the methodological framework developed to evaluate how locative media applications accompany the urban experiences of international students coming to Istanbul on exchange programs in 2022. The research design was built on a three-stage model. In the first stage, the research team conducted a qualitative questionnaire to gain exploratory data; these data were then used to form three persona groups representing the sample by applying cluster analysis. In the second phase, a semi-structured digital diary study was carried out on a gamified task list with a sample selected from the persona groups. This stage proved to be the most difficult for obtaining valid data from the participant group. The research team re-evaluated the design of this second phase in order to reach the participants who would perform the given tasks while sharing their momentary city experiences, to ensure a daily data flow for two weeks, and to increase the quality of the obtained data. The final stage, which elaborates on the findings, is the “Walk & Talk,” completed with face-to-face, in-depth interviews. The multiple methods used in the research process contribute to the depth and data diversity of research conducted in the context of urban experience and locative technologies. In addition, by adapting the research design to the experiences of the users included in the sample, the differences and similarities between the initial research design and the design as applied are shown.
Keywords: digital diary study, gamification, multi-model research, persona analysis, research design for urban experience, user-centered research, “Walk & Talk”
Procedia PDF Downloads 175
23516 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe
Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis
Abstract:
The increasing availability of information about earth surface elevation (Digital Elevation Models, DEM) generated from different sources (remote sensing, aerial images, Lidar) raises the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Hence the need to merge the large-scale height maps that are typically available for free at the worldwide level with very specific, highly resolved datasets. On the other hand, the third dimension improves the user experience and the quality of data representation, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, a trending topic in geovisual analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM
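As background to the tiled-service side of such a system, the sketch below shows the standard slippy-map (XYZ) tile addressing in Web Mercator for a longitude/latitude at a given zoom level; the TMS convention flips the y index. This is generic tiling arithmetic, not part of the described Terrain Builder.

```python
import math

def xyz_tile(lon, lat, zoom):
    # Standard slippy-map (XYZ) tile indices in Web Mercator.
    # TMS flips the y axis: y_tms = 2**zoom - 1 - y.
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y
```

At zoom 0 the whole world is one tile; at each further zoom level every tile splits into four, which is how a multi-resolution terrain pyramid is addressed.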
Procedia PDF Downloads 430
23515 Pantograph-Catenary Contact Force: Features Evaluation for Catenary Diagnostics
Authors: Mehdi Brahimi, Kamal Medjaher, Noureddine Zerhouni, Mohammed Leouatni
Abstract:
Prognostics and Health Management (PHM) is a system engineering discipline which provides solutions and models for the implementation of predictive maintenance. The approach is based on extracting useful information from monitoring data to assess the “health” state of an industrial equipment or asset. In this paper, we examine multiple features extracted from the pantograph-catenary contact force in order to select the most relevant ones for a diagnostics function. The feature extraction methodology is based on measurement data and on simulation data generated with a pantograph-catenary simulation software called INPAC. The feature extraction method relies on both statistical and signal processing analyses, and the feature selection method is based on statistical criteria.
Keywords: catenary/pantograph interaction, diagnostics, Prognostics and Health Management (PHM), quality of current collection
Procedia PDF Downloads 294
23514 GRABTAXI: A Taxi Revolution in Thailand
Authors: Danuvasin Charoen
Abstract:
The study investigates the business process and business model of GRABTAXI. The paper also discusses how the company implemented strategies to gain competitive advantages. The data are derived from the analysis of secondary data and from in-depth interviews with staff, taxi drivers, and key customers. The findings indicate that the company’s competitive advantages come from being the first mover, emphasising the ease of use and tangible benefits of the application, and using a network effect strategy.
Keywords: taxi, mobile application, innovative business model, Thailand
Procedia PDF Downloads 301
23513 Developing a DNN Model for the Production of Biogas From a Hybrid BO-TPE System in an Anaerobic Wastewater Treatment Plant
Authors: Hadjer Sadoune, Liza Lamini, Scherazade Krim, Amel Djouadi, Rachida Rihani
Abstract:
Deep neural networks are highly regarded for their accuracy in predicting intricate fermentation processes: their ability to learn from large datasets makes them particularly effective models. The primary obstacle in improving their performance is the careful choice of suitable hyperparameters, including the network architecture (number of hidden layers and hidden units), activation function, optimizer, learning rate, and other relevant factors. This study predicts biogas production from real wastewater treatment plant data using hybrid Bayesian optimization with a tree-structured Parzen estimator (BO-TPE) to optimise a deep neural network (DNN) model. The plant uses an Upflow Anaerobic Sludge Blanket (UASB) digester that treats industrial wastewater from soft drinks and breweries. The digester has a working volume of 1574 m3 and a total volume of 1914 m3; its internal diameter and height are 19 and 7.14 m, respectively. Data preprocessing was conducted with meticulous attention to preserving data quality while avoiding data reduction. Three normalization techniques (MinMaxScaler, RobustScaler and StandardScaler) were applied to the pre-processed data and compared with the non-normalized data. The RobustScaler approach showed strong predictive ability for estimating the volume of biogas produced. The highest predicted biogas volume was 2236.105 Nm³/d, with coefficient of determination (R2), mean absolute error (MAE), and root mean square error (RMSE) values of 0.712, 164.610, and 223.429, respectively.
Keywords: anaerobic digestion, biogas production, deep neural network, hybrid BO-TPE, hyperparameters tuning
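For reference, the three scores quoted above (R2, MAE, RMSE) follow the standard regression-metric formulas sketched below; this is a generic illustration, not the authors' evaluation pipeline.

```python
import math

def regression_metrics(y_true, y_pred):
    # R^2 = 1 - SS_res / SS_tot; MAE is the mean absolute residual;
    # RMSE is the root of the mean squared residual.
    n = len(y_true)
    mean_t = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(ss_res / n)
    return r2, mae, rmse
```

Note that RMSE penalizes large residuals more heavily than MAE, which is why the reported RMSE (223.429) exceeds the reported MAE (164.610).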
Procedia PDF Downloads 43
23512 Geographic Information System Application for Predicting Tourism Development in Gunungkidul Regency, Indonesia
Authors: Nindyo Cahyo Kresnanto, Muhamad Willdan, Wika Harisa Putri
Abstract:
Gunungkidul is one of the emerging tourism industry areas in Yogyakarta Province, Indonesia. This article describes how GIS can predict the development of tourism potential in Gunungkidul. The tourism sector in Gunungkidul Regency contributes 3.34% of the total gross regional domestic product and is the economic sector with the highest growth, at 18.37% in the post-COVID-19 period. This contribution led the researchers to consider that several tourist sites need to be explored further to gradually increase regional economic development. The research starts by collecting spatial data on the tourist locations that visitors want to visit in Gunungkidul Regency, based on survey data from 571 respondents; the data are then visualized with ArcGIS software. The visualization ranks tourist destinations from the lowest to the highest level of traveller interest. Based on these results, specific tourist locations can be identified whose development would positively influence the surrounding economy. The data are also visualized as a desire line map showing travel patterns from each tourist’s origin to the destination of interest. From the desire lines, the routes to tourist sites with a high frequency of transportation activity can be predicted. These predictions indicate which heavily burdened routes are likely to be chosen, and such routes need to be improved in terms of capacity and quality. The goal is to provide a sense of security and comfort for tourists who drive and to positively impact the tourist sites traversed by these routes.
Keywords: tourism development, GIS and survey, transportation, potential desire line
Procedia PDF Downloads 70
23511 Nepal Himalaya: Status of Women, Politics, and Administration
Authors: Tulasi Acharya
Abstract:
The paper is a qualitative analysis of the status of women, and of women in politics and administration, in the Nepal Himalaya. It reviews data on women in the civil service and at administrative levels. Looking at Nepali politics and administration from a social constructivist perspective, the paper highlights social and cultural issues that have othered women as the “second sex.” As the country heads towards modernity, gender-friendly approaches are being instituted. Although the data reflect progress in women’s status and in women’s political and administrative participation, they are not sufficient to predict democratic gender practices at political and administrative levels. The political and administrative culture of the Nepal Himalaya should be changed by promoting gender practices and deconstructing gender images in administrative culture through representative bureaucracy and the introduction of democratic policies.
Keywords: politics, policy, administration, culture, women, Nepal, democracy
Procedia PDF Downloads 540
23510 Factors Associated with Acute Kidney Injury in Multiple Trauma Patients with Rhabdomyolysis
Authors: Yong Hwang, Kang Yeol Suh, Yundeok Jang, Tae Hoon Kim
Abstract:
Introduction: Rhabdomyolysis is a syndrome characterized by muscle necrosis and the release of intracellular muscle constituents into the circulation. Acute kidney injury (AKI) is a potential complication of severe rhabdomyolysis, and the prognosis is substantially worse if renal failure develops. We tried to identify the factors that were predictive of AKI in severe trauma patients with rhabdomyolysis. Methods: This retrospective study was conducted at the emergency department of a level Ⅰ trauma center. Patients were enrolled from Oct. 2012 to June 2016 if they were older than 18 years, had acute multiple trauma, and had initial creatine phosphokinase (CPK) levels higher than 1000 IU. We collected demographic data (age, gender, length of hospital stay, and patient outcome), laboratory data (ABGA, lactate, hemoglobin, hematocrit, platelets, LDH, myoglobin, liver enzymes, and BUN/Cr), and clinical data (injury mechanism, RTS, ISS, AIS, and TRISS). The data were compared and analyzed between the AKI and non-AKI groups. Statistical analyses were performed using IBM SPSS 20.0 Statistics for Windows. Results: Three hundred sixty-four patients were enrolled, of whom ninety-six were in the AKI group and two hundred sixty-eight in the non-AKI group. Among the laboratory data, base excess (HCO3), AST/ALT, LDH, and myoglobin were significantly higher in the AKI group than in the non-AKI group (p ≤ 0.05). Among the clinical data, the Injury Severity Score (ISS), Revised Trauma Score (RTS), and Abbreviated Injury Scale 3 and 4 (AIS 3 and 4) showed significant differences. CPK levels increased over the first and second days but decreased slightly from the third day in both groups. Seven patients in the AKI group received hemodialysis treatment despite the bleeding risk, and all survived.
Conclusion: We recommend that HCO3, CPK, LDH, and myoglobin be checked, and that ISS, RTS, and AIS together with the injury mechanism be considered, at the early stage of treatment in the emergency department.
Keywords: acute kidney injury, emergencies, multiple trauma, rhabdomyolysis
Procedia PDF Downloads 342
23509 Data Structure Learning Platform to Aid in Higher Education IT Courses (DSLEP)
Authors: Estevan B. Costa, Armando M. Toda, Marcell A. A. Mesquita, Jacques D. Brancher
Abstract:
The advances in technology over the last five years have allowed improvements in education, such as the increasing development of educational software. One of the techniques that emerged in this period is gamification: the use of video game mechanics outside game contexts. Recent studies of this technique report positive results in areas such as marketing, health and education. In education, studies cover everything from elementary to higher education, with many variations adapted to educators’ methodologies. Within higher education IT courses, data structures are an important subject, as they are the basis for many systems. Against this background, this paper presents the development of an interactive web learning environment, called DSLEP (Data Structure Learning Platform), to aid students in higher education IT courses. The system covers the basic concepts of the subject, such as stacks, queues, lists, arrays and trees, and was implemented to ease the insertion of new structures. It also implements gamification concepts, such as points, levels, and leaderboards, to engage students in the search for knowledge and stimulate self-learning.
Keywords: gamification, interactive learning environment, data structures, e-learning
Procedia PDF Downloads 499
23508 Hidden Markov Movement Modelling with Irregular Data
Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith
Abstract:
Hidden Markov models have become popular for the analysis of animal tracking data and are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations have regular time steps, which in many ecological studies will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and to investigate the effect on the inferences that can be made about the latent states. A modification of the likelihood function to allow for these irregularly spaced locations is investigated, without using interpolation or averaging of the movement rate. The suitability of the modification is assessed using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night and few during the day. Many nocturnal predator tracking studies are set up in this way, obtaining many locations at night, when the animal is most active and difficult to observe, and few during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular hidden Markov model framework to be used to model these irregularly spaced locations, making use of all the observed data.
Keywords: hidden Markov models, irregular observations, animal movement modelling, nocturnal predator
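One common way to picture such a modification is a forward-algorithm recursion in which the transition over a gap of k unit time steps is the unit-step transition matrix raised to the k-th power, so missing fixes need no interpolation. The sketch below is a toy illustration of that idea under our own assumptions (integer gaps, precomputed emission likelihoods), not the authors' likelihood.

```python
import math

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def mat_pow(p, k):
    out = [[float(i == j) for j in range(len(p))] for i in range(len(p))]
    for _ in range(k):
        out = mat_mul(out, p)
    return out

def forward_loglik(emis, gaps, p, init):
    # emis[t][s]: likelihood of fix t under latent state s
    # gaps[t]: whole unit-time steps since fix t-1 (gaps[0] unused)
    states = range(len(init))
    alpha = [init[s] * emis[0][s] for s in states]
    norm = sum(alpha)
    ll = math.log(norm)
    alpha = [a / norm for a in alpha]
    for t in range(1, len(emis)):
        pt = mat_pow(p, gaps[t])  # transition over the irregular gap
        alpha = [sum(alpha[i] * pt[i][j] for i in states) * emis[t][j]
                 for j in states]
        norm = sum(alpha)
        ll += math.log(norm)
        alpha = [a / norm for a in alpha]
    return ll
```

For continuous-time gaps, a generator matrix and matrix exponential would play the role of the integer power, but the recursion has the same shape.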
Procedia PDF Downloads 249
23507 A Review of Lortie’s Schoolteacher
Authors: Tsai-Hsiu Lin
Abstract:
Dan C. Lortie’s Schoolteacher: A Sociological Study is one of the best works on the sociology of teaching since W. Waller’s classic study, and a book worthy of review. Working in the tradition of the symbolic interactionists, Lortie demonstrated their qualities in his study of the occupation of teaching. Using several methods to gather effective data, Lortie portrayed the ethos of the teaching profession; the work is therefore an important book on the teaching profession and teacher culture. Though outstanding, Lortie’s work is also flawed in that his perspectives and methodology were adopted largely from symbolic interactionism. First, Lortie analyzed many points regarding teacher culture; for example, he was interested in exploring “sentiment,” “cathexis,” and “ethos,” making him more a psychologist than a sociologist. Second, symbolic interactionism led him to examine teacher culture from a micro view, thereby missing its structural aspects: for example, he did not fully discuss the issue of gender, and he ignored the issue of race. Finally, following the qualitative sociological tradition, Lortie employed many qualitative methods to gather data but focused only on obtaining and presenting interview data, and the measurement methods he used were too simplistic to analyze quantitative data fully.
Keywords: education reform, teacher culture, teaching profession, Lortie’s Schoolteacher
Procedia PDF Downloads 234
23506 Urban Areas Management in Developing Countries: Analysis of the Urban Areas Crossed with Risk of Storm Water Drains, Aswan-Egypt
Authors: Omar Hamdy, Schichen Zhao, Hussein Abd El-Atty, Ayman Ragab, Muhammad Salem
Abstract:
One of the riskiest areas in Aswan is Abouelreesh, which suffers from flood disasters as heavy deluges inundate urban areas, causing considerable damage to buildings and infrastructure; moreover, urban sprawl has continued towards this risky area. This paper aims to identify the urban areas located in zones prone to flash floods. Analyzing this phenomenon needs a lot of data to ensure satisfactory results; however, in this case the official data and field data were limited, and therefore free sources of satellite data were used. ArcGIS tools were used to obtain the storm water drains network by analyzing DEM files. Additionally, historical imagery in Google Earth was studied to determine the age of each building. The last step was to overlay the urban area layer and the storm water drains layer to identify the vulnerable areas. The results of this study would help urban planners and government officials to estimate disaster risk and develop preliminary plans to recover the risky area, especially the urban areas located in torrent paths.
Keywords: risk area, DEM, storm water drains, GIS
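The drainage-tracing step can be pictured with the D8 rule that underlies flow-direction rasters in GIS packages such as ArcGIS: each DEM cell drains to the neighbour with the steepest descent, and chaining these directions yields the storm water drain network. The code below is a schematic illustration of that rule, not the paper's actual toolchain.

```python
def d8_direction(dem, r, c):
    # Return the (row, col) of the steepest-descent neighbour of cell
    # (r, c), dividing the elevation drop by the neighbour distance
    # (1 for edge neighbours, sqrt(2) for diagonals); None means a pit
    # or flat cell with no downhill neighbour.
    best, steepest = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5
                drop = (dem[r][c] - dem[rr][cc]) / dist
                if drop > steepest:
                    best, steepest = (rr, cc), drop
    return best
```

Following `best` from cell to cell traces a flow path; cells with many upstream contributors mark the drain lines overlaid on the urban layer.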
Procedia PDF Downloads 463
23505 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network
Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Sadiye Didem Boztepe Erkis, Ismail Saritas, Selma Tasdemir
Abstract:
Artificial intelligence applications are commonly used in many fields of industry, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and experiments on polished materials were carried out with this setup. Using the experimental data, an artificial neural network (ANN) was modeled to evaluate the final cross-sections of the wooden samples remaining after the fire. The model takes the initial weight of the sample (ws, g), the preliminary cross-section (pcs, mm2), the fire time (ft, minutes), and the fire temperature (t, °C) as input parameters, and the final cross-section (fcs, mm2) as the output parameter. When the ANN results and the experimental data are compared statistically, the two groups are found to be coherent, with no significant difference between them. As a result, ANN can safely be used to determine the cross-sections of wooden materials after fire, avoiding many disadvantages of purely experimental determination.
Keywords: artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance
Procedia PDF Downloads 389
23504 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date
Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian
Abstract:
To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) to each patient admitted to the hospital. This assignment is largely based on the doctor’s judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the length of stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best performing model was able to predict the EDD with a 38% reduction in Average Squared Error (ASE) compared to the first EDD determined by the present method. Important predictors of the EDD include the provisional diagnosis code, the patient’s age, the attending doctor at admission, the medical specialty at admission, the accommodation type, and the patient’s mean length of stay over the past year. The predictive model can be used as a tool to accurately predict the EDD.
Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven
Procedia PDF Downloads 176
23503 A Method to Estimate Wheat Yield Using Landsat Data
Authors: Zama Mahmood
Abstract:
Given the increasing demands of food management, monitoring crop growth and forecasting yield well before harvest are very important. These days, yield assessment, together with monitoring of crop development and growth, is carried out with the help of satellite and remote sensing images. Studies using remote sensing data along with field survey validation have reported high correlations between vegetation indices and yield. With the development of remote sensing techniques, crop detection using remote sensing data on regional or global scales has become a popular topic in remote sensing applications. Punjab, especially the southern Punjab region, is extremely favourable for wheat production, but measuring the exact amount of wheat production is a tedious job for farmers and workers using traditional ground-based measurements, whereas remote sensing can provide near-real-time information. In this study, using the Normalized Difference Vegetation Index (NDVI) derived from Landsat satellite images, the yield of wheat was estimated for the 2013-2014 season over the agricultural area around Bahawalpur. The average yield of the wheat was found to be 35 kg/acre by analysing field survey data, and the field survey data are in fair agreement with the NDVI values extracted from the Landsat images. A correlation between wheat production (tons) and the number of wheat pixels has also been calculated, showing a proportional pattern. A strong correlation between NDVI and wheat area was also found (R2 = 0.71), which demonstrates the effectiveness of remote sensing tools for crop monitoring and production estimation.
Keywords: landsat, NDVI, remote sensing, satellite images, yield
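For reference, NDVI is computed per pixel as (NIR - Red) / (NIR + Red); for Landsat 8 the NIR and red reflectances are bands 5 and 4. The list-based sketch below stands in for the raster arithmetic and is not the authors' processing chain.

```python
def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red), computed per pixel from
    # surface reflectance; guard against division by zero on
    # masked or empty pixels.
    return [(n - r) / (n + r) if (n + r) != 0 else 0.0
            for n, r in zip(nir, red)]
```

Values near 1 indicate dense green vegetation (e.g. healthy wheat), values near 0 bare soil, and negative values water, which is what allows wheat pixels to be counted and regressed against yield.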
Procedia PDF Downloads 337
23502 Energy Security and Sustainable Development: Challenges and Prospects
Authors: Abhimanyu Behera
Abstract:
Over the past few years, energy security and sustainable development have moved rapidly up the global agenda, for two main reasons: first, the impact of high and often volatile energy prices, and second, concerns over environmental sustainability, particularly the global climate. Both issues are critically important in economies where impressive growth has boosted the demand for energy and put corresponding strains on the environment. Energy security is a broad concept that focuses on energy availability and pricing. Specifically, it refers to the ability of the energy supply system, i.e. suppliers, transporters, distributors and regulatory, financial and R&D institutions, to deliver the amount of competitively priced energy that customers demand, within accepted standards of reliability, timeliness, quality and safety. Traditionally, energy security has been defined in the context of the geopolitical risks to external oil supplies, but today it encompasses all energy forms, all the external and internal links bringing energy to the final consumer, and all the many ways energy supplies can be disrupted, including equipment malfunctions, system design flaws, operator errors, malicious computer activities, deficient market and regulatory frameworks, corporate financial problems, labour actions, severe weather and natural events, aggressive acts (e.g. war, terrorism and sabotage), and geopolitical disruptions. In practice, the most challenging disruptions are those linked to: 1) extreme weather events; 2) mismatched electricity supply and demand; 3) regulatory failures; and 4) the concentration of oil and gas resources in certain regions of the world. However, insecure energy supplies inhibit development by raising energy costs and imposing expensive cuts in services when disruptions actually occur.
The energy supply sector can best advance sustainable development by producing and delivering secure and environmentally friendly sources of energy and by increasing the efficiency of energy use. With this objective, this paper seeks to highlight the significance of energy security and sustainable development in today’s world. Moreover, it critically reviews the major challenges to the sustainability of energy security and lucidly explicates the major policies taken by governments to overcome these challenges.
Keywords: energy, policies, security, sustainability
Procedia PDF Downloads 391
23501 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges in increasing its speed and data-processing capacities while at the same time trying to keep its devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of this kind of facility use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting existing ones would be a great advance for this industry. The installation of a temperature sensor matrix distributed within the structure of each server would provide the data required to obtain a temperature profile inside them instantly. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and the approach is expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation methods, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers’ equation by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the Burgers’ equation obtained after these simplifications are the key to obtaining results with greater or lesser accuracy, depending on the characteristic truncation error. Keywords: Burgers' equation, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
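The backward/forward/central discretization approach the abstract describes can be illustrated with a minimal finite-difference sketch of the 1D viscous Burgers' equation. The grid size, viscosity, initial profile, and time-step choice below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def burgers_step(u, dx, dt, nu):
    """Advance u_t + u*u_x = nu*u_xx one explicit time step:
    backward (upwind) difference for the convective term,
    central difference for the diffusive term."""
    un = u.copy()
    u[1:-1] = (un[1:-1]
               - dt / dx * un[1:-1] * (un[1:-1] - un[:-2])            # backward difference
               + nu * dt / dx**2 * (un[2:] - 2*un[1:-1] + un[:-2]))   # central difference
    return u  # boundary values are held fixed (Dirichlet)

# toy setup: a raised plateau smoothing out under convection and diffusion
nx, nu = 101, 0.07
dx = 2.0 / (nx - 1)
dt = 0.2 * dx**2 / nu              # conservative explicit stability choice
u = np.ones(nx)
u[int(0.5 / dx):int(1.0 / dx) + 1] = 2.0
for _ in range(200):
    u = burgers_step(u, dx, dt, nu)
```

With both difference coefficients kept small, each update is a convex combination of neighboring values, so the solution stays bounded between its initial extremes; the truncation error the abstract mentions is first-order for the upwind term and second-order for the central term.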
Procedia PDF Downloads 176
23500 Potential of Detailed Environmental Data, Produced by Information and Communication Technology Tools, for Better Consideration of Microclimatology Issues in Urban Planning to Promote Active Mobility
Authors: Živa Ravnikar, Alfonso Bahillo Martinez, Barbara Goličnik Marušić
Abstract:
Climate change mitigation has been formally adopted and announced by countries across the globe, where cities are targeting carbon neutrality through various more or less successful, systematic, and fragmentary actions. The article is based on the fact that environmental conditions affect human comfort and the usage of space. Urban planning can, with its sustainable solutions, not only support climate mitigation in terms of reducing global warming at the planetary scale but also enable natural processes that, in the immediate vicinity, produce environmental conditions encouraging people to walk or cycle. However, the article draws attention to the importance of integrating climate considerations into urban planning, where detailed environmental data play a key role, enabling urban planners to improve or monitor environmental conditions on cycle paths. On a practical level, this paper tests a particular ICT tool, a prototype used to gather environmental data. Data gathering was performed along the cycling lanes of Ljubljana (Slovenia), where the main objective was to assess the applicable value of the tool's data within the planning of comfortable cycling lanes. The results suggest that such transportable devices for in-situ measurements can help a researcher interpret detailed environmental information, characterized by fine granularity and precise spatial and temporal resolution. Data can be interpreted within human comfort zones, with graphical representation in the form of a map, enabling environmental conditions to be linked with the spatial context. The paper also provides preliminary results on the potential of such tools for identifying correlations between environmental conditions and different spatial settings, which can help urban planners prioritize interventions by place.
The paper contributes to multidisciplinary approaches as it demonstrates the usefulness of such fine-grained data for better consideration of microclimatology in urban planning, which is a prerequisite for creating climate-comfortable cycling lanes promoting active mobility. Keywords: information and communication technology tools, urban planning, human comfort, microclimate, cycling lanes
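As a rough illustration of interpreting such readings within human comfort zones, the sketch below classifies individual (temperature, humidity) samples logged along a route; the thresholds are hypothetical placeholders, since the abstract does not state the study's comfort-zone boundaries:

```python
# hypothetical comfort thresholds; the study's actual comfort zones are not given
def comfort_class(temp_c, rel_humidity):
    """Classify one sensor reading into a coarse comfort category."""
    if 18 <= temp_c <= 26 and 30 <= rel_humidity <= 65:
        return "comfortable"
    if temp_c > 26 or rel_humidity > 65:
        return "warm/humid"
    return "cool/dry"

# readings logged along a cycling lane: (temperature in deg C, relative humidity in %)
readings = [(21.0, 50), (28.5, 70), (15.0, 40)]
classes = [comfort_class(t, h) for t, h in readings]
# ['comfortable', 'warm/humid', 'cool/dry']
```

Paired with the GPS position of each sample, such per-point classes are what a comfort map of a cycling lane would be built from.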
Procedia PDF Downloads 138
23499 Image Ranking to Assist Object Labeling for Training Detection Models
Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman
Abstract:
Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on a U-shaped architecture, which quantifies the presence of unseen data in each image in order to find the images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Similar performance is also achieved compared to a model trained on an exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model. Keywords: computer vision, deep learning, object detection, semiconductor
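The iterative select-label-retrain loop described above can be sketched as follows; the random novelty scores are stand-ins for the output of the paper's U-shaped scoring network, whose details are not given:

```python
import numpy as np

def select_batch(novelty_scores, batch_size):
    """Return indices of the most novel unlabeled images (highest score first)."""
    return np.argsort(novelty_scores)[::-1][:batch_size]

# toy loop: novelty is recomputed after each labeling round, since newly
# labeled examples change what the model considers "unseen"
rng = np.random.default_rng(0)
unlabeled = list(range(20))   # image ids awaiting annotation
labeled = []
for round_ in range(3):
    scores = rng.random(len(unlabeled))        # stand-in novelty scores
    picks = select_batch(scores, batch_size=4)
    for i in sorted(picks, reverse=True):      # pop from the end first
        labeled.append(unlabeled.pop(i))
```

Each round moves the four most novel images into the labeled pool, so after three rounds 12 of the 20 images are labeled, prioritized by estimated novelty rather than list order.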
Procedia PDF Downloads 142
23498 From Text to Data: Sentiment Analysis of Presidential Election Political Forums
Authors: Sergio V Davalos, Alison L. Watkins
Abstract:
User-generated content (UGC) such as website posts has data associated with it: time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words). In addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences across all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis. The application of SASA resulted in a sentiment score for each post. Based on these scores, there are significant differences between the content and sentiment of the 2012 and 2016 presidential election forums. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment. In the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time. For both elections, as the election drew closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidates. In the case of Trump, there were more negative posts than Clinton's largest category of posts, which were positive. KNIME topic modeling was used to derive topics from the posts. There were also changes in topics and keyword emphasis over time. Initially, the political parties were the most referenced, and as the election drew closer the emphasis shifted to the candidates.
The SASA method proved to predict sentiment better than four other methods in the SentiBench benchmark. The research resulted in deriving sentiment data from text. In combination with other data, the sentiment data provided insight and discovery about user sentiment in the US presidential elections of 2012 and 2016. Keywords: sentiment analysis, text mining, user generated content, US presidential elections
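A minimal sketch of per-post scoring and the cumulative sentiment tracking described above; the tiny lexicon and its scores are purely illustrative stand-ins, not SASA's actual model:

```python
# toy lexicon; illustrative only, SASA's real scoring is a trained model
LEXICON = {"great": 1, "win": 1, "good": 1, "bad": -1, "terrible": -1, "lose": -1}

def post_sentiment(text):
    """Sum lexicon scores over the terms of one post (post = bag of terms)."""
    return sum(LEXICON.get(t, 0) for t in text.lower().split())

# three posts in time order, e.g. from one forum thread
posts = ["Great debate a good win", "terrible plan", "bad night bad crowd"]
scores = [post_sentiment(p) for p in posts]   # per-post sentiment scores
cumulative = []
total = 0
for s in scores:                              # running total over time,
    total += s                                # as tracked across the campaign
    cumulative.append(total)
```

The running total is the quantity the abstract tracks: here it starts positive and declines toward zero as negative posts accumulate, mirroring the pattern observed as the elections approached.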
Procedia PDF Downloads 195
23497 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility
Authors: Alejandro Villegas, Cihan Varol
Abstract:
Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without user names and passwords, this forensically important information can still be gathered by investigators. Although there are a few VoIP forensic investigative applications available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data-gathering solution that retrieves usernames and contact lists, as well as call and SMS logs, from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs, and user roaming profiles on Windows and Mac OS X operating systems. Subsequently, it parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, owing to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper describes the exploratory analysis performed to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU. Keywords: betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise
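A sketch of the kind of log parsing such a report utility performs; the directory candidates and the call-log format below are hypothetical, as the abstract does not disclose CVOIP-FRU's actual search paths or Betamax log layouts:

```python
import re
from pathlib import Path

# hypothetical artifact locations; real Betamax clients use product-specific paths
CANDIDATE_DIRS = [
    Path.home() / "AppData/Roaming",              # Windows roaming profiles
    Path.home() / "Library/Application Support",  # Mac OS X equivalent
]

# hypothetical plain-text call-log line format
CALL_LOG = re.compile(r"call to (\S+) duration (\d+)s")

def parse_call_log(text):
    """Extract (callee, seconds) pairs from a plain-text call log."""
    return [(who, int(sec)) for who, sec in CALL_LOG.findall(text)]

sample = "call to alice@example.com duration 42s\ncall to bob@example.com duration 7s"
records = parse_call_log(sample)
```

A real utility would walk `CANDIDATE_DIRS` for each supported product, apply per-product parsers like `parse_call_log`, and render the pooled records as text and HTML reports.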
Procedia PDF Downloads 301
23496 Ion Thruster Grid Lifetime Assessment Based on Its Structural Failure
Authors: Juan Li, Jiawen Qiu, Yuchuan Chu, Tianping Zhang, Wei Meng, Yanhui Jia, Xiaohui Liu
Abstract:
This article developed a 3D numerical model of sputter erosion depth in an ion thruster optical system using the IFE-PIC (Immersed Finite Element Particle-in-Cell) and Monte Carlo methods, and calculated the sputter erosion rate of the downstream surface of the accelerator grid. Compared with LIPS-200 life test data, the results of the numerical model are in reasonable agreement with the measured data. Finally, we predict the lifetime of the 20 cm diameter ion thruster via the erosion data obtained with the model. The result demonstrates that, under normal operating conditions, the erosion rate of the grooves worn into the downstream surface of the accelerator grid is 34.6 μm/1000 h, which means the conservative lifetime until structural failure of the accelerator grid is 11500 hours. Keywords: ion thruster, accelerator grid, sputter erosion, lifetime assessment
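As a consistency check on the reported figures (not a calculation from the paper), the stated erosion rate and predicted lifetime imply roughly 0.4 mm of material removed from the downstream surface at end of life:

```python
EROSION_RATE_UM_PER_KH = 34.6   # groove erosion rate from the abstract, in um per 1000 h
LIFETIME_H = 11_500             # predicted lifetime until structural failure, in hours

# implied groove depth at end of life; the grid's actual failure-depth
# criterion is not stated in the abstract
depth_at_eol_um = EROSION_RATE_UM_PER_KH * LIFETIME_H / 1000
```

That depth (about 398 μm) would correspond to the erosion allowance of the accelerator grid under the model's structural-failure criterion.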
Procedia PDF Downloads 569
23495 Identification and Understanding of Colloidal Destabilization Mechanisms in Geothermal Processes
Authors: Ines Raies, Eric Kohler, Marc Fleury, Béatrice Ledésert
Abstract:
In this work, the impact of clay minerals on the formation damage of sandstone reservoirs is studied to provide a better understanding of deep geothermal reservoir permeability reduction due to fine-particle dispersion and migration. In some situations, despite the presence of filters in the geothermal loop at the surface, particles smaller than the filter size (<1 µm) may surprisingly generate significant permeability reduction, affecting the overall performance of the geothermal system in the long term. Our study is carried out on cores from a Triassic reservoir in the Paris Basin (Feigneux, 60 km northeast of Paris). To first identify the clays responsible for clogging, a mineralogical characterization of these natural samples was carried out by coupling X-Ray Diffraction (XRD), Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Spectroscopy (EDS). The results show that the studied stratigraphic interval contains mostly illite and chlorite particles. Moreover, the spatial arrangement of the clays in the rocks, as well as the morphology and size of the particles, suggests that illite is more easily mobilized by flow in the pore network than chlorite. Thus, based on these results, illite particles were prepared and used in core flooding in order to better understand the factors leading to the aggregation and deposition of this type of clay particle in geothermal reservoirs under various physicochemical and hydrodynamic conditions. First, the stability of illite suspensions under geothermal conditions was investigated using different characterization techniques, including Dynamic Light Scattering (DLS) and Scanning Transmission Electron Microscopy (STEM). Various parameters such as the hydrodynamic radius (around 100 nm) and the morphology and surface area of aggregates were measured.
Then, core-flooding experiments were carried out using sand columns to mimic the permeability decline due to the injection of illite-containing fluids in sandstone reservoirs. In particular, the effects of ionic strength, temperature, particle concentration and flow rate of the injected fluid were investigated. When the ionic strength increases, a permeability decline of more than a factor of 2 could be observed for pore velocities representative of in-situ conditions. Further details of the retention of particles in the columns were obtained from Magnetic Resonance Imaging and X-ray Tomography techniques, showing that the particle deposition is nonuniform along the column. It is clearly shown that very fine particles as small as 100 nm can generate significant permeability reduction under specific conditions in high-permeability porous media representative of the Triassic reservoirs of the Paris basin. These retention mechanisms are explained in the general framework of the DLVO theory. Keywords: geothermal energy, reinjection, clays, colloids, retention, porosity, permeability decline, clogging, characterization, XRD, SEM-EDS, STEM, DLS, NMR, core flooding experiments
Procedia PDF Downloads 180
23494 Nutrient Foramina of the Lunate Bone of the Hand – an Anatomical Study
Authors: P.J. Jiji, B.V. Murlimanju, Latha V. Prabhu, Mangala M. Pai
Abstract:
Background: Lunate bone dislocation can lead to compression of the median nerve and subsequent carpal tunnel syndrome. The dislocation can also interrupt the vasculature and cause avascular necrosis. The objective of the present study was to examine the morphology and number of the nutrient foramina in dried cadaveric lunate bones of the Indian population. Methods: The present study included 28 lunate bones (13 right-sided and 15 left-sided) obtained from the gross anatomy laboratory of our institution. The bones were macroscopically observed for nutrient foramina, and data were collected with respect to their number. Tabulation of the data and analysis were done. Results: All of our specimens (100%) exhibited nutrient foramina over the non-articular surfaces. The foramina were observed only over the palmar and dorsal surfaces of the lunate bones. The number of foramina per bone ranged between 2 and 10. The foramina were more numerous over the dorsal surface (average number 3.3) than over the palmar surface (average number 2.4). Conclusion: We believe that the present study has provided important data about the nutrient foramina of the lunate bones. These data are enlightening to the orthopedic surgeon and would help in hand surgeries. Morphological knowledge of the vasculature, the foramina of entry and their number is required to understand the concepts in lunatomalacia and Kienbock’s disease. Keywords: avascular necrosis, foramen, lunate, nutrient
Procedia PDF Downloads 248
23493 Big Data Applications for the Transport Sector
Authors: Antonella Falanga, Armando Cartenì
Abstract:
Today, an unprecedented amount of data coming from several sources, including mobile devices, sensors, tracking systems, and online platforms, characterizes our lives. The term “big data” refers not only to the quantity of data but also to the variety and speed of data generation. These data hold valuable insights that, when extracted and analyzed, facilitate informed decision-making. The 4Vs of big data - velocity, volume, variety, and value - highlight essential aspects, showcasing the rapid generation, vast quantities, diverse sources, and potential value addition of these kinds of data. This surge of information has revolutionized many sectors: business, for improving decision-making processes; healthcare, for clinical record analysis and medical research; education, for enhancing teaching methodologies; agriculture, for optimizing crop management; finance, for risk assessment and fraud detection; media and entertainment, for personalized content recommendations; emergency management, for real-time response during crises and events; and mobility, for urban planning and the design/management of public and private transport services. Big data's pervasive impact enhances societal aspects, elevating the quality of life, service efficiency, and problem-solving capacities. However, during this transformative era, new challenges arise, including data quality, privacy, data security, cybersecurity, interoperability, the need for advanced infrastructures, and staff training. Within the transportation sector (the one investigated in this research), applications span the planning, design, and management of systems and mobility services. Among the most common big data applications within the transport sector are real-time traffic monitoring, bus/freight vehicle route optimization, vehicle maintenance, road safety, and autonomous and connected vehicle applications. Benefits include reductions in travel times, road accidents and pollutant emissions.
Within these issues, proper transport demand estimation is crucial for sustainable transportation planning. Evaluating the impact of sustainable mobility policies starts with a quantitative analysis of travel demand, and achieving transportation decarbonization goals hinges on precise estimations of demand for individual transport modes. Emerging technologies, offering substantial big data at lower costs than traditional methods, play a pivotal role in this context. Starting from these considerations, this study explores the usefulness of big data for transport demand estimation. The research focuses on leveraging (big) data collected during the COVID-19 pandemic to estimate the evolution of mobility demand in Italy. Estimation results reveal, in the post-COVID-19 era, more than 96 million national daily trips, about 2.6 trips per capita, with a mobile population of more than 37.6 million Italian travelers per day. Overall, this research allows us to conclude that big data enhance rational decision-making for mobility demand estimation, which is imperative for adeptly planning and allocating investments in transportation infrastructures and services. Keywords: big data, cloud computing, decision-making, mobility demand, transportation
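The reported demand figures can be sanity-checked against each other; treating "per capita" as trips per daily traveler is our assumption here:

```python
daily_trips = 96_000_000        # national daily trips, post-COVID-19 estimate
mobile_population = 37_600_000  # Italian travelers per day

# implied trips per daily traveler; the abstract reports "about 2.6"
trips_per_traveler = daily_trips / mobile_population
```

The quotient is about 2.55, which is consistent with the rounded "about 2.6 trips per capita" in the text.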
Procedia PDF Downloads 66
23492 Occurrence and Levels of Mycotoxins in On-Farm Stored Sesame in Major-Growing Districts of Ethiopia
Authors: S. Alemayehu, F. A. Abera, K. M. Ayimut, R. Mahroof, J. Harvey, B. Subramanyam
Abstract:
The occurrence of mycotoxins in sesame seeds poses a significant threat to food safety and the economy in Ethiopia. This study aimed to determine the levels and occurrence of mycotoxins in on-farm stored sesame seeds in major-growing districts of Ethiopia. A total of 470 sesame seed samples were collected from randomly selected farmers' storage structures in five major-growing districts using purposive sampling techniques. An enzyme-linked immunosorbent assay (ELISA) was used to analyze the collected samples for the presence of four mycotoxins: total aflatoxins (AFT), ochratoxin A (OTA), total fumonisins (FUM), and deoxynivalenol (DON). The study found that all samples contained varying levels of mycotoxins, with AFT and DON being the most prevalent. AFT concentrations in detected samples ranged from 2.5 to 27.8 parts per billion (ppb), with a mean concentration of 13.8 ppb. OTA levels ranged from 5.0 ppb to 9.7 ppb, with a mean level of 7.1 ppb. Total fumonisin concentrations ranged from 300 to 1300 ppb, with a mean of 800 ppb. DON concentrations ranged from 560 to 700 ppb in the analyzed samples. The majority (96.8%) of the samples had AFT, FUM, and DON mean levels within the US Food and Drug Administration maximum limits. The AFT-OTA, DON-OTA, AFT-FUM, FUM-DON, and FUM-OTA pairs had mycotoxin co-occurrence rates of 44.0, 38.3, 33.8, 30.2, 29.8 and 26.0%, respectively. On average, 37.2% of the sesame samples had fungal infection, and seed germination rates ranged from 66.8% to 91.1%. The Limmu district had higher levels of total aflatoxins and kernel infection and lower germination rates than other districts. The Wollega variety of sesame had higher kernel infection and total aflatoxin concentration and lower germination rates than other varieties. Grain age had a statistically significant (p<0.05) effect on both kernel infection and germination.
The storage methods used for sesame in major-growing districts of Ethiopia favor mycotoxin-producing fungi. As the levels of mycotoxins in sesame are of public health significance, stakeholders should come together to identify secure and suitable storage technologies to maintain the quantity and quality of sesame at the level of smallholder farmers. This study suggests the need for suitable storage technologies to maintain the quality of sesame and reduce the risk of mycotoxin contamination. Keywords: districts, seed germination, kernel infection, moisture content, relative humidity, temperature
Procedia PDF Downloads 139
23491 ISME: Integrated Style Motion Editor for 3D Humanoid Character
Authors: Ismahafezi Ismail, Mohd Shahrizal Sunar
Abstract:
The motion of a realistic 3D humanoid character is very important, especially for industries developing computer animations and games. However, this type of motion involves very complex, high-dimensional data describing body position, orientation, and joint rotation. Integrated Style Motion Editor (ISME) is a method used to alter the 3D humanoid motion capture data utilised in computer animation and games development. Therefore, this study was carried out with the purpose of demonstrating a method that is able to manipulate and deform different motion styles by integrating a Key Pose Deformation Technique and a Trajectory Control Technique. This motion editing method allows the user to generate new motions from the original motion capture data using a simple interface control. Unlike previous methods, our method produces a realistic humanoid motion style in real time. Keywords: computer animation, humanoid motion, motion capture, motion editing
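A minimal sketch in the spirit of key-pose blending; the joint-angle representation and values are illustrative, and ISME's actual deformation model is not specified in the abstract (production editors typically interpolate quaternions via slerp rather than the linear blend shown, to avoid Euler-angle artifacts):

```python
import numpy as np

def blend_pose(pose_a, pose_b, t):
    """Linearly blend two joint-rotation poses (Euler angles, degrees),
    a stand-in for deforming a motion between two key poses."""
    return (1 - t) * pose_a + t * pose_b

# hypothetical joint angles for one joint at two key poses
key_a = np.array([0.0, 10.0, 0.0])
key_b = np.array([90.0, 10.0, -30.0])

mid = blend_pose(key_a, key_b, 0.5)   # pose halfway between the keys
```

Sweeping `t` from 0 to 1 over a frame range, and applying such a blend per joint, is the simplest way a new motion can be generated from existing capture data through a single interface parameter.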
Procedia PDF Downloads 384