Search results for: data mart
23614 Factors Associated with Acute Kidney Injury in Multiple Trauma Patients with Rhabdomyolysis
Authors: Yong Hwang, Kang Yeol Suh, Yundeok Jang, Tae Hoon Kim
Abstract:
Introduction: Rhabdomyolysis is a syndrome characterized by muscle necrosis and the release of intracellular muscle constituents into the circulation. Acute kidney injury (AKI) is a potential complication of severe rhabdomyolysis, and the prognosis is substantially worse if renal failure develops. We aimed to identify the factors predictive of AKI in severe trauma patients with rhabdomyolysis. Methods: This retrospective study was conducted at the emergency department of a level I trauma center. Patients aged 18 years or older with acute multiple trauma and an initial creatine phosphokinase (CPK) level higher than 1000 IU were enrolled from Oct. 2012 to June 2016. We collected demographic data (age, gender, length of hospital stay, and patient outcome), laboratory data (ABGA, lactate, hemoglobin, hematocrit, platelet, LDH, myoglobin, liver enzymes, and BUN/Cr), and clinical data (injury mechanism, RTS, ISS, AIS, and TRISS). The data were compared and analyzed between the AKI and non-AKI groups. Statistical analyses were performed using IBM SPSS Statistics 20.0 for Windows. Results: Three hundred sixty-four patients were enrolled: ninety-six in the AKI group and two hundred sixty-eight in the non-AKI group. Among the laboratory data, base excess (HCO3), AST/ALT, LDH, and myoglobin were significantly higher in the AKI group than in the non-AKI group (p ≤ 0.05). Among the clinical data, the Injury Severity Score (ISS), Revised Trauma Score (RTS), and Abbreviated Injury Scale 3 and 4 (AIS 3 and 4) showed significant differences. CPK levels increased over the first and second days but decreased slightly from the third day in both groups. In the AKI group, seven patients received hemodialysis treatment despite the bleeding risk, and all of them survived.
Conclusion: We recommend that HCO3, CPK, LDH, and myoglobin be checked, and that ISS, RTS, and AIS, together with the injury mechanism, be considered at the early stage of treatment in the emergency department.

Keywords: acute kidney injury, emergencies, multiple trauma, rhabdomyolysis
Procedia PDF Downloads 339

23613 Data Structure Learning Platform to Aid in Higher Education IT Courses (DSLEP)
Authors: Estevan B. Costa, Armando M. Toda, Marcell A. A. Mesquita, Jacques D. Brancher
Abstract:
The advances in technology over the last five years have allowed improvements in the educational area, such as the increasing development of educational software. One of the techniques that emerged in this period is gamification, the use of video game mechanics outside of games. Recent studies involving this technique have reported positive results from applying these concepts in many areas, such as marketing, health, and education. In education, there are studies covering everything from elementary to higher education, with many variations to suit educators' methodologies. Within higher education, focusing on IT courses, data structures are an important subject taught in many of these courses, as they are the basis for many systems. Based on the above, this paper presents the development of an interactive web learning environment, called DSLEP (Data Structure Learning Platform), to aid students in higher education IT courses. The system includes basic concepts of the subject, such as stacks, queues, lists, arrays, and trees, and was implemented to ease the insertion of new structures. It also incorporates gamification concepts, such as points, levels, and leaderboards, to engage students in the search for knowledge and stimulate self-learning.

Keywords: gamification, interactive learning environment, data structures, e-learning
Procedia PDF Downloads 495

23612 Hidden Markov Movement Modelling with Irregular Data
Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith
Abstract:
Hidden Markov Models have become popular for the analysis of animal tracking data. These models are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations have regular time steps. In many ecological studies, this will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and to investigate the effect on the inferences which can be made about the latent states. A modification of the likelihood function to allow for these irregularly spaced locations is investigated, without interpolating or averaging the movement rate. The suitability of the modification is investigated using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night and few observations during the day. Many nocturnal predator tracking studies are set up in this way, to obtain many locations at night when the animal is most active and is difficult to observe. Few observations are obtained during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular Hidden Markov Model framework to be used to model these irregularly spaced locations, making use of all the observed data.

Keywords: hidden Markov models, irregular observations, animal movement modelling, nocturnal predator
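The abstract does not give the modified likelihood explicitly; one common way to handle irregular gaps (a sketch under that assumption, not necessarily the authors' exact formulation) is to define a continuous-time generator matrix Q and use the gap-dependent transition matrix exp(Q·Δt) inside the forward algorithm:

```python
import numpy as np

def mat_exp(A, n_squarings=10, n_terms=14):
    """Matrix exponential via scaling-and-squaring with a truncated Taylor series."""
    A = A / (2.0 ** n_squarings)
    E = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, n_terms):
        term = term @ A / k
        E = E + term
    for _ in range(n_squarings):
        E = E @ E
    return E

def forward_loglik(obs_dens, gaps, Q, pi):
    """Forward algorithm where the transition matrix exp(Q * dt) depends on the
    (irregular) time gap dt between successive locations.
    obs_dens: (T, K) observation densities per hidden state
    gaps:     (T-1,) time gaps between observations
    Q:        (K, K) continuous-time generator matrix
    pi:       (K,) initial state distribution"""
    alpha = pi * obs_dens[0]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for t in range(1, len(obs_dens)):
        P = mat_exp(Q * gaps[t - 1])      # transition over this specific gap
        alpha = (alpha @ P) * obs_dens[t]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik
```

Because every observation contributes its own exp(Q·Δt), no interpolation or rate-averaging is needed, matching the intent described in the abstract.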
Procedia PDF Downloads 244

23611 A Review of Lortie’s Schoolteacher
Authors: Tsai-Hsiu Lin
Abstract:
Dan C. Lortie’s Schoolteacher: A Sociological Study is one of the best works on the sociology of teaching since W. Waller’s classic study, and a book worthy of review. Following the tradition of the symbolic interactionists who studied the occupation of teaching, Lortie used several methods to gather effective data and portrayed the ethos of the teaching profession. The work is therefore an important book on the teaching profession and teacher culture. Though outstanding, Lortie’s work is also flawed in that his perspectives and methodology were adopted largely from symbolic interactionism. First, Lortie analyzed many points regarding teacher culture; for example, he was interested in exploring “sentiment,” “cathexis,” and “ethos.” In this, he was more a psychologist than a sociologist. Second, symbolic interactionism led him to examine teacher culture from a micro view, thereby missing its structural aspects; for example, he did not fully discuss the issue of gender, and he ignored the issue of race. Finally, following the qualitative sociological tradition, Lortie employed many qualitative methods to gather data but focused only on obtaining and presenting interview data. Moreover, the measurement methods he used were too simplistic to analyze the quantitative data fully.

Keywords: education reform, teacher culture, teaching profession, Lortie’s Schoolteacher
Procedia PDF Downloads 229

23610 Urban Areas Management in Developing Countries: Analysis of the Urban Areas Crossed with Risk of Storm Water Drains, Aswan-Egypt
Authors: Omar Hamdy, Schichen Zhao, Hussein Abd El-Atty, Ayman Ragab, Muhammad Salem
Abstract:
One of the riskiest areas in Aswan is Abouelreesh, which suffers from flood disasters, as heavy deluges inundate urban areas, causing considerable damage to buildings and infrastructure. Moreover, the main problem is the urban sprawl towards this risky area. This paper aims to identify the urban areas located in the risk zones prone to flash floods. Analyzing this phenomenon requires a lot of data to ensure satisfactory results; however, in this case the official data and field data were limited, and therefore free sources of satellite data were used. This paper used ArcGIS tools to obtain the storm water drain network by analyzing DEM files. Additionally, historical imagery in Google Earth was studied to determine the age of each building. The last step was to overlay the urban area layer and the storm water drain layer to identify the vulnerable areas. The results of this study would be helpful to urban planners and government officials in estimating disaster risk and developing preliminary plans to recover the risky area, especially urban areas located in torrent paths.

Keywords: risk area, DEM, storm water drains, GIS
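The drainage-network extraction described here relies on ArcGIS hydrology tools; the underlying idea can be sketched independently with the classic D8 flow-direction rule (each DEM cell drains toward its steepest-descent neighbor). This is an illustrative sketch of the principle, not the authors' exact workflow:

```python
import numpy as np

def d8_flow_direction(dem):
    """Classic D8 rule: each interior cell drains toward the neighbor with the
    steepest downhill slope. Returns neighbor codes 0-7, or -1 for pits,
    flats, and border cells."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    dists = [2 ** 0.5, 1.0, 2 ** 0.5, 1.0, 1.0, 2 ** 0.5, 1.0, 2 ** 0.5]
    rows, cols = dem.shape
    codes = np.full((rows, cols), -1)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            best_slope, best_code = 0.0, -1
            for k, (di, dj) in enumerate(offsets):
                slope = (dem[i, j] - dem[i + di, j + dj]) / dists[k]
                if slope > best_slope:
                    best_slope, best_code = slope, k
            codes[i, j] = best_code
    return codes
```

Chaining these per-cell directions and accumulating upstream contributions is what yields the storm water drain network that is then overlaid with the urban layer.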
Procedia PDF Downloads 459

23609 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network
Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Sadiye Didem Boztepe Erkis, Ismail Saritas, Selma Tasdemir
Abstract:
Artificial intelligence applications are commonly used in many fields of industry, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and experiments on polished materials were carried out with this apparatus. Using the experimental data, an artificial neural network (ANN) was modeled in order to evaluate the final cross-sections of the wooden samples remaining after the fire. In the modelling, experimental data obtained from the fire room were used. In the system developed, the initial weight of the samples (ws, g), the preliminary cross-section (pcs, mm²), the fire time (ft, minutes), and the fire temperature (t, °C) were taken as input parameters, and the final cross-section (fcs, mm²) as the output parameter. When the results obtained from the ANN and the experimental data are compared through statistical analyses, the two groups are found to be coherent, with no significant difference between them. As a result, the ANN can safely be used to determine the cross-sections of wooden materials after fire, avoiding many of the disadvantages of repeated experimental testing.

Keywords: artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance
Procedia PDF Downloads 385

23608 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date
Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian
Abstract:
To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) to each patient admitted to the hospital. This assignment of the EDD is largely based on the doctor’s judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best performing model was able to predict the EDD with a 38% reduction in Average Squared Error (ASE) compared to the first EDD determined by the present method. Important predictors of the EDD were found to include the provisional diagnosis code, the patient’s age, the attending doctor at admission, the medical specialty at admission, the accommodation type, and the mean length of stay of the patient in the past year. The predictive model can thus be used as a tool to predict the EDD accurately at admission.

Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven
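The abstract does not name the learning algorithm used; as a minimal illustration of the idea (predict length of stay from encoded admission features, then derive the EDD), a least-squares sketch with entirely hypothetical records might look like:

```python
import numpy as np

# Hypothetical admission records: (age, specialty id, mean LOS in the past year)
records = [(70, 0, 6.0), (45, 1, 2.0), (80, 0, 9.0), (30, 2, 1.5)]
observed_los = np.array([7.0, 2.5, 10.0, 1.0])   # invented lengths of stay, in days

def encode(rows, n_specialties=3):
    """Intercept + scaled age + past-year mean LOS + one-hot medical specialty."""
    feats = []
    for age, specialty, past_los in rows:
        one_hot = [1.0 if s == specialty else 0.0 for s in range(n_specialties)]
        feats.append([1.0, age / 100.0, past_los] + one_hot)
    return np.array(feats)

X = encode(records)
weights, *_ = np.linalg.lstsq(X, observed_los, rcond=None)
predicted_los = X @ weights   # EDD = admission date + predicted LOS, in days
```

A production model would use the full predictor set the paper lists (diagnosis code, attending doctor, accommodation type) and a stronger learner, but the encode-fit-predict pipeline is the same.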
Procedia PDF Downloads 174

23607 A Method to Estimate Wheat Yield Using Landsat Data
Authors: Zama Mahmood
Abstract:
With the increasing demands of food management, monitoring crop growth and forecasting yield well before harvest are very important. These days, yield assessment, together with monitoring of crop development and growth, is carried out with the help of satellite and remote sensing images. Studies using remote sensing data along with field survey validation have reported high correlations between vegetation indices and yield. With the development of remote sensing techniques, the detection of crops and their mechanisms using remote sensing data on regional or global scales has become a popular topic in remote sensing applications. Punjab, especially the southern Punjab region, is extremely favourable for wheat production, but measuring the exact amount of wheat production is a tedious job for farmers and workers using traditional ground-based measurements, whereas remote sensing can provide the most real-time information. In this study, using the Normalized Difference Vegetation Index (NDVI) derived from Landsat satellite images, the yield of wheat was estimated for the 2013-2014 season for the agricultural area around Bahawalpur. The average yield of the wheat was found to be 35 kg/acre by analysing field survey data. The field survey data are in fair agreement with the NDVI values extracted from the Landsat images. A correlation between wheat production (tons) and the number of wheat pixels has also been calculated, and the two follow a proportional pattern. A strong correlation between NDVI and wheat area was also found (R² = 0.71), which demonstrates the effectiveness of remote sensing tools for crop monitoring and production estimation.

Keywords: Landsat, NDVI, remote sensing, satellite images, yield
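NDVI itself is a fixed per-pixel ratio of the near-infrared and red bands; a minimal sketch follows (the reflectance tiles and the 0.3 wheat-mask threshold are illustrative assumptions, not values taken from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
    eps guards against division by zero on empty pixels."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectance tiles; wheat pixels are then counted via a threshold.
nir = np.array([[0.5, 0.4], [0.2, 0.6]])
red = np.array([[0.1, 0.3], [0.2, 0.1]])
wheat_pixels = int((ndvi(nir, red) > 0.3).sum())
```

Counting thresholded pixels per scene is what yields the "number of wheat pixels" that the study correlates with production.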
Procedia PDF Downloads 335

23606 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges to increase speed and data processing capacity while at the same time trying to keep devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of these facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or perfecting those already existing, would be a great advance for this industry. Installing a matrix of temperature sensors distributed in the structure of each server would provide the information required to obtain a temperature profile inside them instantly. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and the approach is expensive. Therefore, other, less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the resulting Burgers' equation are the key to obtaining results with greater or lesser accuracy, regardless of the characteristic truncation error.

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile
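As an illustration of the backward and central discretizations mentioned above, here is one explicit time step of the 1-D viscous Burgers equation on a finite-difference grid (a sketch of the discretization idea, not the authors' FEM solver):

```python
import numpy as np

def burgers_step(u, dx, dt, nu):
    """One explicit step of the 1-D viscous Burgers equation
    u_t + u * u_x = nu * u_xx, using a backward (upwind) difference for the
    convective first derivative and a central difference for the diffusive
    second derivative. End values are held fixed."""
    un = u.copy()
    u[1:-1] = (un[1:-1]
               - dt * un[1:-1] * (un[1:-1] - un[:-2]) / dx                # backward u_x
               + nu * dt * (un[2:] - 2.0 * un[1:-1] + un[:-2]) / dx**2)   # central u_xx
    return u
```

Swapping the convective stencil to `(un[2:] - un[1:-1])` (forward) or `(un[2:] - un[:-2]) / 2` (central) reproduces the other two discretizations the abstract compares, each with a different truncation error.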
Procedia PDF Downloads 169

23605 Potential of Detailed Environmental Data, Produced by Information and Communication Technology Tools, for Better Consideration of Microclimatology Issues in Urban Planning to Promote Active Mobility
Authors: Živa Ravnikar, Alfonso Bahillo Martinez, Barbara Goličnik Marušić
Abstract:
Climate change mitigation has been formally adopted and announced by countries across the globe, with cities targeting carbon neutrality through various more or less successful, systematic, and fragmentary actions. The article is based on the fact that environmental conditions affect human comfort and the usage of space. With its sustainable solutions, urban planning can not only support climate mitigation in the sense of reducing global warming at the planetary scale, but also enable natural processes that, in the immediate vicinity, produce environmental conditions that encourage people to walk or cycle. The article draws attention to the importance of integrating climate considerations into urban planning, where detailed environmental data play a key role, enabling urban planners to improve or monitor environmental conditions on cycle paths. On the practical side, this paper tests a particular ICT tool, a prototype used for gathering environmental data. Data gathering was performed along the cycling lanes in Ljubljana (Slovenia), where the main objective was to assess the applicability of the tool's data within the planning of comfortable cycling lanes. The results suggest that such transportable devices for in-situ measurements can help a researcher interpret detailed environmental information, characterized by fine granularity and precise spatial and temporal resolution. The data can be interpreted within human comfort zones, with a graphical representation in the form of a map linking the environmental conditions to their spatial context. The paper also provides preliminary results on the potential of such tools for identifying correlations between environmental conditions and different spatial settings, which can help urban planners prioritize interventions.
The paper contributes to multidisciplinary approaches, as it demonstrates the usefulness of such fine-grained data for better consideration of microclimatology in urban planning, which is a prerequisite for creating climate-comfortable cycling lanes that promote active mobility.

Keywords: information and communication technology tools, urban planning, human comfort, microclimate, cycling lanes
Procedia PDF Downloads 134

23604 Image Ranking to Assist Object Labeling for Training Detection Models
Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman
Abstract:
Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on a U-shaped architecture, which quantifies the presence of unseen data in each image in order to find the images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Similar performance is also achieved compared to a model trained on exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model.

Keywords: computer vision, deep learning, object detection, semiconductor
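The selection step of the iterative loop reduces to ranking unlabeled images by a novelty score; a minimal sketch of that step follows (the scoring network itself is out of scope, and the scores shown are invented placeholders):

```python
def select_for_labeling(novelty_scores, labeled, batch_size):
    """Rank unlabeled images by novelty score; return the top batch to label next."""
    candidates = [(score, idx) for idx, score in enumerate(novelty_scores)
                  if idx not in labeled]
    candidates.sort(reverse=True)
    return [idx for _, idx in candidates[:batch_size]]

# One iteration of the loop: score all images, pick the most novel unlabeled ones.
scores = [0.10, 0.90, 0.55, 0.70]   # hypothetical per-image novelty scores
next_batch = select_for_labeling(scores, labeled={1}, batch_size=2)  # -> [3, 2]
```

Each round, the newly labeled batch is added to `labeled`, the model is retrained, and the scores are recomputed, which is the iterative fashion the abstract describes.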
Procedia PDF Downloads 136

23603 From Text to Data: Sentiment Analysis of Presidential Election Political Forums
Authors: Sergio V Davalos, Alison L. Watkins
Abstract:
User generated content (UGC), such as website posts, has data associated with it: the time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words). In addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences across all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis, producing a sentiment score for each post. Based on the sentiment scores of the posts, there are significant differences between the content and sentiment of the two sets of forums for the 2012 and 2016 presidential elections. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment. In the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time: for both elections, as the election got closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidates. In the case of Trump, the number of negative posts exceeded Clinton's highest number of posts, which were positive. KNIME topic modeling was used to derive topics from the posts. There were also changes in topics and keyword emphasis over time: initially, the political parties were the most referenced, and as the election got closer the emphasis shifted to the candidates.
The SASA method proved to predict sentiment better than four other methods in SentiBench. The research resulted in deriving sentiment data from text; in combination with other data, the sentiment data provided insight and discovery about user sentiment in the US presidential elections of 2012 and 2016.

Keywords: sentiment analysis, text mining, user generated content, US presidential elections
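The per-post scoring described above can be illustrated with a toy lexicon scorer; the word lists below are invented placeholders, not SASA's actual model:

```python
# Hypothetical sentiment lexicons; a real analyzer uses far larger, weighted lists.
POS = {"great", "win", "good", "hope"}
NEG = {"bad", "lose", "corrupt", "fail"}

def post_sentiment(text):
    """Toy lexicon score: (+1 per positive term, -1 per negative term),
    normalized by the number of words in the post."""
    words = text.lower().split()
    score = sum((w in POS) - (w in NEG) for w in words)
    return score / max(len(words), 1)
```

Summing `post_sentiment` over a forum's posts in chronological order gives the cumulative sentiment trajectory the study tracks as each election approaches.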
Procedia PDF Downloads 192

23602 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility
Authors: Alejandro Villegas, Cihan Varol
Abstract:
Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without user names and passwords, this forensically important information can still be gathered by investigators. Although a few VoIP forensic investigative applications are available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, and call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs, and user roaming profiles on Windows and Mac OSX operating systems, and subsequently parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, owing to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper presents the exploratory analysis performed to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.

Keywords: betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise
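The log-scraping part of such a utility can be illustrated with a small parser; the log-line format below is a hypothetical stand-in, since the Betamax products do not share one documented log format:

```python
import re

# Hypothetical Betamax-style call log line: "CALL alice@example.com 125s"
CALL_LINE = re.compile(r"CALL\s+(?P<peer>\S+)\s+(?P<dur>\d+)s")

def parse_call_log(lines):
    """Extract (peer, duration in seconds) pairs from raw log lines,
    skipping lines that do not match the call pattern."""
    calls = []
    for line in lines:
        m = CALL_LINE.search(line)
        if m:
            calls.append((m.group("peer"), int(m.group("dur"))))
    return calls
```

A real framework would hold one such pattern (plus registry and roaming-profile paths) per supported product, which is exactly what makes the filtering back end expandable to new VoIP solutions.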
Procedia PDF Downloads 297

23601 Ion Thruster Grid Lifetime Assessment Based on Its Structural Failure
Authors: Juan Li, Jiawen Qiu, Yuchuan Chu, Tianping Zhang, Wei Meng, Yanhui Jia, Xiaohui Liu
Abstract:
This article develops a 3D numerical model of sputter erosion depth in an ion thruster optics system using the IFE-PIC (Immersed Finite Element Particle-in-Cell) and Monte Carlo methods, and calculates the sputter erosion rate of the downstream surface of the accelerator grid. Compared with LIPS-200 life test data, the results of the numerical model are in reasonable agreement with the measured data. Finally, we predict the lifetime of the 20 cm diameter ion thruster via the erosion data obtained with the model. The result demonstrates that, under normal operating conditions, the erosion rate of the grooves worn into the downstream surface of the accelerator grid is 34.6 μm/1000 h, which means the conservative lifetime until structural failure of the accelerator grid is 11500 hours.

Keywords: ion thruster, accelerator grid, sputter erosion, lifetime assessment
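The quoted lifetime follows from the erosion rate once an allowable groove depth is fixed; the depth below is back-calculated from the abstract's two numbers (an inference for illustration, not a figure stated in the abstract):

```python
erosion_rate = 34.6            # groove erosion rate, in μm per 1000 h (from the abstract)
lifetime_kh = 11.5             # quoted conservative lifetime, in units of 1000 h

# Implied erodible groove depth before structural failure (back-calculated):
erodible_depth = erosion_rate * lifetime_kh      # ≈ 398 μm

# Sanity check: lifetime recovered from depth / rate
lifetime_h = erodible_depth / erosion_rate * 1000.0
```

This simple depth-over-rate arithmetic is only the last step; the hard part the paper addresses is computing the erosion rate itself from the IFE-PIC and Monte Carlo sputtering model.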
Procedia PDF Downloads 565

23600 Nutrient Foramina of the Lunate Bone of the Hand – an Anatomical Study
Authors: P.J. Jiji, B.V. Murlimanju, Latha V. Prabhu, Mangala M. Pai
Abstract:
Background: Dislocation of the lunate bone can lead to compression of the median nerve and subsequent carpal tunnel syndrome. The dislocation can also interrupt the vasculature and cause avascular necrosis. The objective of the present study was to examine the morphology and number of the nutrient foramina in dried cadaveric lunate bones of the Indian population. Methods: The present study included 28 lunate bones (13 right-sided and 15 left-sided) obtained from the gross anatomy laboratory of our institution. The bones were macroscopically observed for nutrient foramina, and data were collected with respect to their number, followed by tabulation and analysis. Results: All of our specimens (100%) exhibited nutrient foramina over the non-articular surfaces. The foramina were observed only over the palmar and dorsal surfaces of the lunate bones, and their number ranged between 2 and 10. The foramina were more numerous over the dorsal surface (average 3.3) than over the palmar surface (average 2.4). Conclusion: We believe that the present study provides important data about the nutrient foramina of the lunate bones. These data would be enlightening to the orthopedic surgeon and helpful in hand surgeries. Morphological knowledge of the vasculature, the foramina of entry, and their number is required to understand the concepts underlying lunatomalacia and Kienbock's disease.

Keywords: avascular necrosis, foramen, lunate, nutrient
Procedia PDF Downloads 244

23599 Big Data Applications for the Transport Sector
Authors: Antonella Falanga, Armando Cartenì
Abstract:
Today, our lives are characterized by an unprecedented amount of data coming from several sources, including mobile devices, sensors, tracking systems, and online platforms. The term "big data" refers not only to the quantity of data but also to the variety and speed of data generation. These data hold valuable insights that, when extracted and analyzed, facilitate informed decision-making. The 4Vs of big data (velocity, volume, variety, and value) highlight essential aspects: the rapid generation, vast quantities, diverse sources, and potential added value of these kinds of data. This surge of information has revolutionized many sectors: business, for improving decision-making processes; healthcare, for clinical record analysis and medical research; education, for enhancing teaching methodologies; agriculture, for optimizing crop management; finance, for risk assessment and fraud detection; media and entertainment, for personalized content recommendations; emergency services, for real-time response during crises and events; and also mobility, for urban planning and for the design and management of public and private transport services. Big data's pervasive impact enhances societal aspects, elevating quality of life, service efficiency, and problem-solving capacities. However, this transformative era also raises new challenges, including data quality, privacy, data security, cybersecurity, interoperability, the need for advanced infrastructures, and staff training. Within the transportation sector (the one investigated in this research), applications span the planning, design, and management of systems and mobility services. Among the most common big data applications within the transport sector are, for example, real-time traffic monitoring, bus and freight vehicle route optimization, vehicle maintenance, road safety, and all the autonomous and connected vehicle applications. Benefits include reductions in travel times, road accidents, and pollutant emissions.
Within these issues, proper transport demand estimation is crucial for sustainable transportation planning. Evaluating the impact of sustainable mobility policies starts with a quantitative analysis of travel demand, and achieving transportation decarbonization goals hinges on precise estimations of demand for individual transport modes. Emerging technologies, offering substantial big data at lower costs than traditional methods, play a pivotal role in this context. Starting from these considerations, this study explores the usefulness of big data for transport demand estimation, focusing on leveraging (big) data collected during the COVID-19 pandemic to estimate the evolution of mobility demand in Italy. Estimation results reveal, in the post-COVID-19 era, more than 96 million national daily trips, about 2.6 trips per capita, with a mobile population of more than 37.6 million Italian travelers per day. Overall, this research allows us to conclude that big data enhances rational decision-making for mobility demand estimation, which is imperative for adeptly planning and allocating investments in transportation infrastructures and services.

Keywords: big data, cloud computing, decision-making, mobility demand, transportation
Procedia PDF Downloads 63

23598 ISME: Integrated Style Motion Editor for 3D Humanoid Character
Authors: Ismahafezi Ismail, Mohd Shahrizal Sunar
Abstract:
The motion of a realistic 3D humanoid character is very important, especially for the industries developing computer animations and games. However, this type of motion involves very complex, high-dimensional data covering body position, orientation, and joint rotation. Integrated Style Motion Editor (ISME) is a method used to alter the 3D humanoid motion capture data utilised in computer animation and games development. This study was therefore carried out to demonstrate a method that can manipulate and deform different motion styles by integrating a Key Pose Deformation Technique and a Trajectory Control Technique. This motion editing method allows the user to generate new motions from the original motion capture data using a simple interface control. Unlike previous methods, our method produces a realistic humanoid motion style in real time.

Keywords: computer animation, humanoid motion, motion capture, motion editing
Procedia PDF Downloads 382

23597 Effect of Traffic Volume and Its Composition on Vehicular Speed under Mixed Traffic Conditions: A Kriging Based Approach
Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh
Abstract:
The use of speed prediction models sometimes appears to be a feasible alternative to laborious field measurement, particularly when field data cannot fulfill a designer's requirements. However, developing speed models is a challenging task, specifically in the context of developing countries like India, where vehicles with diverse static and dynamic characteristics use the same right of way without any segregation; here, the traffic composition plays a significant role in determining vehicular speed. The present research was carried out to examine the effects of traffic volume and its composition on vehicular speed under mixed traffic conditions. Classified traffic volume and speed data were collected from different geometrically identical six-lane divided arterials in New Delhi. Based on these field data, speed prediction models were developed for individual vehicle categories by adopting the Kriging approximation technique, an alternative to commonly used regression. These models were validated against the data set kept aside earlier for validation. The predicted speeds showed a great deal of agreement with the observed values, and the models outperform all other existing speed models. Finally, the proposed models were used to evaluate the effect of traffic volume and its composition on speed.

Keywords: speed, Kriging, arterial, traffic volume
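Kriging predicts a new point as a covariance-weighted combination of the observed points; a minimal zero-trend sketch with a squared-exponential covariance follows (the volume/speed pairs are invented for illustration, not the study's data, and the fitted hyperparameters are assumptions):

```python
import numpy as np

def sq_exp_cov(a, b, length=0.3, var=1.0):
    """Squared-exponential covariance between two sets of points."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def krige(X, y, X_new, nugget=1e-8):
    """Simple Kriging about the sample mean: m + k*' (K + nugget*I)^-1 (y - m)."""
    K = sq_exp_cov(X, X) + nugget * np.eye(len(X))
    w = np.linalg.solve(K, y - y.mean())
    return y.mean() + sq_exp_cov(X_new, X) @ w

# Hypothetical observations: normalized traffic volume -> mean car speed (km/h)
X = np.array([[0.2], [0.5], [0.9]])
y = np.array([52.0, 44.0, 31.0])
speeds = krige(X, y, np.array([[0.35]]))   # predicted speed at an unobserved volume
```

Unlike a fixed-form regression, the predictor interpolates the observations exactly (up to the nugget), which is one reason Kriging is attractive when the speed-flow relationship has no obvious parametric shape; adding composition shares simply means extra columns in `X`.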
Procedia PDF Downloads 353
23596 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO
Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu
Abstract:
Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, the ability to generate statistics from individual data intercepted across large demographic areas leads to reasoning comparable to that of a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that intercepts car events caused by a driver, positioning them in time and space. The device's connection to the vehicle creates a source of data whose analysis can build psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.
Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO
Procedia PDF Downloads 91
23595 Impact of Transitioning to Renewable Energy Sources on Key Performance Indicators and Artificial Intelligence Modules of Data Center
Authors: Ahmed Hossam ElMolla, Mohamed Hatem Saleh, Hamza Mostafa, Lara Mamdouh, Yassin Wael
Abstract:
Artificial intelligence (AI) is reshaping industries, and its potential to revolutionize renewable energy and data center operations is immense. By harnessing AI's capabilities, we can optimize energy consumption, predict fluctuations in renewable energy generation, and improve the efficiency of data center infrastructure. This convergence of technologies promises a future where energy is managed more intelligently, sustainably, and cost-effectively. The integration of AI into renewable energy systems unlocks a wealth of opportunities. Machine learning algorithms can analyze vast amounts of data to forecast weather patterns, solar irradiance, and wind speeds, enabling more accurate energy production planning. AI-powered systems can optimize energy storage and grid management, ensuring a stable power supply even during intermittent renewable generation. Moreover, AI can identify maintenance needs for renewable energy infrastructure, preventing costly breakdowns and maximizing system lifespan. Data centers, which consume substantial amounts of energy, are prime candidates for AI-driven optimization. AI can analyze energy consumption patterns, identify inefficiencies, and recommend adjustments to cooling systems, server utilization, and power distribution. Predictive maintenance using AI can prevent equipment failures, reducing energy waste and downtime. Additionally, AI can optimize data placement and retrieval, minimizing energy consumption associated with data transfer. As AI transforms renewable energy and data center operations, modified Key Performance Indicators (KPIs) will emerge. Traditional metrics like energy efficiency and cost-per-megawatt-hour will continue to be relevant, but additional KPIs focused on AI's impact will be essential. These might include AI-driven cost savings, predictive accuracy of energy generation and consumption, and the reduction of carbon emissions attributed to AI-optimized operations. 
By tracking these KPIs, organizations can measure the success of their AI initiatives and identify areas for improvement. Ultimately, the synergy between AI, renewable energy, and data centers holds the potential to create a more sustainable and resilient future. By embracing these technologies, we can build smarter, greener, and more efficient systems that benefit both the environment and the economy.
Keywords: data center, artificial intelligence, renewable energy, energy efficiency, sustainability, optimization, predictive analytics, energy consumption, energy storage, grid management, data center optimization, key performance indicators, carbon emissions, resiliency
Procedia PDF Downloads 35
23594 dynr.mi: An R Program for Multiple Imputation in Dynamic Modeling
Authors: Yanling Li, Linying Ji, Zita Oravecz, Timothy R. Brick, Michael D. Hunter, Sy-Miin Chow
Abstract:
Assessing several individuals intensively over time yields intensive longitudinal data (ILD). Even though ILD provide rich information, they also bring data analytic challenges. One of these is the increased occurrence of missingness with increased study length, possibly under non-ignorable missingness scenarios. Multiple imputation (MI) handles missing data by creating several imputed data sets and pooling the estimation results across them to yield final estimates for inferential purposes. In this article, we introduce dynr.mi(), a function in the R package Dynamic Modeling in R (dynr). The package dynr provides a suite of fast and accessible functions for estimating and visualizing the results of fitting linear and nonlinear dynamic systems models in discrete as well as continuous time. By integrating the estimation functions in dynr with the MI procedures available from the R package Multivariate Imputation by Chained Equations (MICE), the dynr.mi() routine is designed to handle possibly non-ignorable missingness in the dependent variables and/or covariates of a user-specified dynamic systems model via MI, with convergence diagnostic checks. We utilized dynr.mi() to examine, in the context of a vector autoregressive model, the relationships among individuals' ambulatory physiological measures and self-reported affect valence and arousal. The results from MI were compared to those from listwise deletion of entries with missingness in the covariates. When the number of iterations was determined based on the convergence diagnostics available from dynr.mi(), differences in the statistical significance of the covariate parameters were observed between the listwise deletion and MI approaches. These results underscore the importance of considering diagnostic information in the implementation of MI procedures.
Keywords: dynamic modeling, missing data, mobility, multiple imputation
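The pooling step of MI described above (combining estimates across imputed data sets) follows Rubin's rules. A minimal sketch for a single scalar parameter, illustrative rather than the dynr.mi() implementation:

```python
import math

def pool_rubin(estimates, variances):
    """Pool point estimates and their variances from m imputed data
    sets using Rubin's rules, returning the pooled estimate, the total
    variance, and the pooled standard error."""
    m = len(estimates)
    qbar = sum(estimates) / m                      # pooled point estimate
    ubar = sum(variances) / m                      # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)  # between-imputation
    t = ubar + (1.0 + 1.0 / m) * b                 # total variance
    return qbar, t, math.sqrt(t)
```

The between-imputation component b is what listwise deletion cannot capture: it quantifies the extra uncertainty due to the missing data itself.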
Procedia PDF Downloads 164
23593 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement
Authors: Hadi Ardiny, Amir Mohammad Beigzadeh
Abstract:
Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of them equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems
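The camera/detector fusion idea can be sketched as follows: each robot's camera-tracked distance to the detector predicts a count rate via the inverse-square law, and the robot whose predicted rate best correlates with the measured count series is flagged as the source. This is an illustrative reconstruction under simplifying assumptions (no attenuation or dead time), not the authors' algorithm.

```python
import numpy as np

def identify_source_robot(tracks, counts, detector_xy=(0.0, 0.0)):
    """Pick the robot whose motion best explains the detector counts.

    tracks : dict robot_id -> sequence of (x, y) positions per frame,
             as extracted by the computer vision method
    counts : gamma-ray counts per frame from the NaI detector
    """
    counts = np.asarray(counts, dtype=float)
    det = np.asarray(detector_xy, dtype=float)
    scores = {}
    for rid, xy in tracks.items():
        d2 = np.sum((np.asarray(xy, dtype=float) - det) ** 2, axis=1)
        expected = 1.0 / np.maximum(d2, 1e-9)  # inverse-square count rate
        scores[rid] = float(np.corrcoef(expected, counts)[0, 1])
    best = max(scores, key=scores.get)
    return best, scores
```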
Procedia PDF Downloads 124
23592 Annual Water Level Simulation Using Support Vector Machine
Authors: Maryam Khalilzadeh Poshtegal, Seyed Ahmad Mirbagheri, Mojtaba Noury
Abstract:
In this paper, yearly data on rainfall, temperature, and inflow to Lake Urmia were used to simulate water level fluctuations by means of three models. In the context of climate change investigations, fluctuations in lake water levels are of high interest. This study investigates data-driven models: support vector machines (SVM), a new regression procedure in water resources, are applied to the yearly level data of Lake Urmia, the biggest and hypersaline lake in Iran. The estimated lake levels are found to be in good correlation with the observed values. The results of the SVM simulation show better accuracy and easier implementation. Mean square error, mean absolute relative error, and the determination coefficient are used as comparison criteria.
Keywords: simulation, water level fluctuation, Urmia Lake, support vector machine
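The three comparison criteria named in the abstract can be computed as follows; this is a minimal sketch of the standard definitions, not code from the study.

```python
def comparison_stats(obs, pred):
    """Mean square error (MSE), mean absolute relative error (MARE),
    and determination coefficient (R^2) for simulated vs observed
    water levels."""
    n = len(obs)
    mse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / n
    mare = sum(abs(o - p) / abs(o) for o, p in zip(obs, pred)) / n
    mean_obs = sum(obs) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    r2 = 1.0 - ss_res / ss_tot
    return mse, mare, r2
```

A perfect simulation gives MSE = 0, MARE = 0, and R² = 1; a model no better than the observed mean gives R² near 0.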
Procedia PDF Downloads 367
23591 Dynamic Mode Decomposition and Wake Flow Modelling of a Wind Turbine
Authors: Nor Mazlin Zahari, Lian Gan, Xuerui Mao
Abstract:
The power production in wind farms and the mechanical loads on the turbines are strongly impacted by the wake of the wind turbine. Thus, there is a need for understanding and modelling turbine wake dynamics in the wind farm and for layout optimization. A good wake model is important in predicting plant performance and understanding fatigue loads. In this paper, Dynamic Mode Decomposition (DMD) was applied to simulation data generated by a Direct Numerical Simulation (DNS) of flow around a turbine, perturbed by upstream inflow noise. This technique is useful in analyzing the wake flow, predicting its future states, and revealing the flow dynamics associated with the coherent structures behind the wind turbine wake. DMD was employed to describe the dynamics of the flow around the turbine from the DNS data. Since the DNS data come on unstructured meshes with a non-uniform grid, the data were interpolated within each element onto an evenly spaced mesh before DMD was applied. The DMD analyses revealed characteristics of the travelling waves behind the turbine, e.g. the dominant helical flow structures and the corresponding frequencies. As a result, the dominant frequency is detected and the associated spatial structure identified. The dynamic mode representing the coherent structure is presented.
Keywords: coherent structure, Direct Numerical Simulation (DNS), dominant frequency, Dynamic Mode Decomposition (DMD)
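The DMD procedure itself can be sketched compactly: given a matrix of flow snapshots, the exact-DMD algorithm fits a best-fit linear operator between consecutive snapshots via the SVD and returns its eigenvalues (which encode the mode frequencies and growth rates) and spatial modes. This is a generic textbook sketch, not the analysis pipeline used in the paper.

```python
import numpy as np

def dmd(X, r=None):
    """Exact DMD: columns of X are successive snapshots x_0..x_m.
    Finds eigenvalues/modes of the best-fit linear operator A with
    x_{k+1} ≈ A x_k, optionally truncated to rank r."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:
        U, s, Vh = U[:, :r], s[:r], Vh[:r]
    # Project A onto the POD basis U: Atilde = U* X2 V S^-1
    Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(Atilde)
    # Exact DMD modes
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes
```

For snapshots sampled at interval dt, an eigenvalue λ corresponds to a continuous-time frequency Im(log λ)/(2π dt), which is how the dominant wake frequency is read off.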
Procedia PDF Downloads 348
23590 Graph-Oriented Summary for Optimized Resource Description Framework Graphs Streams Processing
Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais
Abstract:
Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before the entire session finishes, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of RDF data streams, as the most relevant and pertinent information in the data is often not used in due time and is almost impossible to exploit for further analyses. It would be better to keep the most informative part of the data within streams while minimizing the memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs through three main approaches: (1) an approach for user queries (SPARQL) that extracts their needs and groups them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and the most expected approximate query results on summarized graphs compared to the source ones.
Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query
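The baseline closeness centrality that approach (2) extends can be sketched for an unweighted graph: a node is central when its total shortest-path distance to all other reachable nodes is small. This is the standard SNA definition on a plain adjacency-list graph, not the authors' RDF-specific extension.

```python
from collections import deque

def closeness_centrality(adj):
    """Closeness centrality of each node of an unweighted graph:
    (reachable nodes - 1) / sum of BFS shortest-path distances.

    adj : dict node -> list of neighbour nodes
    """
    scores = {}
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total = sum(dist.values())
        scores[s] = (len(dist) - 1) / total if total else 0.0
    return scores
```

Nodes with the highest scores would be the "most informative" candidates kept in the summary; the paper's extension adapts this ranking to RDF triples and query workloads.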
Procedia PDF Downloads 203
23589 Assessing the Impact of Climate Change on Pulses Production in Khyber Pakhtunkhwa, Pakistan
Authors: Khuram Nawaz Sadozai, Rizwan Ahmad, Munawar Raza Kazmi, Awais Habib
Abstract:
Climate change and crop production are intrinsically associated with each other. Therefore, this research study was designed to assess the impact of climate change on pulses production in the southern districts of Khyber Pakhtunkhwa (KP) Province, Pakistan. Two pulses (i.e. chickpea and mung bean) were selected for this study with respect to climate change. Climatic variables such as temperature, humidity, and precipitation, along with pulses production and area under cultivation, were the major variables of this study. Secondary data on the climatic and crop variables for a period of thirty-four years (1986-2020) were obtained from the Pakistan Meteorological Department and the Agriculture Statistics of KP, respectively. Panel data sets for the chickpea and mung bean crops were estimated separately. The analysis validated that both data sets were balanced panels. The Hausman specification test was run separately for both panel data sets; its findings suggested that the fixed effects model was appropriate for the chickpea panel data, whereas the random effects model was appropriate for the mung bean panel data. Major findings confirm that maximum temperature is statistically significant for chickpea yield: if maximum temperature increases by 1 °C, chickpea yield increases by 0.0463 units. However, the impact of precipitation was insignificant. Furthermore, humidity was statistically significant and positively associated with chickpea yield. For mung bean, minimum temperature contributed significantly to yield. This study concludes that temperature and humidity can significantly enhance pulses yield. It is recommended that capacity building of pulse growers be undertaken so that they can adopt climate change adaptation strategies. Moreover, the government should ensure the availability of climate-change-resistant varieties of pulses to encourage pulses cultivation.
Keywords: climate change, pulses productivity, agriculture, Pakistan
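The Hausman specification test used to choose between the fixed- and random-effects models compares the two sets of coefficient estimates; in its simplest scalar form the statistic is chi-square with one degree of freedom under the null that the random-effects estimator is consistent. A minimal illustrative sketch (the example numbers are hypothetical, not from the study):

```python
def hausman_statistic(b_fe, b_re, var_fe, var_re):
    """Scalar Hausman statistic: H = (b_FE - b_RE)^2 / (Var_FE - Var_RE).
    A large H (relative to the chi-square(1) critical value, 3.84 at
    the 5% level) favours the fixed-effects model."""
    return (b_fe - b_re) ** 2 / (var_fe - var_re)
```

In practice the test is run on the full coefficient vectors with a covariance-matrix inverse, which is what statistical packages implement; the scalar form above shows the logic.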
Procedia PDF Downloads 44
23588 Study on Security and Privacy Issues of Mobile Operating Systems Based on Malware Attacks
Authors: Huang Dennis, Aurelio Aziel, Burra Venkata Durga Kumar
Abstract:
Nowadays, smartphones and mobile operating systems have become widespread in our daily lives. As people use smartphones, they tend to store more private and essential data on their devices; because of this, it is very important to develop more secure mobile operating systems and cloud storage to protect that data. However, several factors can cause security risks in mobile operating systems, such as malware, malicious apps, phishing attacks, ransomware, and more, all of which can cause big problems for users, as attackers can access the user's private data. These problems can lead to data loss, financial loss, identity theft, and other serious consequences. Moreover, during the pandemic, people used their mobile devices more and carried out all sorts of transactions online, which may lead to more victims of online scams, with inexperienced users being the main targets. With the increase in attacks, researchers have been actively working to develop countermeasures to enhance the security of operating systems. This study aims to provide an overview of the security and privacy issues in mobile operating systems, identifying the potential risks of operating systems and possible solutions. By examining these issues, we want to provide an accessible overview for users and researchers, to improve knowledge and help develop more secure mobile operating systems.
Keywords: mobile operating system, security, privacy, malware
Procedia PDF Downloads 89
23587 Geothermal Energy Evaluation of Lower Benue Trough Using Spectral Analysis of Aeromagnetic Data
Authors: Stella C. Okenu, Stephen O. Adikwu, Martins E. Okoro
Abstract:
The geothermal energy resource potential of the Lower Benue Trough (LBT) in Nigeria was evaluated in this study using spectral analysis of high-resolution aeromagnetic (HRAM) data. The reduced-to-the-equator aeromagnetic data was divided into sixteen (16) overlapping blocks, and each block was analyzed to obtain the radially averaged power spectrum, which enabled the computation of the top and centroid depths to magnetic sources. These values were then used to assess the Curie Point Depth (CPD), geothermal gradients, and heat flow variations in the study area. Results showed that the CPD varies from 7.03 to 18.23 km, with an average of 12.26 km; geothermal gradient values vary between 31.82 and 82.50 °C/km, with an average of 51.21 °C/km; and heat flow varies from 79.54 to 206.26 mW/m², with an average of 128.02 mW/m². Shallow CPD zones running from the eastern through the western and southwestern parts of the study area correspond to zones of high geothermal gradient and high subsurface heat flow. These areas signify zones of anomalous subsurface thermal conditions and are therefore recommended for detailed geothermal energy exploration studies.
Keywords: geothermal energy, curie-point depth, geothermal gradient, heat flow, aeromagnetic data, LBT
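The step from Curie point depth to geothermal gradient and heat flow can be reproduced with the standard relations, assuming a Curie temperature of 580 °C and an average crustal thermal conductivity of 2.5 W/(m·K); these assumed constants are consistent with the values reported in the abstract, but are not stated in it.

```python
def geothermal_from_cpd(z_cpd_km, curie_temp_c=580.0, conductivity=2.5):
    """Geothermal gradient and heat flow from the Curie point depth.

    z_cpd_km     : Curie point depth in km
    curie_temp_c : Curie temperature of magnetite, assumed 580 °C
    conductivity : thermal conductivity, assumed 2.5 W/(m·K)

    Returns (gradient in °C/km, heat flow in mW/m²). Since 1 °C/km
    equals 1 mK/m, multiplying W/(m·K) by °C/km yields mW/m² directly.
    """
    gradient = curie_temp_c / z_cpd_km   # Fourier's law with T(z) linear
    heat_flow = conductivity * gradient
    return gradient, heat_flow
```

Applied to the reported CPD extremes of 7.03 and 18.23 km, these relations return approximately 82.50 and 31.82 °C/km and 206.26 and 79.54 mW/m², matching the abstract's ranges.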
Procedia PDF Downloads 78
23586 Detailed Depositional Resolutions in Upper Miocene Sands of HT-3X Well, Nam Con Son Basin, Vietnam
Authors: Vo Thi Hai Quan
Abstract:
The Nam Con Son sedimentary basin is one of the most important oil and gas basins in offshore Vietnam. The Hai Thach field of block 05-2 contains mostly gas accumulations in a fine-grained, sand/mud-rich turbidite system, deposited in a turbidite channel and fan environment. The major Upper Miocene reservoir of HT-3X lies above a well-developed unconformity. The main objectives of this study are to reconstruct the depositional environment and to assess the reservoir quality using data from 14 meters of core samples and digital wireline data from well HT-3X. The wireline log and core data showed that the vertical facies sequences of the well mainly range from the Tb to Te divisions of Bouma sequences, with Tb and Tc predominating over Td and Te. Sediments in this well were deposited in a submarine fan association, with very fine- to fine-grained, homogeneous sandstones of high porosity and permeability, and high-density turbidity currents with a long transport route from the sediment source to the basin, indicating good reservoir quality. The sediments comprise mainly the following sedimentary structures: massive and laminated sandstones, convoluted bedding, laminated ripples, cross-laminated ripples, deformed sandstones, and contorted bedding.
Keywords: Hai Thach field, Miocene sand, turbidite, wireline data
Procedia PDF Downloads 292
23585 Impediments to Female Sports Management and Participation: The Experience in the Selected Nigeria South West Colleges of Education
Authors: Saseyi Olaitan Olaoluwa, Osifeko Olalekan Remigious
Abstract:
The study was meant to identify the impediments to female sports management and participation in the selected colleges. Seven colleges of education in the south-west part of the country were selected for the study. A total of one hundred and five subjects were sampled to supply data. Only one hundred adequately completed and returned copies of the questionnaire were used for data analysis. The collected data were analysed descriptively. The results of the study showed that inadequate funds, personnel, facilities, equipment, supplies, management of sports, supervision, and coaching were some of the impediments to female sports management and participation. Athletes were not encouraged to participate. Based on the findings, it was recommended that the government should come to the aid of the colleges by providing funds and other needs that will make sports attractive for enhanced participation.
Keywords: female sports, impediments, management, Nigeria, south west, colleges
Procedia PDF Downloads 409