Search results for: web usage data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26399

23459 Urban Areas Management in Developing Countries: Analysis of the Urban Areas Crossed with Risk of Storm Water Drains, Aswan-Egypt

Authors: Omar Hamdy, Schichen Zhao, Hussein Abd El-Atty, Ayman Ragab, Muhammad Salem

Abstract:

One of the riskiest areas in Aswan is Abouelreesh, which suffers from flood disasters: heavy deluges inundate urban areas, causing considerable damage to buildings and infrastructure. The main problem, moreover, is urban sprawl towards this risky area. This paper aims to identify the urban areas located within zones prone to flash floods. Analyzing this phenomenon requires a large amount of data to ensure satisfactory results; in this case, official and field data were limited, so free sources of satellite data were used instead. ArcGIS tools were used to derive the storm water drainage network by analyzing DEM files, and historical imagery in Google Earth was studied to determine the age of each building. The last step was to overlay the urban area layer and the storm water drains layer to identify the vulnerable areas. The results of this study should help urban planners and government officials estimate disaster risk and develop preliminary plans to recover the risky area, especially urban areas located along torrent paths.
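
The abstract states only that ArcGIS tools were applied to DEM files; one plausible ArcPy workflow for deriving the drainage network is sketched below as an illustration. It assumes the Spatial Analyst extension, and the file names and flow-accumulation threshold are placeholders, not values from the paper.

```python
# Minimal sketch of deriving a storm-water drainage network from a DEM,
# assuming the ArcPy Spatial Analyst extension (ArcGIS Pro); paths and the
# flow-accumulation threshold are illustrative only.
import arcpy
from arcpy.sa import Fill, FlowDirection, FlowAccumulation, Con, StreamToFeature

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\aswan_project"           # hypothetical workspace

filled = Fill("abouelreesh_dem.tif")                # remove sinks in the DEM
fdir = FlowDirection(filled)                        # D8 flow direction
facc = FlowAccumulation(fdir)                       # upstream contributing cells
streams = Con(facc > 500, 1)                        # threshold chosen by trial
StreamToFeature(streams, fdir, "storm_drains.shp")  # vector drainage network

# The resulting polylines can then be overlaid with the building-age layer
# digitized from Google Earth historical imagery to flag at-risk urban areas.
```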

Keywords: risk area, DEM, storm water drains, GIS

Procedia PDF Downloads 459
23458 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network

Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Sadiye Didem Boztepe Erkis, Ismail Saritas, Selma Tasdemir

Abstract:

Artificial intelligence applications are commonly used in many fields of industry, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and experiments on materials treated with fire-retardant polishes were carried out with this setup. Using the experimental data, an artificial neural network (ANN) was modeled to evaluate the final cross-sections of the wooden samples remaining after the fire. In the system developed, the initial weight of the samples (ws, g), preliminary cross-section (pcs, mm²), fire time (ft, min) and fire temperature (t, °C) were taken as input parameters, and the final cross-section (fcs, mm²) as the output parameter. When the ANN results and the experimental data are compared after statistical analysis, the two groups are found to be coherent, with no meaningful difference between them. As a result, ANN can be safely used to determine the cross-sections of wooden materials after fire, and it avoids many disadvantages of direct measurement.
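
The exact network architecture is not given in the abstract; a minimal sketch of a comparable ANN regressor, assuming the experimental runs are exported to a CSV with the (hypothetical) column names below, could look like this:

```python
# Minimal sketch: four measured inputs -> final cross-section, using a small
# feed-forward network. Column names and file are assumptions, not the authors' data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

df = pd.read_csv("fire_room_experiments.csv")        # hypothetical export
X = df[["ws_g", "pcs_mm2", "ft_min", "t_C"]].values   # inputs as described above
y = df["fcs_mm2"].values                              # final cross-section

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_tr)

ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X_tr), y_tr)

print("R2 on held-out runs:", r2_score(y_te, ann.predict(scaler.transform(X_te))))
```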

Keywords: artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance

Procedia PDF Downloads 385
23457 Social Media Factor in Security Environment

Authors: Cetin Arslan, Senol Tayan

Abstract:

Social media is one of the most important and effective means of social interaction, in which people create, share and exchange ideas via photos, videos or voice messages. Although there are many communication tools, social media sites are the most prominent: they allow users to articulate themselves in a matter of seconds, all around the world, at almost no expense, and thus became very popular and widespread soon after their emergence. As the usage of social media increases, it becomes an effective instrument in social matters. While it is possible to use social media to emphasize basic human rights and protest the failures of a government, as in the "Arab Spring", it is also possible to spread propaganda and misinformation simply to cause long-lasting insurgency, upheaval, turmoil or disorder, as an instrument of intervention in internal affairs and state sovereignty by hostile groups or countries. Social media certainly has positive effects on democracies, giving people the chance to express themselves and to organize, but its misuse is also obviously common: even a five-minute video can be used to wage a campaign against a country. Although it may look anti-democratic, when the catastrophic effects of misusing social media are considered, this is an area in which serious precautions must be taken without limiting democratic rights, allowing constant and open sharing while preventing criminal acts. This article begins with current developments in social media and gives some examples of its misuse. The second part emphasizes the legal basis that can prevent criminal activities and the upheavals and insurgencies directed against state security. The last part compares the actions of democratic countries and international organizations against such activities and proposes further measures compatible with democratic norms.

Keywords: democracy, disorder, security, social media

Procedia PDF Downloads 366
23456 History of Textiles and Fashion: Gender Symbolism in the Context of Colour

Authors: Damayanthie Eluwawalage

Abstract:

Historically, color-coded attire demarcated differences, for example differences in social position and in gender. Distinctive colors were worn by the different classes in medieval England. By the twentieth century, Western society firmly associated certain colors with a specific gender: pink for girls and blue for boys. This color-coded gender phenomenon was a novelty at the turn of the twentieth century and became widely practiced only after World War II. Prior to that era, there were no distinctions in the dress of younger children in relation to their gender. In the nineteenth century, pink suits were highly acceptable as gentlemen's attire, and Frenchmen in the eighteenth century wore colors in an infinite range of hues, such as pink, plum, white, cream, blue, yellow, puce and sea green. Nineteenth-century European male austerity, expressed primarily through the use of sombre colors such as black, white and grey, has been described as an element of dignity, control and morality. In the nineteenth century there were many color-associated distinctions, as certain colors were reserved for the unmarried, the single or the aged. Two luminous colors in one dress were considered 'vulgar', and yellow was generally regarded as unladylike; yellow was also the color used for most correctional attire, and orange was prohibited for the unmarried. Fashionable dressing in the nineteenth century was more gender-differentiated than in previous centuries. Masculine austerity emphasized a shift in class relations; as a result of that shift, male attire became more uniform, homogeneous and integrated amongst the classes, departing from its traditional hierarchical character.

Keywords: textiles, fashion, gender symbolism, color

Procedia PDF Downloads 492
23455 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date

Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian

Abstract:

To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) for each patient admitted to the hospital. This assignment is largely based on the doctor's judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the Length of Stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared with that of the clinicians using accuracy measures. Overall, the best performing model improved (reduced) the Average Squared Error (ASE) by 38% compared with the first EDD assigned under the present method. Important predictors of the EDD include the provisional diagnosis code, the patient's age, the attending doctor at admission, the medical specialty at admission, the accommodation type, and the patient's mean length of stay over the past year. The predictive model can be used as a tool to accurately predict the EDD.
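
The abstract does not name the learning algorithm; the sketch below shows one way such a length-of-stay model could be set up at admission time. The file, column names, and the gradient-boosting choice are assumptions, not the authors' pipeline.

```python
# Illustrative length-of-stay model at admission; EDD = admission date + prediction.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

adm = pd.read_csv("admissions_2013_2016.csv")            # hypothetical extract
features = ["diagnosis_code", "age", "attending_doctor",
            "specialty", "accommodation_type", "mean_los_past_year"]
X = pd.get_dummies(adm[features], columns=["diagnosis_code", "attending_doctor",
                                           "specialty", "accommodation_type"])
y = adm["length_of_stay_days"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)
model = GradientBoostingRegressor(random_state=42).fit(X_tr, y_tr)

# Average Squared Error on held-out admissions, for comparison with clinician EDDs.
print("ASE of the model:", mean_squared_error(y_te, model.predict(X_te)))
```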

Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven

Procedia PDF Downloads 174
23454 A Method to Estimate Wheat Yield Using Landsat Data

Authors: Zama Mahmood

Abstract:

With the increasing demand for food, monitoring crop growth and forecasting yield well before harvest are very important for food management. Nowadays, yield assessment, together with monitoring of crop development and growth, is carried out with the help of satellite and remote sensing images. Studies using remote sensing data, validated with field surveys, have reported high correlations between vegetation indices and yield. With the development of remote sensing techniques, crop detection using remote sensing data on regional or global scales has become a popular topic in remote sensing applications. Punjab, especially the southern Punjab region, is extremely favourable for wheat production, but measuring the exact amount of wheat production is a tedious job for farmers and workers using traditional ground-based measurements; remote sensing, in contrast, can provide real-time information. In this study, using the Normalized Difference Vegetation Index (NDVI) derived from Landsat satellite images, the yield of wheat was estimated for the 2013-2014 season in the agricultural area around Bahawalpur. The average wheat yield was found to be 35 kg/acre from the analysis of field survey data, and the field survey data are in fair agreement with the NDVI values extracted from the Landsat images. A correlation between wheat production (tons) and the number of wheat pixels was also calculated, showing that the two are proportional to each other. A strong correlation between NDVI and wheat area was also found (R²=0.71), which demonstrates the effectiveness of remote sensing tools for crop monitoring and production estimation.
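
A minimal sketch of the NDVI computation is given below. The band numbering follows Landsat 8 (B4 = red, B5 = near-infrared), and the file names and wheat-masking threshold are placeholders rather than values from the paper.

```python
# NDVI = (NIR - Red) / (NIR + Red) from two Landsat band rasters.
import numpy as np
import rasterio

with rasterio.open("LC08_B4.TIF") as red_src, rasterio.open("LC08_B5.TIF") as nir_src:
    red = red_src.read(1).astype("float64")
    nir = nir_src.read(1).astype("float64")

ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), 0.0)

# Wheat pixels can then be masked with a season-specific NDVI threshold and
# counted, so pixel counts can be regressed against surveyed production.
wheat_mask = ndvi > 0.4            # illustrative threshold, not from the paper
print("wheat pixel count:", int(wheat_mask.sum()))
```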

Keywords: landsat, NDVI, remote sensing, satellite images, yield

Procedia PDF Downloads 335
23453 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges: increasing speed and data processing capacity while keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of these facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting existing ones would be a great advance for this industry. Installing a matrix of temperature sensors distributed through the structure of each server would provide the information required to obtain its temperature profile instantly. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high and expensive. Therefore, other, less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element simulations that solve Burgers' equations by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for the first- and second-order derivatives of the Burgers' equation obtained after these simplifications are the key to obtaining results with greater or lesser accuracy, according to their characteristic truncation errors.
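
The backward/forward/central discretization idea can be illustrated with a simple one-dimensional finite-difference sketch of the viscous Burgers' equation (not the paper's FEM formulation, and with illustrative grid, viscosity and time-step values rather than data-center parameters):

```python
# 1-D viscous Burgers' equation u_t + u*u_x = nu*u_xx, solved explicitly with a
# backward (upwind) difference for convection and a central difference for diffusion.
import numpy as np

nx, nt = 101, 500
dx, dt, nu = 2.0 / (nx - 1), 0.0005, 0.07
u = np.ones(nx)
u[int(0.5 / dx):int(1.0 / dx) + 1] = 2.0          # initial "hot" region

for _ in range(nt):
    un = u.copy()
    # backward difference for u*du/dx, central difference for d2u/dx2
    u[1:-1] = (un[1:-1]
               - un[1:-1] * dt / dx * (un[1:-1] - un[:-2])
               + nu * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2]))

print("profile after", nt, "steps:", u.round(3))
```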

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile

Procedia PDF Downloads 170
23452 Characterization of Brewery Wastewater Composition

Authors: Abimbola M. Enitan, Josiah Adeyemo, Sheena Kumari, Feroz M. Swalaha, Faizal Bux

Abstract:

With competing demands on water resources and water reuse, the discharge of industrial effluents into the aquatic environment has become an important issue. Much attention has been placed worldwide on the impact of industrial wastewater on water bodies, due to the accumulation of organic and inorganic matter in the receiving waters. The scope of the present work is to assess the physico-chemical composition of the wastewater produced by one of the breweries in South Africa, in order to estimate the environmental impact of its discharge into the receiving water bodies or the municipal treatment plant. The parameters monitored for the quantitative analysis of the brewery wastewater include biological oxygen demand (BOD5), chemical oxygen demand (COD), total suspended solids, volatile suspended solids, ammonia, total oxidized nitrogen, nitrate, nitrite, phosphorus, and alkalinity. On average, the COD concentration of the brewery effluent was 5340.97 mg/l, with pH values of 4.0 to 6.7. The BOD and solids content of the wastewater were high, meaning that the effluent is very rich in organic matter and its discharge into water bodies or the municipal treatment plant could cause environmental pollution or damage the treatment plant. In addition, there were variations in wastewater composition throughout the monitoring period, which might be a result of the different activities that take place during the production process, as well as the effect of peak beer-production periods on water usage.

Keywords: brewery wastewater, environmental pollution, industrial effluents, physico-chemical composition

Procedia PDF Downloads 453
23451 Image Ranking to Assist Object Labeling for Training Detection Models

Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman

Abstract:

Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on a U-shaped architecture, which quantifies the presence of unseen data in each image in order to find images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Similar performance was also achieved compared with a model trained on exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model.
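
The iterative selection loop can be sketched as follows. The `novelty_map` placeholder stands in for the U-shaped network described above (its internals are not reproduced here), and the ranking criterion is an illustrative assumption:

```python
# Rank unlabeled images by mean per-pixel novelty and return the top-k for labeling.
import numpy as np

def novelty_map(image: np.ndarray) -> np.ndarray:
    # Placeholder: in the described method this is a deep model trained on the
    # already-labeled subset; here it is faked with random scores in [0, 1).
    return np.random.rand(*image.shape[:2])

def rank_for_labeling(unlabeled: list[np.ndarray], k: int = 10) -> list[int]:
    scores = [float(novelty_map(img).mean()) for img in unlabeled]
    return list(np.argsort(scores)[::-1][:k])    # indices of the most novel images

# Loop: label the top-k images, retrain the novelty model on the enlarged labeled
# set, and repeat until the remaining pool's novelty scores drop below a threshold.
```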

Keywords: computer vision, deep learning, object detection, semiconductor

Procedia PDF Downloads 136
23450 Molecular Engineering of High-Performance Nanofiltration Membranes from Intrinsically Microporous Poly (Ether-Ether-Ketone)

Authors: Mahmoud A. Abdulhamid

Abstract:

Poly(ether-ether-ketone) (PEEK) has received increased attention due to its outstanding performance in different membrane applications, including gas and liquid separation. However, it suffers from a semi-crystalline morphology, poor solubility and low porosity. To fabricate membranes from PEEK, the use of a harsh acid such as sulfuric acid is essential, despite its hazardous properties. In this work, we report the molecular design of poly(ether-ether-ketones) (iPEEKs) with intrinsic porosity, obtained by incorporating kinked units such as spirobisindane, Tröger's base, and triptycene into the PEEK backbone. The porous polymers were used to fabricate stable membranes for organic solvent nanofiltration. To better understand the mechanism, we conducted molecular dynamics simulations to evaluate the possible interactions between the polymers and the solvents. A notable enhancement in separation performance was observed, confirming the importance of molecular engineering of high-performance polymers. The iPEEKs demonstrated good solubility in polar aprotic solvents, a high surface area of 205–250 m² g⁻¹, and excellent thermal stability. Mechanically flexible nanofiltration membranes were prepared from N-methyl-2-pyrrolidone dope solutions at iPEEK concentrations of 19–35 wt%. The molecular weight cut-off of the membranes was fine-tuned in the range of 450–845 g mol⁻¹, displaying 2–6 fold higher permeance (3.57–11.09 L m⁻² h⁻¹ bar⁻¹) than previous reports. Long-term stability was demonstrated by a 7-day continuous cross-flow filtration.

Keywords: molecular engineering, polymer synthesis, membrane fabrication, liquid separation

Procedia PDF Downloads 96
23449 From Text to Data: Sentiment Analysis of Presidential Election Political Forums

Authors: Sergio V Davalos, Alison L. Watkins

Abstract:

User-generated content (UGC), such as website posts, has data associated with it: time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words); in addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences across all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis, and its application produced a sentiment score for each post. Based on these sentiment scores, there are significant differences in content and sentiment between the 2012 and 2016 presidential election forums. In the 2012 forums, 38% started with positive sentiment and 16% with negative sentiment; in the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time: for both elections, as the election drew closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidate; in the case of Trump, the number of negative posts exceeded even Clinton's largest category of posts, which was positive. KNIME topic modeling was used to derive topics from the posts, and there were also changes in topics and keyword emphasis over time: initially the political parties were the most referenced, and as the election got closer the emphasis shifted to the candidates. The SASA method proved to predict sentiment better than four other methods in Sentibench. The research derived sentiment data from text; in combination with other data, the sentiment data provided insight and discovery about user sentiment in the US presidential elections of 2012 and 2016.
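
The post-level scoring and cumulative-sentiment idea can be sketched as follows. SASA itself is not publicly reproduced here, so NLTK's VADER analyzer is used as a stand-in lexicon-based scorer, and the input file and column names are assumptions:

```python
# Score each post, then track cumulative sentiment over time.
import pandas as pd
from nltk.sentiment.vader import SentimentIntensityAnalyzer
# nltk.download("vader_lexicon") is required once before first use.

posts = pd.read_csv("forum_posts_2016.csv", parse_dates=["timestamp"])  # hypothetical
sia = SentimentIntensityAnalyzer()

posts["sentiment"] = posts["text"].apply(lambda t: sia.polarity_scores(t)["compound"])
posts = posts.sort_values("timestamp")
posts["cumulative_sentiment"] = posts["sentiment"].cumsum()

# Plotting cumulative_sentiment against time shows whether the forum drifts
# negative as election day approaches, as reported for both election cycles.
```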

Keywords: sentiment analysis, text mining, user generated content, US presidential elections

Procedia PDF Downloads 192
23448 Green Extraction of Patchoulol from Patchouli Leaves Using Ultrasound-Assisted Ionic Liquids

Authors: G. C. Jadeja, M. A. Desai, D. R. Bhatt, J. K. Parikh

Abstract:

Green extraction techniques are fast making their way into various industrial sectors, due to stringent governmental regulations banning the use of toxic chemicals and to increasing health and environmental awareness. The present work describes an ionic-liquid-based sonication method for selectively extracting patchoulol from the leaves of patchouli. 1-Butyl-3-methylimidazolium tetrafluoroborate ([Bmim]BF4) and N,N,N,N’,N’,N’-hexaethyl-butane-1,4-diammonium dibromide (a dicationic ionic liquid, DIL) were selected for extraction. Ultrasound-assisted ionic liquid extraction was employed, considering the concentration of ionic liquid (4–8 %, w/w), ultrasound power (50–150 W for [Bmim]BF4 and 20–80 W for DIL), temperature (30–50 °C) and extraction time (30–50 min) as the major parameters influencing the yield of patchoulol. The parameters were optimized using the Taguchi method, and analysis of variance (ANOVA) was performed to find the most influential factor in the selected extraction method. For [Bmim]BF4, the optimum conditions were found to be 4 % (w/w) ionic liquid concentration, 50 W power, 30 °C and 30 min extraction time; the yield obtained under these conditions was 3.99 mg/g. For DIL, the optimum conditions were 6 % (w/w) ionic liquid concentration, 80 W power, 30 °C and 40 min extraction time, for which the yield was 4.03 mg/g. Temperature was the most significant factor in both cases. Extraction time was insignificant when extracting the product with [Bmim]BF4, whereas for DIL, power was the least significant factor affecting the process. Thus, a green method of recovering patchoulol is proposed.
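
The ANOVA step on a Taguchi-style design can be sketched as below. The data layout (one row per experimental run) and the factor and column names are assumptions, not the authors' worksheet:

```python
# ANOVA over the four factors of a Taguchi design; the factor with the
# largest F statistic is the most influential.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

runs = pd.read_csv("bmimbf4_taguchi_runs.csv")   # hypothetical run results
# expected columns: il_conc (% w/w), power (W), temp (C), time (min), yield_mg_g

model = smf.ols("yield_mg_g ~ C(il_conc) + C(power) + C(temp) + C(time)",
                data=runs).fit()
print(anova_lm(model, typ=2))
```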

Keywords: green extraction, ultrasound, patchoulol, ionic liquids

Procedia PDF Downloads 363
23447 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility

Authors: Alejandro Villegas, Cihan Varol

Abstract:

Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without the user names and passwords, this forensically important information can still be gathered by investigators. Although a few VoIP forensic investigative applications are available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data-gathering solution that retrieves usernames, contact lists, and call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs and user roaming profiles in Windows and Mac OS X operating systems, and subsequently parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, owing to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper reveals the exploratory analysis performed in order to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.

Keywords: betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise

Procedia PDF Downloads 297
23446 Ion Thruster Grid Lifetime Assessment Based on Its Structural Failure

Authors: Juan Li, Jiawen Qiu, Yuchuan Chu, Tianping Zhang, Wei Meng, Yanhui Jia, Xiaohui Liu

Abstract:

This article develops a numerical 3D model of sputter erosion depth in an ion thruster optical system using the IFE-PIC (Immersed Finite Element Particle-in-Cell) and Monte Carlo methods, and calculates the sputter erosion rate of the downstream surface of the accelerator grid. Compared with LIPS-200 life test data, the results of the numerical model are in reasonable agreement with the measured data. Finally, we predict the lifetime of the 20 cm diameter ion thruster from the erosion data obtained with the model. The result demonstrates that, under normal operating conditions, the erosion rate of the grooves worn into the downstream surface of the accelerator grid is 34.6 μm/1000 h, which means the conservative lifetime before structural failure of the accelerator grid is 11500 hours.
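
A quick arithmetic check of the stated figures, assuming the groove erosion proceeds linearly at the model's rate until the grid's structural limit:

```python
# Implied groove depth at structural failure under the stated erosion rate.
rate_um_per_kh = 34.6          # μm per 1000 h (from the model)
lifetime_h = 11500             # predicted conservative lifetime
depth_at_failure_um = rate_um_per_kh * lifetime_h / 1000
print(f"implied groove depth at failure ≈ {depth_at_failure_um:.0f} μm")
# ≈ 398 μm, i.e. roughly 0.4 mm of accelerator-grid material worn through.
```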

Keywords: ion thruster, accelerator gird, sputter erosion, lifetime assessment

Procedia PDF Downloads 565
23445 Nutrient Foramina of the Lunate Bone of the Hand – an Anatomical Study

Authors: P.J. Jiji, B.V. Murlimanju, Latha V. Prabhu, Mangala M. Pai

Abstract:

Background: Dislocation of the lunate bone can lead to compression of the median nerve and subsequent carpal tunnel syndrome. The dislocation can also interrupt the vasculature and cause avascular necrosis. The objective of the present study was to examine the morphology and number of nutrient foramina in dried cadaveric lunate bones of the Indian population. Methods: The present study included 28 lunate bones (13 right-sided and 15 left-sided) obtained from the gross anatomy laboratory of our institution. The bones were macroscopically observed for nutrient foramina, and data were collected with respect to their number; the data were then tabulated and analyzed. Results: All of the specimens (100%) exhibited nutrient foramina over the non-articular surfaces. The foramina were observed only over the palmar and dorsal surfaces of the lunate bones, and their number ranged between 2 and 10. The foramina were more numerous over the dorsal surface (average 3.3) than over the palmar surface (average 2.4). Conclusion: We believe that the present study provides important data about the nutrient foramina of the lunate bone. These data are informative for the orthopedic surgeon and would help in hand surgeries. Morphological knowledge of the vasculature, the foramina of entry and their number is required to understand the concepts in lunatomalacia and Kienbock's disease.

Keywords: avascular necrosis, foramen, lunate, nutrient

Procedia PDF Downloads 244
23444 Efficacy of Technology for Successful Learning Experience; Technology Supported Model for Distance Learning: Case Study of Botho University, Botswana

Authors: Ivy Rose Mathew

Abstract:

The purpose of this study is to outline the efficacy of technology and the opportunities it can bring to implementing a successful delivery model in distance learning. Distance learning has proliferated over the past few years across the world. Some of the challenges faced by current students of distance education include a lack of motivation, a sense of isolation and a need for greater and improved communication. Hence, the author proposes a creative, technology-supported model for distance learning, closely mirroring traditional face-to-face learning, that can be adopted by distance learning providers. This model suggests the use of a range of technologies and social networking facilities, with the aim of creating a more engaging and sustaining learning environment to help overcome the isolation often noted by distance learners. While discussing the possibilities, the author also highlights the complexity and practical challenges of implementing such a model. Design/methodology/approach: Theoretical issues from previous research related to successful models for distance learning providers are considered, along with the analysis of a case study from one of the largest private tertiary institutions in Botswana, Botho University. This case study illustrates important aspects of the distance learning delivery model and provides insights into how curriculum development is planned, quality assurance is done, and learner support is assured for a successful distance learning experience. Research limitations/implications: While some aspects of this study may not be applicable to other contexts, a number of new providers of distance learning can adapt the key principles of this delivery model.

Keywords: distance learning, efficacy, learning experience, technology supported model

Procedia PDF Downloads 247
23443 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit

Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana

Abstract:

The Internet of Things (IoT) and edge computing have become some of the most discussed innovations, with the potential to improve and disrupt traditional business and industry alike. New challenges, such as the COVID-19 pandemic, have posed dangers to the workforce and to established business processes. Alongside the drastically changed business landscape left in the aftermath of the global pandemic, there are looming threats of a global energy crisis, global warming, and increasingly heated global politics that risk developing into a new Cold War. Emerging technologies such as edge computing, together with the use of specially designed visual processing units, therefore present great opportunities for business. The literature is reviewed on how the Internet of Things and this disruptive wave will affect business, explaining how these developments impact current operations and how businesses need to adapt to changes in the market and the world. An example benchmarking test is presented for newer consumer-market devices: Internet of Things devices equipped with edge computing hardware, which can increase efficiency and reduce the risks posed by current and looming crises. Throughout the paper, the enabling technologies and the current situation are explained, showing why these technologies will change traditional practice, with brief introductions to cloud computing, edge computing and the Internet of Things and how they lead into the future.

Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification

Procedia PDF Downloads 156
23442 Big Data Applications for the Transport Sector

Authors: Antonella Falanga, Armando Cartenì

Abstract:

Today, our lives are characterized by an unprecedented amount of data coming from several sources, including mobile devices, sensors, tracking systems, and online platforms. The term "big data" refers not only to the quantity of data but also to the variety and speed of data generation. These data hold valuable insights that, when extracted and analyzed, facilitate informed decision-making. The 4Vs of big data - velocity, volume, variety, and value - highlight essential aspects, showcasing the rapid generation, vast quantities, diverse sources, and potential value addition of these kinds of data. This surge of information has revolutionized many sectors: business, through improved decision-making processes; healthcare, through clinical record analysis and medical research; education, through enhanced teaching methodologies; agriculture, through optimized crop management; finance, through risk assessment and fraud detection; media and entertainment, through personalized content recommendations; emergency management, through real-time response during crises and events; and also mobility, through urban planning and the design and management of public and private transport services. Big data's pervasive impact enhances societal aspects, elevating the quality of life, service efficiency, and problem-solving capacities. However, during this transformative era, new challenges arise, including data quality, privacy, data security, cybersecurity, interoperability, the need for advanced infrastructures, and staff training. Within the transportation sector (the one investigated in this research), applications span the planning, design, and management of systems and mobility services. Among the most common big data applications within the transport sector are real-time traffic monitoring, bus and freight vehicle route optimization, vehicle maintenance, road safety, and autonomous and connected vehicle applications. Benefits include reductions in travel times, road accidents, and pollutant emissions. Within these issues, proper transport demand estimation is crucial for sustainable transportation planning: evaluating the impact of sustainable mobility policies starts with a quantitative analysis of travel demand, and achieving transportation decarbonization goals hinges on precise estimations of demand for individual transport modes. Emerging technologies, offering substantial big data at lower costs than traditional methods, play a pivotal role in this context. Starting from these considerations, this study explores the usefulness of big data for transport demand estimation, focusing on leveraging (big) data collected during the COVID-19 pandemic to estimate the evolution of mobility demand in Italy. Estimation results reveal that, in the post-COVID-19 era, there are more than 96 million national daily trips, about 2.6 trips per capita, with a mobile population of more than 37.6 million Italian travelers per day. Overall, this research allows us to conclude that big data enhances rational decision-making for mobility demand estimation, which is imperative for adeptly planning and allocating investments in transportation infrastructures and services.

Keywords: big data, cloud computing, decision-making, mobility demand, transportation

Procedia PDF Downloads 64
23441 Evaluation of Liquid Fermentation Strategies to Obtain a Biofertilizer Based on Rhizobium sp.

Authors: Andres Diaz Garcia, Ana Maria Ceballos Rojas, Duvan Albeiro Millan Montano

Abstract:

This paper describes the initial technological development stages, in the area of liquid fermentation, required to produce the quantities of biomass of the biofertilizer microorganism Rhizobium sp. strain B02 needed for the application of the downstream unit operations at laboratory scale. In the first stage, the fermentation process was adjusted and standardized in conventional batch mode. In the second stage, various fed-batch and continuous fermentation strategies were evaluated in a 10 L bioreactor in order to optimize the yields in concentration (colony forming units/ml·h) and biomass (g/l·h) and make the application of the downstream unit operations feasible. The growth kinetics, the evolution of dissolved oxygen and the pH profile generated in each of the strategies were monitored and used to make sequential adjustments. Once fermentation was finished, the final concentration and viability of the biomass obtained were determined, and performance parameters were calculated in order to select the operating conditions that significantly improved on the baseline results. Under the adjusted and standardized batch conditions, concentrations of 6.67E9 CFU/ml were reached after 27 hours of fermentation, followed by a noticeable decrease associated with basification of the culture medium. By applying fed-batch and continuous strategies, significant increases in yield were achieved, but at similar concentration levels, which led to the design of several production scenarios based on the availability of equipment usage time and the required batch volume.

Keywords: biofertilizer, liquid fermentation, Rhizobium sp., standardization of processes

Procedia PDF Downloads 177
23440 ISME: Integrated Style Motion Editor for 3D Humanoid Character

Authors: Ismahafezi Ismail, Mohd Shahrizal Sunar

Abstract:

The motion of a realistic 3D humanoid character is very important, especially for the industries developing computer animations and games. However, this type of motion involves very complex, high-dimensional data describing body position, orientation, and joint rotation. The Integrated Style Motion Editor (ISME) is a method for altering the 3D humanoid motion capture data used in computer animation and games development. This study was therefore carried out to demonstrate a method that can manipulate and deform different motion styles by integrating a Key Pose Deformation Technique and a Trajectory Control Technique. This motion editing method allows the user to generate new motions from the original motion capture data using a simple interface control. Unlike previous methods, our method produces a realistic humanoid motion style in real time.

Keywords: computer animation, humanoid motion, motion capture, motion editing

Procedia PDF Downloads 382
23439 Effect of Traffic Volume and Its Composition on Vehicular Speed under Mixed Traffic Conditions: A Kriging Based Approach

Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh

Abstract:

The use of speed prediction models sometimes appears to be a feasible alternative to laborious field measurement, particularly when field data cannot fulfill the designer's requirements. However, developing speed models is a challenging task, specifically in the context of developing countries like India, where vehicles with diverse static and dynamic characteristics use the same right of way without any segregation; here, the traffic composition plays a significant role in determining vehicular speed. The present research examines the effects of traffic volume and its composition on vehicular speed under mixed traffic conditions. Classified traffic volume and speed data were collected from geometrically identical six-lane divided arterials in New Delhi. Based on these field data, speed prediction models were developed for each vehicle category by adopting the Kriging approximation technique, an alternative to commonly used regression. The models were validated against a data set kept aside for validation purposes; the predicted speeds showed a great deal of agreement with the observed values, and the models outperform all other existing speed models. Finally, the proposed models were used to evaluate the effect of traffic volume and its composition on speed.
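
A minimal sketch of a Kriging-type speed model for one vehicle category is given below, using scikit-learn's Gaussian process regressor as the Kriging surrogate; the input file and predictor columns are assumptions based on the description above:

```python
# Kriging (Gaussian process regression) of car speed on volume and composition shares.
import pandas as pd
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split

obs = pd.read_csv("arterial_speed_observations.csv")   # hypothetical field extract
X = obs[["total_volume_vph", "share_cars", "share_two_wheelers",
         "share_heavy_vehicles"]].values
y = obs["car_speed_kmph"].values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_tr, y_tr)
print("held-out R2:", gp.score(X_te, y_te))
```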

Keywords: speed, Kriging, arterial, traffic volume

Procedia PDF Downloads 353
23438 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO

Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu

Abstract:

Creating a personalized statistical profile of an individual within the population using IT systems, based on the searches and spheres of interest they manifest, is just one 'atom' of an artificial intelligence analysis network. However, the ability to generate statistics from individual data intercepted over large demographic areas leads to reasoning like that of a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that allows driving events caused by a driver to be intercepted and positioned in time and space. The device's connection to the vehicle creates a source of data whose analysis can produce psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driver behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.

Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO

Procedia PDF Downloads 91
23437 Impact of Transitioning to Renewable Energy Sources on Key Performance Indicators and Artificial Intelligence Modules of Data Center

Authors: Ahmed Hossam ElMolla, Mohamed Hatem Saleh, Hamza Mostafa, Lara Mamdouh, Yassin Wael

Abstract:

Artificial intelligence (AI) is reshaping industries, and its potential to revolutionize renewable energy and data center operations is immense. By harnessing AI's capabilities, we can optimize energy consumption, predict fluctuations in renewable energy generation, and improve the efficiency of data center infrastructure. This convergence of technologies promises a future where energy is managed more intelligently, sustainably, and cost-effectively. The integration of AI into renewable energy systems unlocks a wealth of opportunities. Machine learning algorithms can analyze vast amounts of data to forecast weather patterns, solar irradiance, and wind speeds, enabling more accurate energy production planning. AI-powered systems can optimize energy storage and grid management, ensuring a stable power supply even during intermittent renewable generation. Moreover, AI can identify maintenance needs for renewable energy infrastructure, preventing costly breakdowns and maximizing system lifespan. Data centers, which consume substantial amounts of energy, are prime candidates for AI-driven optimization. AI can analyze energy consumption patterns, identify inefficiencies, and recommend adjustments to cooling systems, server utilization, and power distribution. Predictive maintenance using AI can prevent equipment failures, reducing energy waste and downtime. Additionally, AI can optimize data placement and retrieval, minimizing energy consumption associated with data transfer. As AI transforms renewable energy and data center operations, modified Key Performance Indicators (KPIs) will emerge. Traditional metrics like energy efficiency and cost-per-megawatt-hour will continue to be relevant, but additional KPIs focused on AI's impact will be essential. These might include AI-driven cost savings, predictive accuracy of energy generation and consumption, and the reduction of carbon emissions attributed to AI-optimized operations. By tracking these KPIs, organizations can measure the success of their AI initiatives and identify areas for improvement. Ultimately, the synergy between AI, renewable energy, and data centers holds the potential to create a more sustainable and resilient future. By embracing these technologies, we can build smarter, greener, and more efficient systems that benefit both the environment and the economy.

Keywords: data center, artificial intelligence, renewable energy, energy efficiency, sustainability, optimization, predictive analytics, energy consumption, energy storage, grid management, data center optimization, key performance indicators, carbon emissions, resiliency

Procedia PDF Downloads 35
23436 dynr.mi: An R Program for Multiple Imputation in Dynamic Modeling

Authors: Yanling Li, Linying Ji, Zita Oravecz, Timothy R. Brick, Michael D. Hunter, Sy-Miin Chow

Abstract:

Assessing several individuals intensively over time yields intensive longitudinal data (ILD). Even though ILD provide rich information, they also bring other data analytic challenges. One of these is the increased occurrence of missingness with increased study length, possibly under non-ignorable missingness scenarios. Multiple imputation (MI) handles missing data by creating several imputed data sets, and pooling the estimation results across imputed data sets to yield final estimates for inferential purposes. In this article, we introduce dynr.mi(), a function in the R package, Dynamic Modeling in R (dynr). The package dynr provides a suite of fast and accessible functions for estimating and visualizing the results from fitting linear and nonlinear dynamic systems models in discrete as well as continuous time. By integrating the estimation functions in dynr and the MI procedures available from the R package, Multivariate Imputation by Chained Equations (MICE), the dynr.mi() routine is designed to handle possibly non-ignorable missingness in the dependent variables and/or covariates in a user-specified dynamic systems model via MI, with convergence diagnostic check. We utilized dynr.mi() to examine, in the context of a vector autoregressive model, the relationships among individuals’ ambulatory physiological measures, and self-report affect valence and arousal. The results from MI were compared to those from listwise deletion of entries with missingness in the covariates. When we determined the number of iterations based on the convergence diagnostics available from dynr.mi(), differences in the statistical significance of the covariate parameters were observed between the listwise deletion and MI approaches. These results underscore the importance of considering diagnostic information in the implementation of MI procedures.
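
dynr.mi itself is an R routine built on dynr and MICE; as a rough Python analogue of the impute-then-pool workflow it describes, the sketch below runs several stochastic imputations and pools the point estimates. The data file, variable names, and the simple linear fit (in place of the vector autoregressive model) are assumptions:

```python
# Multiple imputation then pooling (Rubin-style point estimates) on ILD-like data.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LinearRegression

ild = pd.read_csv("ambulatory_ild.csv")      # hypothetical intensive longitudinal data
cols = ["heart_rate", "valence", "arousal"]

estimates = []
for m in range(5):                           # m imputed data sets
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    completed = pd.DataFrame(imp.fit_transform(ild[cols]), columns=cols)
    fit = LinearRegression().fit(completed[["valence", "arousal"]],
                                 completed["heart_rate"])
    estimates.append(fit.coef_)

pooled = np.mean(estimates, axis=0)          # pooled point estimates across imputations
print("pooled coefficients (valence, arousal):", pooled.round(3))
```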

Keywords: dynamic modeling, missing data, mobility, multiple imputation

Procedia PDF Downloads 164
23435 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

By combining various sensors with data fusion methods, the detection of potential nuclear threats can be significantly enhanced, since more information is extracted from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of them carrying a radioactive source. An algorithm was developed for identifying the contaminated robot through the correlation between the camera-derived robot positions and the detector data: the computer vision method extracts the movements of all robots in the XY plane coordinate system, while the detector system records the gamma-ray count. The positions of the robots and the corresponding counts from the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. Such modeling techniques prove valuable for designing different scenarios and intelligent systems before initiating any experiments.
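
The attribution step can be sketched as follows: for each camera-tracked robot, predict the count rate a source on that robot would produce at the detector (inverse square law) and correlate it with the measured NaI count series; the robot with the highest correlation carries the source. The geometry, file names, and the inverse-square simplification are illustrative, not the experiment's exact algorithm:

```python
import numpy as np

detector = np.array([0.0, 0.0])                                   # detector position (m)
counts = np.loadtxt("nai_counts.csv", delimiter=",")              # measured counts per frame
tracks = {i: np.loadtxt(f"robot{i}_xy.csv", delimiter=",")        # (frames, 2) positions
          for i in range(3)}

def expected_counts(track: np.ndarray) -> np.ndarray:
    d2 = np.sum((track - detector) ** 2, axis=1)
    return 1.0 / np.maximum(d2, 1e-6)                             # proportional to 1/r^2

scores = {i: np.corrcoef(expected_counts(t), counts)[0, 1] for i, t in tracks.items()}
print("contaminated robot:", max(scores, key=scores.get), scores)
```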

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 124
23434 Annual Water Level Simulation Using Support Vector Machine

Authors: Maryam Khalilzadeh Poshtegal, Seyed Ahmad Mirbagheri, Mojtaba Noury

Abstract:

In this paper, using yearly input data of rainfall, temperature and inflow to Lake Urmia, the simulation of water level fluctuation was carried out by means of three models. In the context of climate change investigations, fluctuations of lake water levels are of high interest. This study investigates data-driven models: the support vector machine (SVM), a relatively new regression procedure in water resources, is applied to the yearly level data of Lake Urmia, the biggest and a hypersaline lake in Iran. The evaluated lake levels are found to be in good correlation with the observed values, and the results of the SVM simulation show better accuracy and implementation. The mean square error, mean absolute relative error and determination coefficient statistics are used as comparison criteria.
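
A minimal sketch of the SVM regression described above is given below; the input file, column names, hyperparameters and train/test split are illustrative assumptions:

```python
# Support vector regression of annual lake level on rainfall, temperature and inflow.
import pandas as pd
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

annual = pd.read_csv("urmia_annual.csv")     # hypothetical: rainfall, temperature, inflow, lake_level
X = annual[["rainfall", "temperature", "inflow"]].values
y = annual["lake_level"].values

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
svr.fit(X[:-10], y[:-10])                    # train on all but the last 10 years
pred = svr.predict(X[-10:])

print("MSE:", mean_squared_error(y[-10:], pred))
print("MAE:", mean_absolute_error(y[-10:], pred))
print("R2:", r2_score(y[-10:], pred))
```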

Keywords: simulation, water level fluctuation, urmia lake, support vector machine

Procedia PDF Downloads 368
23433 Estimation of State of Charge, State of Health and Power Status for the Li-Ion Battery On-Board Vehicle

Authors: S. Sabatino, V. Calderaro, V. Galdi, G. Graber, L. Ippolito

Abstract:

Climate change is a rapidly growing global threat caused mainly by increased emissions of carbon dioxide (CO₂) into the atmosphere. These emissions come from multiple sources, including industry, power generation, and the transport sector. The need to tackle climate change and reduce CO₂ emissions is indisputable. A crucial solution for achieving decarbonization in the transport sector is the adoption of electric vehicles (EVs). These vehicles use lithium-ion (Li-Ion) batteries as an energy source, making them extremely efficient and with low direct emissions. However, Li-Ion batteries are not without problems, including the risk of overheating and performance degradation. To ensure their safety and longevity, it is essential to use a battery management system (BMS). The BMS constantly monitors battery status and adjusts temperature and cell balance, ensuring optimal performance and preventing dangerous situations; based on this monitoring, it is also able to manage the battery optimally to increase its life. Among the parameters monitored by the BMS, the main ones are State of Charge (SoC), State of Health (SoH), and State of Power (SoP). The evaluation of these parameters can be carried out in two ways: offline, using benchtop batteries tested in the laboratory, or online, using batteries installed in moving vehicles. Online estimation is the preferred approach, as it relies on capturing real-time data from batteries operating in real-life situations, such as everyday EV use. Actual battery usage conditions are highly variable: moving vehicles are exposed to a wide range of factors, including temperature variations, different driving styles, and complex charge/discharge cycles. This variability is difficult to replicate in a controlled laboratory environment and can greatly affect battery performance and life. Online estimation captures this variety of conditions, providing a more accurate assessment of battery behavior in real-world situations. In this article, a hybrid approach based on a neural network and a statistical method is proposed for the real-time estimation of the SoC, SoH, and SoP parameters of interest. These parameters are estimated from the analysis of a one-day driving profile of an electric vehicle, assumed to be divided into the following four phases: (i) partial discharge (SoC 100% - SoC 50%), (ii) partial charge (SoC 50% - SoC 80%), (iii) deep discharge (SoC 80% - SoC 30%), (iv) full charge (SoC 30% - SoC 100%). The neural network predicts the values of ohmic resistance and incremental capacity, while the statistical method is used to estimate the parameters of interest; this reduces the complexity of the model and improves its prediction accuracy. The effectiveness of the proposed model is evaluated by analyzing its performance in terms of root mean square error (RMSE) and mean absolute percentage error (MAPE) and comparing it with a reference method found in the literature.
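
The bookkeeping side of such an estimator can be sketched as below: a coulomb-counting SoC update and a resistance-limited discharge-power (SoP) estimate. The neural-network prediction of ohmic resistance and incremental capacity is not reproduced, and all numbers are placeholders:

```python
def update_soc(soc: float, current_a: float, dt_s: float, capacity_ah: float) -> float:
    """Coulomb counting: positive current = discharge."""
    return soc - (current_a * dt_s) / (capacity_ah * 3600.0)

def discharge_sop(v_oc: float, v_min: float, r_ohm: float) -> float:
    """Maximum discharge power before the terminal voltage hits v_min."""
    i_max = (v_oc - v_min) / r_ohm
    return v_min * i_max

soc = update_soc(soc=0.80, current_a=30.0, dt_s=1.0, capacity_ah=60.0)
sop = discharge_sop(v_oc=3.7, v_min=3.0, r_ohm=0.02)   # per cell, illustrative values
print(f"SoC after 1 s at 30 A: {soc:.4f}, discharge SoP: {sop:.0f} W")
```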

Keywords: electric vehicle, Li-Ion battery, BMS, state-of-charge, state-of-health, state-of-power, artificial neural networks

Procedia PDF Downloads 67
23432 Dynamic Mode Decomposition and Wake Flow Modelling of a Wind Turbine

Authors: Nor Mazlin Zahari, Lian Gan, Xuerui Mao

Abstract:

The power production in wind farms and the mechanical loads on the turbines are strongly impacted by the wake of the wind turbine. Thus, there is a need to understand and model turbine wake dynamics in the wind farm for layout optimization, and a good wake model is important for predicting plant performance and understanding fatigue loads. In this paper, Dynamic Mode Decomposition (DMD) is applied to simulation data generated by a Direct Numerical Simulation (DNS) of the flow around a turbine, perturbed by upstream inflow noise. This technique is useful for analyzing the wake flow, predicting its future states, and capturing the flow dynamics associated with the coherent structures behind the wind turbine wake. DMD was employed to describe the dynamics of the flow around the turbine from the DNS data. Since the DNS data come with unstructured meshes and a non-uniform grid, interpolation within each element of the data onto an evenly spaced mesh was performed before DMD was applied. The DMD analyses reveal the characteristics of the travelling waves behind the turbine, e.g., the dominant helical flow structures and the corresponding frequencies. As a result, the dominant frequency is detected and the associated spatial structure identified; the dynamic mode representing the coherent structure is presented.
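
A minimal sketch of exact DMD on an interpolated, evenly spaced snapshot matrix is given below; the data loading, snapshot spacing, truncation rank and the mode-norm dominance criterion are illustrative assumptions:

```python
# Exact DMD: eigenvalues give growth/frequency, modes give the spatial structures.
import numpy as np

X = np.load("wake_snapshots.npy")        # shape (n_points, n_snapshots), from the DNS
dt = 0.01                                # snapshot spacing after interpolation

X1, X2 = X[:, :-1], X[:, 1:]
U, s, Vh = np.linalg.svd(X1, full_matrices=False)
r = 20                                   # truncation rank (illustrative)
Ur, Sr, Vr = U[:, :r], np.diag(s[:r]), Vh.conj().T[:, :r]

A_tilde = Ur.conj().T @ X2 @ Vr @ np.linalg.inv(Sr)   # reduced linear operator
eigvals, W = np.linalg.eig(A_tilde)
modes = X2 @ Vr @ np.linalg.inv(Sr) @ W               # exact DMD modes
freqs = np.angle(eigvals) / (2 * np.pi * dt)          # associated frequencies (Hz)

dominant = np.argmax(np.linalg.norm(modes, axis=0))   # simple proxy for dominance
print("dominant frequency (Hz):", freqs[dominant])
```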

Keywords: coherent structure, Direct Numerical Simulation (DNS), dominant frequency, Dynamic Mode Decomposition (DMD)

Procedia PDF Downloads 348
23431 Embolization of Spinal Dural Arteriovenous Fistulae: Clinical Outcomes and Long-Term Follow-Up: A Multicenter Study

Authors: Walid Abouzeid, Mohamed Shadad, Mostafa Farid, Magdy El Hawary

Abstract:

The most frequent treatable vascular abnormality of the spinal canal is the spinal dural arteriovenous fistula (SDAVF), which causes progressive para- or quadriplegia and mostly affects elderly males. SDAVFs are typically present in the thoracolumbar region. The main goal of treatment must be to obliterate the shunting zone via superselective embolization with a liquid embolic agent. This study aims to evaluate the endovascular technique as a safe and efficient approach for the treatment of SDAVFs, with particular attention to long-term clinical outcomes. Study Design: A retrospective clinical case study. From May 2010 to May 2017, 15 patients who had symptoms attributed to SDAVFs underwent the procedure in the Departments of Neurosurgery at Suhag, Tanta, and Al-Azhar Universities and in Interventional Radiology, Ain Shams University. All the patients had varying degrees of progressive spastic paraparesis, with or without sphincteric disturbances. Endovascular embolization was used in all cases. Fourteen patients were male, with ages ranging from 45 to 74 years. After treatment, a good outcome was found in five patients (33.3%), a moderate outcome in six patients (40%), and a poor outcome in four patients (26.7%). Spinal AVFs can be treated safely and effectively by the endovascular approach. Generally, there is no correlation between the disappearance of MRI abnormalities and significant clinical improvement. The pre-treatment clinical state of the patient is directly proportional to the clinical outcome. Because responses can be unexpected, embolization should be attempted even if the patient is in a poor clinical condition.

Keywords: spine, arteriovenous, fistula, endovascular, embolization

Procedia PDF Downloads 108
23430 Forensic Methods Used for the Verification of the Authenticity of Prints

Authors: Olivia Rybak-Karkosz

Abstract:

This paper presents the results of scientific research on methods of forging art prints and their elements, such as signatures or provenance, and on the forensic science methods that might be used to verify their authenticity. In recent decades, the art market has seen significant interest in purchasing prints; they are considered an economical alternative to paintings and a considerable investment. However, the authenticity of an art print is difficult to establish, as similar visual effects can be achieved with drawings or xerographic copies. The latter are easy to make with a home printer and are then offered at flea markets or in internet auctions as genuine prints. This apparent ease of forgery, together with the difficulty of distinguishing art print techniques, was the main reason this research was undertaken. The lack of scientific methods dedicated to disclosing such forgeries encouraged the author to examine the possibility of using forensic science methods known and used in other fields of expertise. The research methodology consisted of assembling representative forgery samples collected in selected museums in Poland and a few in Germany and Austria, which allowed the author to present a typology of methods used to forge art prints. Given that banknotes and securities are among the most famous examples of graphic design, it seems appropriate to propose, for print verification, the use of methods for detecting counterfeit currency: examination of ink, paper, and watermarks. On prints, signatures and imprints of stamps, etc., are forged as well, so the examination should be complemented with handwriting examination and forensic sphragistics. The paper concludes with a recommendation to conduct a comprehensive analysis of authenticity with the participation of an art restorer, an art historian, and a forensic expert as head of the team.

Keywords: art forgery, examination of an artwork, handwriting analysis, prints

Procedia PDF Downloads 129