Search results for: incomplete count data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25305

24105 Churn Prediction for Telecommunication Industry Using Artificial Neural Networks

Authors: Ulas Vural, M. Ergun Okay, E. Mesut Yildiz

Abstract:

Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is suitable for analysis by machine learning methods. In this study, customers' expenditure data are analyzed using an artificial neural network (ANN). The ANN model is applied to the data of customers with different billing durations. The proposed model predicts churn probabilities with 83% accuracy from only three months of expenditure data, and the prediction accuracy increases to 89% when nine months of data are used. The experiments also show that the accuracy of the ANN model increases on an extended feature set that includes information on changes in bill amounts.
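
The abstract's key empirical finding is that appending bill-amount changes to the raw expenditure features improves the ANN's accuracy. A minimal sketch of that feature construction (the abstract does not describe the exact preprocessing, so this layout is an assumption):

```python
def make_features(bills, with_changes=False):
    """Build a feature vector from a customer's monthly bills."""
    # Base features: the raw monthly bill amounts (3 or 9 months).
    feats = list(bills)
    if with_changes:
        # Extended feature set: month-to-month changes in the bill
        # amounts, which the study reports improves accuracy.
        feats += [b - a for a, b in zip(bills, bills[1:])]
    return feats
```

The resulting vectors would then be fed to the ANN classifier in the usual way.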

Keywords: customer relationship management, churn prediction, telecom industry, deep learning, artificial neural networks

Procedia PDF Downloads 139
24104 The Face Sync-Smart Attendance

Authors: Bekkem Chakradhar Reddy, Y. Soni Priya, Mathivanan G., L. K. Joshila Grace, N. Srinivasan, Asha P.

Abstract:

Currently, there are many problems related to marking attendance in schools, offices, and other places. Organizations tasked with collecting daily attendance data have numerous concerns. There are different ways to mark attendance. The most commonly used method is collecting data manually by calling each student; this is slow and error-prone. There are now many technologies that help mark attendance automatically, reducing work and recording the data. We propose to implement attendance marking using the latest technologies. We have implemented a system based on face identification and analysis. The project is developed by gathering faces and analyzing the data, using deep learning algorithms to recognize faces effectively. The data is recorded and forwarded to the host by email. The project was implemented in Python; the libraries used are OpenCV (cv2), face_recognition, and smtplib.
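
The face-matching stage itself would use the face_recognition library (e.g. comparing camera encodings against enrolled ones) on cv2 frames; as a hedged sketch of the reporting stage only, using the standard library, with all names and the message layout being assumptions rather than the authors' design:

```python
from email.message import EmailMessage

def attendance_report(present, roster):
    # Mark each enrolled name present or absent for the day,
    # based on which faces the camera stage recognized.
    return "\n".join(
        f"{name}: {'present' if name in present else 'absent'}"
        for name in sorted(roster))

def build_mail(report, sender, host_addr):
    # Wrap the report in an email message; actually sending it
    # would use smtplib.SMTP(...).send_message(msg), as in the paper.
    msg = EmailMessage()
    msg["Subject"] = "Daily attendance"
    msg["From"] = sender
    msg["To"] = host_addr
    msg.set_content(report)
    return msg
```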

Keywords: python, deep learning, face recognition, OpenCV, smtplib, dlib

Procedia PDF Downloads 47
24103 Geographical Data Visualization Using Video Games Technologies

Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava

Abstract:

In this paper, we present advances in implementing a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from Mexico's National Institute of Statistics and Geography (INEGI). We select a place of interest from the Landsat imagery and apply some processing to the image (rotations, atmospheric correction, and enhancement). The resulting image is our grayscale color map, to be fused with the LIDAR data, which was selected using the same coordinates as the Landsat scene. The LIDAR data is translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They would download the software and the images corresponding to a geological place of interest to a smartphone and could virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
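
The two core data steps are rescaling LIDAR elevations to 8-bit raw values and fusing them per pixel with the grayscale Landsat color map. A minimal sketch (the per-pixel pairing into a color-plus-height texture for Unity is an assumption about the fusion format):

```python
def to_8bit(values):
    # Linearly rescale LIDAR elevations into the 0-255 range,
    # matching the paper's "translated to 8-bit raw data" step.
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1
    return [round(255 * (v - lo) / span) for v in values]

def fuse(gray, heights):
    # Pair each grayscale texel with its 8-bit height, so an engine
    # like Unity can use one channel as a color map and the other
    # as a height map.
    assert len(gray) == len(heights)
    return list(zip(gray, to_8bit(heights)))
```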

Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material

Procedia PDF Downloads 238
24102 Nonparametric Sieve Estimation with Dependent Data: Application to Deep Neural Networks

Authors: Chad Brown

Abstract:

This paper establishes general conditions for the convergence rates of nonparametric sieve estimators with dependent data. We present two key results: one for nonstationary data and another for stationary mixing data. Previous theoretical results often lack practical applicability to deep neural networks (DNNs). Using these conditions, we derive convergence rates for DNN sieve estimators in nonparametric regression settings with both nonstationary and stationary mixing data. The DNN architectures considered adhere to current industry standards, featuring fully connected feedforward networks with rectified linear unit activation functions, unbounded weights, and a width and depth that grow with sample size.
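
The architecture class described is a fully connected ReLU feedforward network whose width and depth grow with the sample size n. A sketch of that idea (the particular growth rates, sqrt(n) width and log(n) depth, are illustrative choices, not the paper's exact rate conditions):

```python
import math

def sieve_layers(n, width_exp=0.5):
    # Layer sizes for a sieve network indexed by sample size n:
    # width and depth both grow as n grows.
    width = max(1, math.floor(n ** width_exp))
    depth = max(1, math.floor(math.log(n)))
    return [width] * depth

def relu_forward(x, weights, biases):
    # Fully connected feedforward pass with ReLU activations,
    # the architecture class the paper's conditions cover.
    h = x
    for W, b in zip(weights, biases):
        h = [max(0.0, sum(wi * hi for wi, hi in zip(row, h)) + bi)
             for row, bi in zip(W, b)]
    return h
```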

Keywords: sieve extremum estimates, nonparametric estimation, deep learning, neural networks, rectified linear unit, nonstationary processes

Procedia PDF Downloads 30
24101 Development of Risk Management System for Urban Railroad Underground Structures and Surrounding Ground

Authors: Y. K. Park, B. K. Kim, J. W. Lee, S. J. Lee

Abstract:

To assess the risk of underground structures and the surrounding ground, we collect basic data through engineering measurement, exploration, and surveys, and derive the risk through appropriate analysis and assessment for urban railroad underground structures and the surrounding ground, including station inflow. Basic data are obtained from fiber-optic sensors, MEMS sensors, water quantity/quality sensors, a tunnel scanner, ground-penetrating radar, and a lightweight deflectometer, and are evaluated against proper threshold values. Based on these data, we analyze the risk level of urban railroad underground structures and the surrounding ground. We then develop a risk management system to manage these data efficiently and to provide a convenient interface for data input and output.

Keywords: urban railroad, underground structures, ground subsidence, station inflow, risk

Procedia PDF Downloads 329
24100 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

An intelligent transportation system is essential to build smarter cities. Machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects of the system visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions; thus, bus delays often occur. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is built on real-time bus data. The data are gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model was designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
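
The flexible headway idea above amounts to: predict a delay from current conditions, then stretch the scheduled headway accordingly. A toy sketch (the study trained its model in RapidMiner Studio; the linear form and weights here are illustrative stand-ins, not the learned model):

```python
def predict_delay(features, weights, bias=0.0):
    # Linear stand-in for the learned delay model: features such as
    # road speed, rain intensity, and passengers at the station.
    return sum(w * f for w, f in zip(weights, features)) + bias

def flexible_headway(scheduled, predicted_delay):
    # Stretch the scheduled headway by the predicted delay so the
    # timetable adapts to conditions instead of staying fixed.
    return scheduled + max(0.0, predicted_delay)
```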

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 251
24099 Integrated Model for Enhancing Data Security Performance in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. Cloud computing allows sharing resources, services, and information among people across the whole world. Although the advantages of using clouds are great, there are many risks in a cloud; data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure secure communication, hide information from other users, and save users' time. In this proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a username and password are used; the password is protected with a SHA-2 one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
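
The two SHA-2 uses in the model (file integrity and one-way password protection) can be sketched with the standard library; the Blowfish encryption of the file contents themselves would come from a separate crypto library and is omitted here:

```python
import hashlib

def sha2_digest(data: bytes) -> str:
    # SHA-2 (here SHA-256) digest, used in the model for checking
    # file integrity after upload/download.
    return hashlib.sha256(data).hexdigest()

def verify_password(stored_hash: str, attempt: str) -> bool:
    # One-way SHA-2 check of a login attempt against the stored hash.
    return hashlib.sha256(attempt.encode()).hexdigest() == stored_hash
```

A production deployment would also salt the password hash; the abstract does not mention salting, so it is left out of this sketch.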

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 470
24098 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, in the field of sports, decision making such as player selection and game strategy based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques to win games. However, it is difficult to analyze per-play game data, such as ball tracking or the motion of players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, which is considered difficult for all coaches. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether the lineup should be changed and whether a Small Ball lineup should be adopted. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, we can accumulate scoring data for each play, which indicates a player's contribution to the game, and this scoring data can be treated as a time series. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, and an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model. This model can identify the current optimal lineup for different situations. In this research, we collected NBA play data accumulated over the 2019-2020 season. We then apply the method to actual basketball play data to verify the reliability of the proposed model.
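
The RNN-plus-NN combination described can be sketched at its smallest scale: fold the per-play scoring sequence into a recurrent hidden state, then fuse it with the static situation/lineup features. Scalar weights keep the sketch readable; the paper's model would use full weight matrices:

```python
import math

def rnn_nn_score(play_seq, situation, w_in, w_rec, w_sit, w_out):
    # RNN branch: fold the per-play scoring sequence into a
    # scalar hidden state.
    h = 0.0
    for x in play_seq:
        h = math.tanh(w_in * x + w_rec * h)
    # NN branch: linear read-out of the current situation/lineup
    # features, fused with the sequence summary into one score.
    ctx = sum(w * f for w, f in zip(w_sit, situation))
    return w_out * h + ctx
```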

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 126
24097 Large-Scale Screening for Membrane Protein Interactions Involved in Platelet-Monocyte Interactions

Authors: Yi Sun, George Ed Rainger, Steve P. Watson

Abstract:

Background: Beyond their classical roles in haemostasis and thrombosis, platelets are important in the initiation and development of various thrombo-inflammatory diseases. In atherosclerosis and deep vein thrombosis, for example, platelets bridge monocytes with the endothelium and form heterotypic aggregates with monocytes in the circulation. This can alter the monocyte phenotype by inducing activation and stimulating adhesion and migration. These interactions involve cell surface receptor-ligand pairs on both cells. The list is likely incomplete, as new interactions of importance to platelet biology continue to be discovered, as illustrated by our discovery of PEAR-1 binding to FcεR1α. Results: We have developed a highly sensitive avidity-based assay to identify novel extracellular interactions among 126 recombinantly expressed platelet cell surface and secreted proteins involved in platelet aggregation. In this study, we will use this method to identify novel platelet-monocyte interactions. We aim to identify ligands for orphan receptors and novel partners of well-known proteins. Identified interactions will be studied in preliminary functional assays to demonstrate relevance to the inflammatory processes supporting atherogenesis. Conclusions: Platelet-monocyte interactions are essential for the development of thrombo-inflammatory disease. Until relatively recently, available technologies limited us to studying individual protein interactions one at a time. These studies propose, for the first time, to examine cell surface platelet-monocyte interactions in a systematic, large-scale approach using a reliable screening method we have developed. If successful, this is likely to identify previously unknown ligands for important receptors, which will be investigated in detail, and also to provide a list of novel interactions for the field. This should stimulate studies on developing alternative therapeutic strategies to treat vascular inflammatory disorders such as atherosclerosis, DVT, sepsis, and other clinically important inflammatory conditions.

Keywords: membrane proteins, large-scale screening, platelets, recombinant expression

Procedia PDF Downloads 148
24096 Challenges in Multi-Cloud Storage Systems for Mobile Devices

Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta

Abstract:

The demand for cloud storage is increasing because users want continuous access to their data. Cloud storage has revolutionized the way users access their data. Many cloud storage service providers are available, such as Dropbox and Google Drive, offering limited free storage; for extra storage, users have to pay, which becomes a burden on users. To avoid the issue of limited free storage, the concept of multi-cloud storage was introduced. In this paper, we discuss the limitations of existing multi-cloud storage systems for mobile devices.

Keywords: cloud storage, data privacy, data security, multi cloud storage, mobile devices

Procedia PDF Downloads 691
24095 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches

Authors: Wuttigrai Ngamsirijit

Abstract:

Talent management in today's modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to make effective talent management decisions. These include process-based data and records; insufficient human-capital-related measures and metrics; a lack of capability for strategic data modeling; and the time consumed in adding up numbers and making decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics for strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, the gaps in managing talent and the organization, and the ways to develop optimized talent strategies.

Keywords: decision making, human capital analytics, talent management, talent value chain

Procedia PDF Downloads 177
24094 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely used in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships across all clusters equals one. This constraint can cause several problems, especially when the data lie in a noisy space. Regularization has been applied to the fuzzy c-means technique: it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because appropriate membership degrees lead to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
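
Entropy-regularized fuzzy clustering admits a closed-form membership update, u_k ∝ exp(-d_k/λ), where d_k is the squared distance to center k and λ weights the entropy term; the paper's exact objective may differ in detail, so this is a sketch of the standard form rather than the authors' formulation:

```python
import math

def entropy_memberships(sq_dists, lam):
    # Closed-form update under relative-entropy regularization:
    # u_k proportional to exp(-d_k / lam), normalized so one data
    # object's memberships sum to one (the FCM constraint).
    w = [math.exp(-d / lam) for d in sq_dists]
    s = sum(w)
    return [v / s for v in w]

def update_centers(points, U):
    # Centers move to the membership-weighted mean of the data
    # (1-D points for brevity; the vector case is identical
    # coordinate-wise). U[i][j] = membership of point i in cluster j.
    k = len(U[0])
    return [sum(u[j] * x for u, x in zip(U, points)) /
            sum(u[j] for u in U) for j in range(k)]
```

Alternating these two steps until the centers stop moving gives the clustering.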

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 257
24093 Formulation and Evaluation of Solid Dispersion of an Anti-Epileptic Drug Carbamazepine

Authors: Sharmin Akhter, M. Salahuddin, Sukalyan Kumar Kundu, Mohammad Fahim Kadir

Abstract:

Relatively insoluble candidate drugs like carbamazepine (CBZ) often exhibit incomplete or erratic absorption; hence, wide consideration is given to improving the aqueous solubility of such compounds. Solid dispersions were formulated with the aim of improving the aqueous solubility, oral bioavailability, and dissolution rate of carbamazepine using different hydrophilic polymers: polyethylene glycol (PEG) 6000, polyethylene glycol (PEG) 4000, Kollidon 30, HPMC 6 cps, Poloxamer 407, and Povidone K30. Solid dispersions were prepared with different drug-to-polymer weight ratios by the solvent evaporation method, with methanol as the solvent. Drug-polymer physical mixtures were also prepared to compare the dissolution rates. The effects of the different polymers were studied for the solid dispersion formulations as well as the physical mixtures. These formulations were characterized in the solid state by Fourier transform infrared (FTIR) spectroscopy and scanning electron microscopy (SEM). Solid-state characterization indicated that CBZ was present as fine particles entrapped in the carrier matrix of the PEG 6000 and PVP K30 solid dispersions. FTIR spectroscopic studies showed the stability of CBZ and the absence of well-defined drug-polymer interactions. In contrast to the very slow dissolution rate of pure CBZ, dispersions of the drug in the polymers considerably improved the dissolution rate. This can be attributed to increased wettability and dispersibility, as well as decreased crystallinity and an increased amorphous fraction of the drug. Solid dispersion formulations containing PEG 6000 and Povidone K30 showed maximum drug release within one hour at a 1:1:1 ratio. Even the physical mixtures of CBZ prepared with both carriers showed better dissolution profiles than pure CBZ. In conclusion, solid dispersions could be a promising delivery system for CBZ, with improved oral bioavailability and immediate release profiles.

Keywords: carbamazepine, FTIR, Kollidon 30, HPMC 6 cps, PEG 6000, PEG 4000, Poloxamer 407, water solubility, Povidone K30, SEM, solid dispersion

Procedia PDF Downloads 289
24092 Sampled-Data Model Predictive Tracking Control for Mobile Robot

Authors: Wookyong Kwon, Sangmoon Lee

Abstract:

In this paper, a sampled-data model predictive tracking control method is presented for mobile robots modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality (LMI) approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.

Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV

Procedia PDF Downloads 305
24091 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data

Authors: Nasser A. Al-Azri

Abstract:

The effectiveness of passive cooling techniques is assessed using bioclimatic charts, which require the typical meteorological year (TMY) of a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, an essential component in developing common TMYs intended for general use. Since solar radiation is not required in the development of bioclimatic charts, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes TMY development more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
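
TMY construction in the literature typically selects, for each calendar month, the candidate year whose distribution of a weather parameter is closest to the long-term distribution, measured by the Finkelstein-Schafer (FS) statistic. The abstract does not spell out the selection procedure, so this sketch assumes the standard FS approach restricted to the passive-cooling-relevant parameters:

```python
def fs_statistic(month_vals, longterm_vals):
    # Finkelstein-Schafer statistic: mean absolute difference
    # between the candidate month's empirical CDF and the
    # long-term CDF, evaluated at the long-term observations.
    def ecdf(sorted_xs, x):
        return sum(1 for v in sorted_xs if v <= x) / len(sorted_xs)
    ms, ls = sorted(month_vals), sorted(longterm_vals)
    return sum(abs(ecdf(ms, x) - ecdf(ls, x)) for x in ls) / len(ls)

def pick_typical(candidates, longterm_vals):
    # Choose the candidate year whose month is distributionally
    # closest to the long-term record (smallest FS value).
    return min(candidates,
               key=lambda m: fs_statistic(candidates[m], longterm_vals))
```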

Keywords: bioclimatic charts, passive cooling, TMY, weather data

Procedia PDF Downloads 236
24090 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach

Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon

Abstract:

Defensive modeling and simulation (M&S) is a system that enables otherwise impracticable training by reducing the constraints of time, space, and financial resources. The necessity of defensive M&S has been increasing, not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not been managed and utilized, owing to the nature of military organizations: confidentiality and frequent turnover of members. Therefore, this study aims to develop a management system for the experience of defensive M&S based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S is both quantitative and qualitative, a data mining approach is appropriate for dealing with it. This research is expected to be helpful for soldiers and military policy makers.

Keywords: data mining, defensive M&S, management system, knowledge management

Procedia PDF Downloads 244
24089 Phytoplankton Diversity and Abundance in Burullus Lagoon, Southern Mediterranean Coast, Egypt

Authors: Shymaa S. Zaher, Hesham M. Abd El-Fatah, Dina M. Ali

Abstract:

Burullus Lagoon is Egypt's second largest lake along the Mediterranean seashore. It is exposed to nutrient over-enrichment from fish farming and agricultural drainage wastes. This study assesses the present status of the phytoplankton response to different flow events, including domestic, agricultural, industrial, and fish farm discharges in the three main sectors of Burullus Lagoon, focusing on the influence of environmental variables on the phytoplankton species composition inhabiting the lagoon. Twelve sites representing the eastern, central, and western basins were sampled during winter and summer 2018. Among the groups, Chlorophyceae ranked first at 37.9% of the total phytoplankton density, followed by Bacillariophyceae (29.31%), Cyanophyceae (20.7%), Euglenophyceae (8.63%), and Dinophyceae (3.4%). Cyclotella meneghiniana was the most abundant diatom, while Scenedesmus quadricauda, S. acuminatus, and S. bijuga were highly recorded near the drains (in the middle sector). Phytoplankton in Burullus Lagoon attained the lowest values during winter and the highest during summer. The total phytoplankton count in the middle and western basins of the lake was higher than in the eastern part. Excessive use of chemical fertilizers and pesticides, and the washing out of nutrients loaded into the drainage water, leading to a pronounced year-on-year decrease in the community composition and standing crop of phytoplankton in Burullus Lake, hold the danger of shifting the lagoon ecosystem.

Keywords: Burullus Lagoon, environmental variables, phytoplankton, water pollution

Procedia PDF Downloads 113
24088 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes are quite important in order to maintain good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The qualitative method is effective in representing fault patterns of process data. In addition, it is insensitive to measurement noise, so reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in the detection and identification was tested with different simulation data. The use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 232
24087 Imputation of Urban Movement Patterns Using Big Data

Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson

Abstract:

Big data typically refers to consumer datasets revealing detailed heterogeneity in human behavior which, if harnessed appropriately, could revolutionize our understanding of collective phenomena in the physical world. Inadvertent missing values skew these datasets and compromise the validity of any analysis. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with the available big data, to plug the gaps and create the rich, comprehensive dataset required for subsequent analysis. Specifically, the emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and the drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (such as a timetable change or the impact of a new station), explain local phenomena outside the network (such as rail-heading), and capture other impacts of urban morphology. Our analysis also reveals that our new imputation data model provides for more equitable revenue sharing among the network operators who manage different parts of the integrated UK railways.

Keywords: big-data, micro-simulation, mobility, ticketing-data, commuters, transport, synthetic, population

Procedia PDF Downloads 224
24086 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms, and it is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes a specific three-level structure of data rights. This paper analyzes the cases Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN, and Imerman v Tchenguiz. It concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property will be beneficial to establishing a tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 26
24085 Establish Co-Culture System of Dehalococcoides and Sulfate-Reducing Bacteria to Generate Ferrous Sulfide for Reversing Sulfide-Inhibited Reductive Dechlorination

Authors: Po-Sheng Kuo, Che-Wei Lu, Ssu-Ching Chen

Abstract:

Chlorinated ethenes (CEs) constitute a predominant contaminant at polluted sites in Taiwan, particularly in groundwater rich in sulfate salts, which substantially impedes remediation efforts. The reduction of sulfate by sulfate-reducing bacteria (SRB) impairs the dechlorination efficiency of Dehalococcoides by generating hydrogen sulfide (H₂S), resulting in incomplete degradation of the chlorinated compounds and thereby in the failure of bioremediation. To elucidate the interactions between sulfate reduction and dechlorination, this study establishes a co-culture system of Dehalococcoides and SRB, overcoming H₂S inhibition through the synthesis of ferrous sulfide (FeS), which is commonly utilized in chemical remediation because of its high reduction potential. The study first demonstrates that the addition of ferrous chloride (FeCl₂) effectively removed the H₂S produced by SRB and enhanced the degradation of trichloroethylene to ethene, overcoming the inhibition caused by H₂S in high-sulfate environments. Among the ferrous dosages compared for the biogenic generation of FeS, efficiency was optimized by adding FeCl₂ at a ratio equal to the sulfate concentration in the environment; this dosage removed H₂S more effectively and yielded crystal particles roughly ten times smaller than those synthesized under excessive FeCl₂ dosages, mitigating clogging issues in in situ remediation. Finally, using Taiwan's indigenous dechlorinating consortium in a simulated high-sulfate contaminated environment, the microbial biodiversity was analyzed, revealing higher species richness within the FeS group, which is conducive to ecological stability. This study validates the potential of the co-culture system for generating biogenic FeS under sulfate and CE co-contamination, removing sulfate-reduction products, and improving CE remediation through integrated chemical and biological remediation.

Keywords: biogenic ferrous sulfide, chlorinated ethenes, Dehalococcoides, sulfate-reducing bacteria, sulfide inhibition

Procedia PDF Downloads 46
24084 The Influence of Housing Choice Vouchers on the Private Rental Market

Authors: Randy D. Colon

Abstract:

Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households have been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. The quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected in the Chicago Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders, and will be used to gain insight into their lived experience in relation to housing, the rental market, and the HCV program. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.
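The statistical comparison described above can be illustrated with a minimal pandas sketch. The column names and the toy figures below are assumptions for illustration only, not the actual CHA or HUD data.

```python
import pandas as pd

# Hypothetical stand-ins for the CHA voucher records and the market benchmark.
hcv = pd.DataFrame({
    "zip_code": ["60621", "60621", "60636"],
    "hcv_rent": [1050, 1300, 980],
})
market = pd.DataFrame({
    "zip_code": ["60621", "60636"],
    "market_rent": [1000, 950],
})

# Average voucher rent per ZIP, joined against the market benchmark.
summary = (
    hcv.groupby("zip_code", as_index=False)["hcv_rent"].mean()
       .merge(market, on="zip_code")
)
# A ratio above 1 would flag areas where voucher rents exceed the market rate.
summary["ratio"] = summary["hcv_rent"] / summary["market_rent"]
print(summary)
```

The same join could then be mapped by ZIP code to visualize where the two rent distributions diverge.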

Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market

Procedia PDF Downloads 110
24083 The Rupture of Tendon Achilles During the Recreative and Sports Activities

Authors: Jasmin S. Nurkovic, Ljubisa Dj. Jovasevic, Zana C. Dolicanin, Zoran S. Bajin

Abstract:

Ruptured muscles and tendons in young persons very often must be repaired by open operation. In the young, muscles rupture more often than tendons, while in older persons tendons are more prone to rupture than muscles. Ruptures of the Achilles (calcaneal) tendon are the most common of all tendon ruptures. Sometimes the rupture is complete, but an incomplete rupture can often be observed. Over six years, from 2006 to 2012, we treated nineteen male patients and three female patients with rupture of the Achilles tendon. The male patients were aged 32 to 64; the female patients were aged 41, 44, and 46. One of our patients, who was under corticosteroid treatment, did not take part in any sport activities but, as she told us, went for long walks; the same was true of two other patients, one man and one woman. Conservative treatment with a cast was applied in five patients, with very good results in three of them; in the two patients aged 53 and 64, this treatment failed. Only one of the patients treated surgically had healing problems, due to necrotic changes of the skin at the incision site. One of our female patients, aged 45, had been under steroid treatment for almost 20 years because of asthma. We advised her to wear boots with 8 cm heels, by day and by night, for eight weeks. The final result was satisfactory, and she remained able to work and walk throughout; this was the only case of bilateral tendon rupture we encountered. After eight weeks the cast was removed and rehabilitation started, with the patient using crutches with partial weight bearing over a period of two weeks. Conservative treatment otherwise followed much the same course, except that the cast was not removed after two weeks but after four.
Everyday activities resumed ten weeks after surgical treatment, and sport activities can resume after fourteen to sixteen weeks. Increased activity without prior conditioning for strenuous exertion can, as we have seen, result in tendon rupture. Treatment is long and very often surgical. We find surgical treatment to be the safer and better solution for patients. We also had a patient with a spontaneous rupture of the tendon during a longer walk; this patient was under prolonged corticosteroid treatment.

Keywords: tendon, Achilles, rupture, sport

Procedia PDF Downloads 239
24082 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
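The idea of a dynamic schema can be sketched in a few lines: a base schema shared by the facility is extended at runtime with experiment-specific fields. The field names below (`q_range_nm`, `n_frames`, etc.) are illustrative assumptions, not part of any facility standard.

```python
# Base metadata fields assumed to be common to all scattering experiments.
BASE_SCHEMA = {"sample_id": str, "detector": str, "exposure_s": float}

def extend_schema(base, extra_fields):
    """Return a new schema with experiment-specific fields added at runtime."""
    schema = dict(base)
    schema.update(extra_fields)
    return schema

def validate(record, schema):
    """Check that a metadata record has every required field with the right type."""
    missing = [k for k in schema if k not in record]
    wrong = [k for k, t in schema.items()
             if k in record and not isinstance(record[k], t)]
    return not missing and not wrong

# An XPCS run extends the base schema with correlation-specific metadata.
xpcs_schema = extend_schema(BASE_SCHEMA, {"q_range_nm": list, "n_frames": int})
record = {"sample_id": "S42", "detector": "EIGER", "exposure_s": 0.01,
          "q_range_nm": [0.1, 1.0], "n_frames": 5000}
print(validate(record, xpcs_schema))
```

In practice such a schema would live in a machine-readable format (e.g. JSON Schema) so that acquisition, sharing, and analysis tools can all validate against the same evolving definition.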

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 56
24081 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

Whether the data has been well parallelized is an important factor in Solid-State-Drive (SSD) performance. SSD parallelism is affected by the allocation scheme and is directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. It is therefore hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated a range of mixed data patterns and analyzed the results to guide the choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and in continuously read-intensive environments, while dynamic allocation performs best on write performance and random data patterns.
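The contrast between the two schemes can be sketched as follows. This is a deliberately simplified model (channel count, timing, and function names are assumptions), not the simulator used in the study.

```python
N_CHANNELS = 4

def static_channel(lpn):
    """Static allocation: the logical page number fixes the channel
    (address-based striping), so later reads of the same page are
    predictable and can be parallelized across a sequential range."""
    return lpn % N_CHANNELS

def dynamic_channel(busy_until, now):
    """Dynamic allocation: greedily pick the channel that frees up
    earliest, which maximizes write parallelism under bursty traffic."""
    return min(range(N_CHANNELS), key=lambda ch: max(busy_until[ch], now))

# Static: logical page 6 always maps to channel 2, so a read knows where to go.
print(static_channel(6))

# Dynamic: a write arriving at time 0 lands on the idle channel (index 1).
busy_until = [5, 0, 3, 7]   # per-channel busy horizon
print(dynamic_channel(busy_until, now=0))
```

The trade-off in the abstract falls out of this sketch: static mapping gives reads a known location, while dynamic mapping keeps write queues short at the cost of scattering pages unpredictably.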

Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation

Procedia PDF Downloads 333
24080 Comparative Ante-Mortem Studies through Electrochemical Impedance Spectroscopy, Differential Voltage Analysis and Incremental Capacity Analysis on Lithium Ion Batteries

Authors: Ana Maria Igual-Munoz, Juan Gilabert, Marta Garcia, Alfredo Quijano-Lopez

Abstract:

Nowadays, several lithium-ion battery technologies are commercialized. These chemistries present different properties that make them more suitable for different purposes; however, comparative studies showing the advantages and disadvantages of the different chemistries are incomplete or scarce. Several non-destructive techniques are currently employed to detect how ageing affects the active materials of lithium-ion batteries (LIBs). Electrochemical impedance spectroscopy (EIS), one of the most widely employed, allows the user to identify variations in the different resistances present in LIBs. Differential voltage analysis (DVA) has shown itself to be a powerful technique for detecting the processes affecting the capacities present in LIBs; it reveals variations in the state of health (SOH) and in the capacities of one or both electrodes, depending on their chemistry. Finally, incremental capacity analysis (ICA) is widely known for its capability to detect phase equilibria; it is reminiscent of the commonly used cyclic voltammetry, as it allows some of the reactions taking place in the electrodes to be detected. In this study, a set of ageing procedures has been applied to commercial batteries of different chemistries (NCA, NMC, and LFP). Afterwards, the EIS, DVA, and ICA results have been correlated with the processes affecting each cell. Cyclability, overpotential, and temperature-cycling studies show how the charge-discharge rates, cut-off voltage, and operating temperatures affect each chemistry. These studies will serve battery-pack manufacturers as well as common battery users, as they determine the different conditions affecting cells of each chemistry. With this information, each cell could be matched to the final purpose of the battery application.
Finally, all the observed degradation parameters are intended to be integrated into degradation models in the future, which will allow the widely known digital-twin concept to be applied to degradation in LIBs.
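The DVA and ICA curves mentioned above are simple derivatives of the charge profile, which can be sketched numerically as follows. The voltage curve here is synthetic (an assumed monotonic ramp with a mild plateau-like ripple), standing in for measured charge data.

```python
import numpy as np

# Synthetic charge profile: capacity in Ah, voltage in V.
capacity = np.linspace(0.0, 2.0, 201)
voltage = 3.0 + 0.5 * capacity + 0.02 * np.sin(4 * np.pi * capacity)

# DVA curve: dV/dQ plotted against capacity.
dV_dQ = np.gradient(voltage, capacity)
# ICA curve: dQ/dV plotted against voltage (valid where dV/dQ != 0).
dQ_dV = 1.0 / dV_dQ

# Peaks in dQ/dV correspond to plateaus in the voltage curve, i.e. phase
# equilibria; how these peaks shift and shrink over cycling is the ageing
# signature tracked by ICA.
peak_index = np.argmax(dQ_dV)
print(f"largest ICA peak at V = {voltage[peak_index]:.3f} V")
```

On real data the derivative is usually smoothed (e.g. by filtering or fitting) before differentiation, since measurement noise is amplified by d/dQ.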

Keywords: lithium ion batteries, non-destructive analysis, different chemistries, ante-mortem studies, ICA, DVA, EIS

Procedia PDF Downloads 123
24079 Social Data Aggregator and Locator of Knowledge (STALK)

Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat

Abstract:

Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not merely provide a consolidated view of various profiles; instead, it provides consolidated information derived from the subject's posts and other activities, and it supports analysis and analytics across multiple profiles. We also aim to provide a query system that returns a natural-language answer to questions when a user does not wish to go through an entire profile. The information provided can be filtered according to the different use cases it serves.

Keywords: social network, analysis, Facebook, LinkedIn, git, big data

Procedia PDF Downloads 435
24078 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries: achieving data integrity is essential in order to provide significant support for all stakeholders in the educational sector. Efficient data collection, flow, processing, storage, and retrieval are vital to delivering successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented a Student Information System (SIS) to manage and monitor students' data and the information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools, clarifies the data integrity issues, and indicates the challenges that face private schools in the UAE.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), (KHDA) the knowledge and human development authority, Abu Dhabi educational counsel (ADEC)

Procedia PDF Downloads 217
24077 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm

Authors: Kamel Belammi, Houria Fatrim

Abstract:

Imbalanced data sets, a problem often found in real-world applications, can have a seriously negative effect on the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, with too few samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, along with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN), and multilayer neural networks (MNN)) to the balanced data set. We have also compared the results obtained before and after the balancing method.
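The core of a cost-sensitive LMS update can be sketched as follows: each sample's error is scaled by a per-sample cost before the weight update, so that mistakes on rare-class samples move the model more. The inverse-frequency weighting and the toy regression data below are illustrative assumptions, not the exact rules of thumb proposed in the paper.

```python
import numpy as np

def weighted_lms(X, y, sample_costs, lr=0.001, epochs=200):
    """LMS (stochastic gradient on squared error) where each sample's
    update is scaled by its cost: rare-class errors are penalized more."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi, ci in zip(X, y, sample_costs):
            err = yi - xi @ w
            w += lr * ci * err * xi   # cost ci scales the LMS step
    return w

# Noise-free toy problem so the recovered weights are easy to check.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

# Inverse-frequency style costs: pretend the first 10 samples are rare.
costs = np.where(np.arange(100) < 10, 10.0, 1.0)

w = weighted_lms(X, y, costs)
print(w)  # should approach true_w on this noise-free problem
```

After such a cost-aware pass (or an equivalent resampling), a downstream classifier such as SVM, KNN, or an MNN can be trained on the rebalanced data, as the abstract describes.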

Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes

Procedia PDF Downloads 526
24076 Introduction of Digital Radiology to Improve the Timeliness in Availability of Radiological Diagnostic Images for Trauma Care

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In an emergency department, where every second counts for patient management, the timely availability of X-rays plays a vital role in the early diagnosis and management of patients. Trauma care centers rely heavily on timely radiologic imaging for patient care, and radiology plays a crucial role in emergency department (ED) operations. A research study was carried out to assess the timeliness of X-ray availability and the total turnaround time at the Accident Service of the National Hospital of Sri Lanka, the premier trauma center in the country. A digital radiology (DR) system was implemented as an intervention to improve the timeliness of X-ray availability, and a post-implementation assessment was carried out to evaluate its effectiveness. Reductions in all three aspects of waiting time, namely waiting for initial examination by doctors, waiting until the X-ray is performed, and waiting for image availability, were observed after implementation of the intervention. The most significant improvement was seen in waiting time for image availability, and this reduction had an indirect impact on reducing the waiting time for initial examination by doctors and the waiting time until the X-ray is performed. The most significant reduction in time for image availability was observed when performing 4-5 X-rays with the DR system. The least improvement in timeliness was seen in patients categorized as critical.

Keywords: emergency department, digital radiology, timeliness, trauma care

Procedia PDF Downloads 259