Search results for: data loss
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27829


26029 Development of Risk Management System for Urban Railroad Underground Structures and Surrounding Ground

Authors: Y. K. Park, B. K. Kim, J. W. Lee, S. J. Lee

Abstract:

To assess the risk of urban railroad underground structures and the surrounding ground, including station inflow, we collect basic data through engineering measurements, exploration, and surveys, and derive risk levels through appropriate analysis and assessment. Basic data are obtained from fiber-optic sensors, MEMS sensors, water quantity/quality sensors, a tunnel scanner, ground-penetrating radar, and a light-weight deflectometer, and are checked against threshold values. Based on these data, we analyze the risk level of the underground structures and surrounding ground. We then develop a risk management system that manages these data efficiently and provides a convenient interface for data input and output.

Keywords: urban railroad, underground structures, ground subsidence, station inflow, risk

Procedia PDF Downloads 339
26028 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential to building smarter cities, and machine-learning-based transportation prediction is a highly promising approach because it makes invisible patterns visible. In this context, this research aims to build a prototype model that predicts conditions in a transportation network using big data and machine learning; among urban transportation systems, it focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic traffic conditions, so bus delays occur frequently. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built on real-time bus data gathered through public data portals and real-time government Application Programming Interfaces (APIs). These data are the fundamental resources for organizing interval-pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information for each bus). The prototype was designed in a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments that increase the prediction accuracy of bus headway by analyzing urban big data; big data analysis is important for predicting the future and finding correlations across huge amounts of data. Based on this analysis method, the research therefore demonstrates an effective use of machine learning and urban big data to understand urban dynamics.
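As a rough sketch of the idea only (the study itself builds its model in RapidMiner Studio; the feature names, coefficients, and data below are invented for illustration), a bus-delay predictor can be fitted by least squares over traffic, weather, and bus-status features:

```python
import numpy as np

# Illustrative only: a linear bus-delay model fitted by least squares.
# Features (road speed, weather severity, current lateness) and the
# synthetic "true" coefficients are invented for this sketch.
rng = np.random.default_rng(0)
X = rng.uniform([10, 0, 0], [60, 1, 15], size=(200, 3))
true_w = np.array([-0.1, 5.0, 0.8])             # slow roads, bad weather -> delay
y = X @ true_w + 8.0 + rng.normal(0, 0.5, 200)  # baseline delay + noise

A = np.hstack([X, np.ones((200, 1))])           # add an intercept column
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_delay(speed_kmh, weather_idx, lateness_min):
    """Predicted delay in minutes for one upcoming headway."""
    return float(np.array([speed_kmh, weather_idx, lateness_min, 1.0]) @ w)

delay = predict_delay(30.0, 0.8, 5.0)
```

A real system would replace this linear fit with the richer learners the paper tests, but the feature-to-delay framing is the same.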

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 264
26027 Integrated Model for Enhancing Data Security Performance in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing has been an important and promising field over the recent decade. It allows sharing resources, services, and information among people across the whole world. Although the advantages of using clouds are great, a cloud also carries many risks, and data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure secure communication, hide information from other users, and save users' time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a username and password are used, with the password protected by SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files to and from the cloud in secure form.
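Two SHA-2 pieces of the proposed model can be sketched with the Python standard library alone: a digest for data integrity and a salted one-way hash for stored passwords. The Blowfish exchange step needs a third-party crypto library and is omitted; the function names below are illustrative, not from the paper.

```python
import hashlib
import hmac
import os

# Sketch of the SHA-2 parts of the model; names are illustrative.

def file_digest(data: bytes) -> str:
    """SHA-256 digest for verifying integrity after upload/download."""
    return hashlib.sha256(data).hexdigest()

def hash_password(password: str, salt: bytes = b"") -> tuple:
    """Salted one-way SHA-2 hash for stored credentials."""
    salt = salt or os.urandom(16)
    return salt, hashlib.sha256(salt + password.encode()).hexdigest()

def verify_password(password: str, salt: bytes, expected: str) -> bool:
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(hash_password(password, salt)[1], expected)

payload = b"report.pdf contents"
digest_ok = file_digest(payload) == file_digest(payload)  # unchanged data verifies
salt, stored = hash_password("s3cret")
```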

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 482
26026 Numerical Investigation of Effect of Throat Design on the Performance of a Rectangular Ramjet Intake

Authors: Subrat Partha Sarathi Pattnaik, Rajan N.K.S.

Abstract:

Integrated rocket-ramjet engines are highly suitable for long-range missile applications. Designing fixed-geometry intakes for such missiles that operate efficiently over a range of conditions is a highly challenging task. Hence, the present study evaluates the effect of throat design on the performance of a rectangular mixed-compression intake over the Mach number range 1.8-2.5. The analysis has been carried out at four Mach numbers (1.8, 2.0, 2.2, and 2.5) and two angles of attack (+5 and +10 degrees). Three throat heights have been considered: one corresponding to a three-external-shock design and two corresponding to a two-external-shock design, leading to different internal contraction ratios. The on-design Mach number for the study is M = 2.2. To obtain the viscous flow field in the intake, the theoretical designs were analyzed with computational fluid dynamics, solving the Favre-averaged Navier-Stokes (FANS) equations with the two-equation SST k-ω turbulence model. The analysis shows that, at zero angle of attack, for on-design and high off-design Mach numbers the three-ramp design gives higher total pressure recovery (TPR) than the two-ramp design at both contraction ratios while maintaining the same mass flow ratio (MFR). At low off-design Mach numbers the trend reverses: total pressure recovery is highest for the two-ramp, low-contraction-ratio design because of lower shock losses across the external shocks; similarly, the MFR is higher for the low-contraction-ratio design because the external ramp shocks move closer to the cowl. At both angle-of-attack conditions, over the complete Mach number range, the total pressure recovery and mass flow ratio are highest for the two-ramp, low-contraction design owing to lower stagnation pressure loss across the detached bow shock formed at the ramp and lower mass spillage. Hence, the low-contraction design is found to be suitable for better off-design performance.

Keywords: internal contraction ratio, mass flow ratio, mixed compression intake, performance, supersonic flows

Procedia PDF Downloads 111
26025 Stemming the Decline of Cultural Festivals as a Way of Preserving the Nigerian Cultural Heritage: A Case Study of Kuteb and Idoma Cultural Festivals

Authors: Inalegwu Stephany Akipu

Abstract:

A cultural festival is characterized by feasting and celebration during a day or period set aside solely for that purpose. Often expressed through an organized series of acts and performances, it forms a very important part of man's cultural heritage. Nigeria is a country with many ethnic groups and diverse languages. Each of these ethnic groups has a plethora of festivals that depict its culture, exhibited in many forms ranging from dancing to feasting and celebration. Being such an important aspect of life, it is pertinent to document and optimally harness these festivals. However, there has been a significant decline of these practices in some areas of Nigeria, while other areas have registered a total loss of them. The aim of this paper, therefore, is to appraise the factors responsible for this decline and to project ways of resuscitating these festivals, which are viable tools for revenue generation through tourism. Not only do festivals serve as a source of revenue, they also aid national integration, which in turn enhances sustainable development. The paper focuses on the Kuteb people of Taraba State and the Idoma people of Benue State. The methodology draws on primary (oral interviews) and secondary (written records on the subject matter) sources of data. It concludes by comparing the approaches used by ethnic groups in Nigeria that have successfully preserved this aspect of their culture, and suggestions are made as to how the same approaches could be applied to the two communities that form the subject of this paper.

Keywords: festival, cultural heritage, Nigeria, national integration, sustainable development

Procedia PDF Downloads 295
26024 Applicability of Overhangs for Energy Saving in Existing High-Rise Housing in Different Climates

Authors: Qiong He, S. Thomas Ng

Abstract:

Upgrading the thermal performance of the building envelope of existing residential buildings is an effective way to reduce heat gain or heat loss. An overhang is a common envelope improvement, as it cuts down solar heat gain and thereby reduces the energy used for space cooling in summer. Despite that, an overhang can increase the demand for indoor heating in winter, for the same reason that it lowers solar heat gain. Overhangs clearly have different impacts on energy use in climatic zones with different energy demands. To evaluate the impact of overhang devices on building energy performance under the different climates of China, an energy analysis model was built in the computer-based simulation program DesignBuilder, based on data from a typical high-rise residential building. The energy simulation results show that, in regions that rely predominantly on space cooling, a single overhang can cut down around 5% of the energy consumption of the case building when it stands alone, or about 2% when it is surrounded by other buildings, though it contributes no energy reduction in the cold region. In regions with hot summers and cold winters, adding overhangs over windows can cut down around 4% and 1.8% of energy use with and without adjoining buildings, respectively. The results indicate that an overhang may not be an effective shading device for reducing energy consumption in mixed or cold climates.

Keywords: overhang, energy analysis, computer-based simulation, DesignBuilder, high-rise residential building, climate, BIM model

Procedia PDF Downloads 371
26023 The Impact of Coffee Consumption on Body Mass Index and Body Composition

Authors: A.L. Tamm, N. Šott, J. Jürimäe, E. Lätt, A. Orav, Ü. Parm

Abstract:

Coffee is one of the most frequently consumed beverages in the world, but its effects on the human organism are still not completely understood. Coffee has also been used as a method for weight loss, but its effectiveness has not been proved. There is also no consensus on whether overweight should be classified by body mass index (BMI) or by body fat percentage (fat%). The aim of the study was to determine associations between coffee consumption and body composition and, secondly, to assess which measure (BMI or fat%) is more accurate for describing overweight. Altogether, 103 persons enrolled in the study and were divided into three groups: coffee non-consumers (n=39), average coffee drinkers who consumed 1 to 4 cups (1 cup = ca 200 ml) of coffee per day (n=40), and excessive coffee consumers who drank at least five cups of coffee per day (n=24). Body mass (medical electronic scale, A&D Instruments, Abingdon, UK) and height (Martin metal anthropometer, to the nearest 0.1 cm) were measured and BMI calculated (kg/m²). Participants' body composition was measured with dual-energy X-ray absorptiometry (DXA, Hologic), and general data (including history of chronic diseases), information about coffee consumption, and physical activity level were collected with questionnaires. The results showed that excessive coffee consumption was associated with increased fat-free mass, which could foremost be due to a greater physical activity level during school time or the greater (though not significant) proportion of males in the excessive-consumers group. For estimating overweight, fat% is recommended over BMI, as it gives more accurate results when evaluating chronic disease risks. In conclusion, coffee consumption probably does not affect body composition, and for estimating body composition, fat% seems to be more accurate than BMI.
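The two overweight measures the study compares can be made concrete in a few lines. The BMI cut-off of 25 kg/m² is the standard WHO overweight threshold; the fat% cut-offs below (25% for men, 32% for women) are one common convention and vary between references, so treat them as illustrative:

```python
# Illustrative comparison of the two measures discussed. BMI cut-off 25
# is the WHO overweight threshold; the fat% cut-offs (25% men / 32%
# women) are one common convention and vary between references.

def bmi(mass_kg: float, height_m: float) -> float:
    return mass_kg / height_m ** 2

def overweight_by_bmi(mass_kg: float, height_m: float) -> bool:
    return bmi(mass_kg, height_m) >= 25.0

def overweight_by_fat(fat_pct: float, male: bool) -> bool:
    return fat_pct >= (25.0 if male else 32.0)

# A "normal-weight obese" case: BMI looks fine while fat% is high,
# which is why fat% is preferred for disease-risk estimates.
b = bmi(68.0, 1.70)   # about 23.5 kg/m^2, below the BMI cut-off
```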

Keywords: body composition, body fat percentage, body mass index, coffee consumption

Procedia PDF Downloads 423
26022 Lineup Optimization Model of Basketball Players Based on the Prediction of Recurrent Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, in the field of sports, decision making such as game-time player selection and game strategy based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data, such as ball tracking or player motion, because the situation of the game changes rapidly and the structure of the data is complicated. An analysis method for real-time play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition from real-time play data, a task that is difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions for substitution are whether the lineup should be changed and whether a Small Ball lineup should be adopted; we therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and these scoring data can be treated as a time series. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model. This model can identify the current optimal lineup for different situations. In this research, we collected all accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
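A minimal numpy sketch of the architecture described, not the authors' trained model, might combine a recurrent cell over the per-play scoring time series with static situation features feeding a readout layer. All sizes, weights, and feature meanings below are illustrative:

```python
import numpy as np

# Minimal sketch of the RNN + NN combination (sizes and random weights
# are illustrative; this is not the authors' trained model).
rng = np.random.default_rng(1)
H, F = 8, 4                                  # hidden size, situation features
Wx = rng.normal(0, 0.1, (H, 1))              # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))              # hidden-to-hidden weights
Wo = rng.normal(0, 0.1, (1, H + F))          # readout over [state, situation]

def predict_score(score_seq, situation):
    h = np.zeros((H, 1))
    for s in score_seq:                      # RNN over per-play scoring data
        h = np.tanh(Wx * s + Wh @ h)
    z = np.vstack([h, np.asarray(situation, float).reshape(-1, 1)])
    return float(Wo @ z)                     # feed-forward score prediction

# Compare two hypothetical lineups under the same game situation
situation = [1.0, 0.0, 0.5, 0.2]             # e.g. score gap, period, ...
lineup_a = predict_score([2, 0, 3, 2], situation)
lineup_b = predict_score([0, 0, 1, 0], situation)
```

With trained weights, the lineup whose predicted next-interval score is higher would be the one the model recommends.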

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 136
26021 Identification of Potential Small Molecule Regulators of PERK Kinase

Authors: Ireneusz Majsterek, Dariusz Pytel, J. Alan Diehl

Abstract:

PKR-like ER kinase (PERK) is a serine/threonine endoplasmic reticulum (ER) transmembrane kinase activated during ER stress. PERK can activate the signaling pathways known as the unfolded protein response (UPR). Attenuation of translation is mediated by PERK via phosphorylation of eukaryotic initiation factor 2α (eIF2α), which is necessary for translation initiation. PERK activation also directly contributes to activation of Nrf2, which regulates the expression of anti-oxidant enzymes. Increased phosphorylation of eIF2α has been reported in the hippocampus of Alzheimer's disease (AD) patients, indicating that PERK is activated in this disease. Recent data have revealed activation of PERK signaling in non-Hodgkin's lymphomas, and results have also shown that loss of PERK limits mammary tumor cell growth in vitro and in vivo. Consistent with these observations, activation of the UPR in vitro increases levels of the amyloid precursor protein (APP), the protein from which beta-amyloid (Aβ) fragments are derived. Finally, proteolytic processing of APP, including the cleavages that produce Aβ, largely occurs in the ER, coincident with the localization of PERK activity. Thus, we expect that PERK-dependent signaling is critical for the progression of many types of disease (human cancer, neurodegenerative disease, and others). Modulation of PERK activity may therefore be a useful therapeutic target in the treatment of diseases that fail to respond to traditional chemotherapeutic strategies, including Alzheimer's disease. Our goal is to develop therapeutic modalities targeting PERK activity.

Keywords: PERK kinase, small molecule inhibitor, neurodegenerative disease, Alzheimer’s disease

Procedia PDF Downloads 484
26020 Challenges in Multi-Cloud Storage Systems for Mobile Devices

Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta

Abstract:

The demand for cloud storage is increasing because users want continuous access to their data. Cloud storage has revolutionized the way users access their data. Many cloud storage service providers are available, such as Dropbox and Google Drive, which provide limited free storage; for extra storage, users have to pay, which becomes a burden. To avoid the issue of limited free storage, the concept of multi-cloud storage was introduced. In this paper, we discuss the limitations of existing multi-cloud storage systems for mobile devices.

Keywords: cloud storage, data privacy, data security, multi cloud storage, mobile devices

Procedia PDF Downloads 703
26019 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches

Authors: Wuttigrai Ngamsirijit

Abstract:

Talent management in today's modern organizations has become data-driven owing to the demand for objective human resource decision making and the development of analytics technologies. HR managers have faced several obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human-capital-related measures and metrics; a lack of capability in modeling data in strategic ways; and the time it takes to aggregate numbers and make decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics for strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, the gaps in managing talent and the organization, and the ways to develop optimized talent strategies.

Keywords: decision making, human capital analytics, talent management, talent value chain

Procedia PDF Downloads 191
26018 A 10 Year Review of the Complications of Ingested and Aspirated Dentures

Authors: Rory Brown, Jessica Daniels, Babatunde Oremule, William Tsang, Sadie Khwaja

Abstract:

Introduction: Dentures are common and are an intervention for both the physical and psychological symptoms associated with tooth loss. However, the humble denture can cause morbidity and mortality if swallowed or aspirated. Numerous case reports document complications including hollow viscus perforation, fistula formation, and airway compromise. The purpose of this review was to examine the literature documenting cases of swallowed or aspirated dentures over the past ten years and to investigate the factors that contribute to developing complications. Methods: A Medline literature search was performed to identify cases of denture ingestion or aspiration over ten years. Data were collected on patient, appliance, and temporal factors that may contribute to complications including hollow viscus perforation, fistula formation, abscess, bowel obstruction, necrosis, hemorrhage, and airway obstruction. The data were analyzed using observational and inferential statistics in the form of chi-squared and Pearson correlation tests. Results: Eighty-five cases of ingested or aspirated dentures were identified from 77 articles published between 1/10/2009 and 31/10/2019. Fourteen articles were excluded because they did not provide sufficient information on individual cases. Complications were documented in 37.6% of patients, and 2 cases resulted in death. There was no significant difference in complication risk based on patient age, hooked appliance, level of impaction, or radiolucency. However, symptoms lasting more than one day were associated with an increased risk of complications (p=0.005). Increased time from ingestion or aspiration to removal was associated with an increased risk of complications, and the p-value remained significant up to and including day 4 (p=0.017). Conclusions: With denture use predicted to rise, complications from denture ingestion and aspiration may become more frequent. We have demonstrated that increased symptom duration significantly increases the risk of developing complications. Additionally, we established that the risk of developing complications is significantly reduced if the denture is removed within four days of aspiration or ingestion. By intervening early when presented with a case of a swallowed or aspirated denture, we may be able to reduce the morbidity associated with this unassuming device.
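The chi-squared test behind the duration findings operates on a 2×2 table of counts. The sketch below uses invented counts, not the review's data, purely to show the computation:

```python
# Invented 2x2 counts (not the review's data) to show the chi-squared
# computation behind the symptom-duration result.
#              complication, no complication
table = [[5, 20],    # symptoms <= 1 day
         [27, 33]]   # symptoms  > 1 day

row = [sum(r) for r in table]                  # row totals
col = [sum(c) for c in zip(*table)]            # column totals
n = sum(row)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row[i] * col[j] / n         # expected count if independent
        chi2 += (table[i][j] - expected) ** 2 / expected

# 1 degree of freedom: chi2 > 3.841 means p < 0.05
significant = chi2 > 3.841
```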

Keywords: aspiration, denture, ingestion, endoscopic foreign body removal, foreign body impaction

Procedia PDF Downloads 142
26017 Enhancing Heavy Oil Recovery: Experimental Insights into Low Salinity Polymer in Sandstone Reservoirs

Authors: Intisar Khalifa, Salim Al Busaidi

Abstract:

Recently, the synergistic combination of low-salinity water flooding with polymer flooding has been a subject of great interest to the oil industry. Numerous studies have investigated the efficiency of enhanced oil recovery using low-salinity polymer flooding (LSPF). However, no clear conclusion has explained the incremental oil recovery, determined the main factors controlling the recovery process, or defined the relative contributions of rock/fluid and fluid/fluid interactions to the extra oil recovery. Therefore, this study performs a systematic investigation of the interactions between oil, polymer, low-salinity brine, and the sandstone rock surface from pore to core scale during LSPF. Partially hydrolyzed polyacrylamide (HPAM) polymer, Boise outcrop rock, a crude oil sample and reservoir cores from an Omani oil field, and brine at two different salinities were used in the study. Several experimental measurements were performed, including static bulk measurements of polymer solutions prepared with high- and low-salinity brines and single-phase displacement experiments, along with rheological, total organic carbon, and ion chromatography measurements to analyze ion exchange reactions, polymer adsorption, and viscosity loss. In addition, two-phase experiments were performed to demonstrate the oil recovery efficiency of LSPF. The results revealed that the incremental oil recovery from LSPF was attributable to a combination of a reduced water-oil mobility ratio, increased repulsion forces between the crude oil/brine/rock interfaces, and an increase in the pH of the aqueous solution. In addition, lowering the salinity of the make-up brine resulted in a larger conformation (expansion) of the polymer molecules, which in turn resulted in less adsorption and a greater in-situ viscosity without any negative impact on injectivity; this plays a positive role in the oil displacement process. Moreover, the loss of viscosity in the effluent polymer solutions was lower in low-salinity than in high-salinity brine, indicating that an increase in cation concentration (driven mainly by Ca²⁺ ions) has a stronger effect on the viscosity of the high-salinity polymer solution than on the low-salinity one.

Keywords: polymer, heavy oil, low salinity, COBR interactions

Procedia PDF Downloads 96
26016 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure by grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely used in fields such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem that minimizes a cost function measuring the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree to which a data point belongs to a cluster is given by a membership function taking values in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that each data object's memberships over all clusters sum to one. This constraint can cause several problems, especially when the data lie in a noisy space. Regularization has therefore become part of the fuzzy c-means technique: it introduces additional information in order to solve an otherwise ill-posed optimization problem. In this study, we focus on regularization by a relative entropy term, while our optimization problem still aims to minimize the dissimilarity inside clusters. Our objective is to find an appropriate membership degree for each data object, because appropriate membership degrees lead to accurate clustering results. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
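One common form of entropy-regularized fuzzy clustering, shown here as an illustrative sketch rather than the authors' exact model, replaces the usual FCM membership update with a softmax of negative squared distances, so memberships still sum to one per point while a parameter controls fuzziness:

```python
import numpy as np

# Sketch of an entropy-regularized membership update: memberships are a
# softmax of negative squared distances, with lam controlling fuzziness.
# Data and parameters are synthetic.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (30, 2)),    # cluster around (0, 0)
               rng.normal(3, 0.3, (30, 2))])   # cluster around (3, 3)
lam = 1.0
centers = X[[0, -1]].copy()                    # one seed from each region

for _ in range(50):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # (n, C)
    U = np.exp(-d2 / lam)
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    centers = (U.T @ X) / U.sum(axis=0)[:, None]

labels = U.argmax(axis=1)
```

Larger `lam` makes memberships fuzzier; as `lam` shrinks, the update approaches hard k-means assignment.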

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 264
26015 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format

Authors: Maryam Fallahpoor, Biswajeet Pradhan

Abstract:

Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretation and thereby reduces the burden on radiologists. Among AI approaches, deep learning (DL) stands out because it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: Neuroimaging Informatics Technology Initiative (NIfTI) and Digital Imaging and Communications in Medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients versus normal cases based on 3D CT scans, while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512 but exhibited varying slice counts. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images were cropped and resampled to uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was labeled 1, while normal images were labeled 0. Subsequently, a U-Net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of the original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest area under the curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection; this approach enhances the accuracy and reliability of diagnostic systems.
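The intensity pre-processing steps stated in the abstract (HU windowing to (−1000, 400), normalization, zero-centering) can be sketched on a dummy volume as follows:

```python
import numpy as np

# The stated intensity pipeline on a dummy 60 x 128 x 128 volume:
# clip to the HU window (-1000, 400), scale to [0, 1], zero-center.
volume = np.random.default_rng(3).integers(-2000, 2000,
                                           (60, 128, 128)).astype(float)

HU_MIN, HU_MAX = -1000.0, 400.0
clipped = np.clip(volume, HU_MIN, HU_MAX)                 # HU windowing
normalized = (clipped - HU_MIN) / (HU_MAX - HU_MIN)       # scale to [0, 1]
zero_centered = normalized - normalized.mean()            # zero-centering
```

Spatial resampling to 1 mm isotropic voxels would additionally need an interpolation routine and the scan's voxel-spacing metadata, which DICOM headers carry per slice.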

Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format

Procedia PDF Downloads 94
26014 Sampled-Data Model Predictive Tracking Control for Mobile Robot

Authors: Wookyong Kwon, Sangmoon Lee

Abstract:

In this paper, a sampled-data model predictive tracking control method is presented for mobile robots modeled as constrained continuous-time linear parameter-varying (LPV) systems. The sampled-data predictive controller is designed via a linear matrix inequality approach: based on the input-delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
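The paper's controller comes from an LMI design, which requires a semidefinite-programming solver and is not reproduced here. Purely as an illustration of the receding-horizon tracking idea, the sketch below runs unconstrained finite-horizon position tracking for a discrete double-integrator robot model via least squares; the model, horizon, and reference are all invented:

```python
import numpy as np

# Generic receding-horizon tracking sketch (NOT the paper's LMI-based
# sampled-data design): a discrete double-integrator "robot". Each step
# solves for the horizon's inputs and applies only the first one.
A = np.array([[1.0, 0.1], [0.0, 1.0]])    # state: [position, velocity], dt = 0.1 s
B = np.array([[0.005], [0.1]])
N = 10                                    # prediction horizon

def mpc_step(x, ref):
    powers = [np.eye(2)]
    for _ in range(N):
        powers.append(A @ powers[-1])     # A^0 ... A^N
    Phi = np.zeros((N, N))                # maps input sequence to positions
    free = np.zeros(N)                    # free response of the position
    for k in range(N):
        free[k] = (powers[k + 1] @ x)[0]
        for j in range(k + 1):
            Phi[k, j] = (powers[k - j] @ B)[0, 0]
    u, *_ = np.linalg.lstsq(Phi, ref - free, rcond=None)
    return float(u[0])                    # receding horizon: first input only

x = np.array([0.0, 0.0])
ref = np.full(N, 1.0)                     # track position 1.0 m
for _ in range(100):
    x = A @ x + (B * mpc_step(x, ref)).ravel()
```

A practical controller would add input weighting and the state/input constraints that the LPV-LMI formulation handles rigorously.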

Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV

Procedia PDF Downloads 315
26013 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data

Authors: Nasser A. Al-Azri

Abstract:

The effectiveness of passive cooling techniques is assessed using bioclimatic charts, whose development requires the typical meteorological year (TMY) for a specified location. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, an essential component of the common TMYs intended for general use. Since solar radiation is not required for developing a bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes TMY development more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
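Typical-month selection for a TMY is commonly done with the Finkelstein-Schafer statistic. The sketch below applies it to synthetic daily temperatures of a single parameter; a real passive-cooling TMY would combine several weighted parameters (humidity, wind, and so on), and the data here are invented:

```python
import numpy as np

# Finkelstein-Schafer (FS) selection on synthetic daily dry-bulb
# temperatures: pick the candidate year whose empirical CDF best
# matches the pooled long-term CDF for the month.
rng = np.random.default_rng(4)
years = {y: rng.normal(25 + 0.2 * (y - 2000), 3, 30)   # 30 daily means
         for y in range(2000, 2010)}

pooled = np.sort(np.concatenate(list(years.values())))  # long-term record

def fs_statistic(sample, pooled):
    """Mean absolute gap between the sample CDF and the long-term CDF."""
    s = np.sort(sample)
    long_term_cdf = np.searchsorted(pooled, s, side="right") / len(pooled)
    sample_cdf = (np.arange(len(s)) + 1) / len(s)
    return float(np.mean(np.abs(sample_cdf - long_term_cdf)))

typical_year = min(years, key=lambda y: fs_statistic(years[y], pooled))
```

Repeating this per calendar month and per weighted parameter, then concatenating the winning months, yields the TMY.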

Keywords: bioclimatic charts, passive cooling, TMY, weather data

Procedia PDF Downloads 243
26012 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach

Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon

Abstract:

Defensive modeling and simulation (M&S) is a system that enables training that would otherwise be impracticable by reducing the constraints of time, space, and financial resources. The need for defensive M&S has been increasing not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not yet been managed and utilized, owing to the nature of military organizations: confidentiality and frequent change of members. Therefore, this study aims to develop a management system for defensive M&S experience based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S is both quantitative and qualitative, data mining is an appropriate way of dealing with it. This research is expected to be helpful for soldiers and military policy makers.

Keywords: data mining, defensive M&S, management system, knowledge management

Procedia PDF Downloads 260
26011 Khaya Cellulose Supported Copper Nanoparticles for Chemo Selective Aza-Michael Reactions

Authors: M. Shaheen Sarkar, M. Lutfor Rahman, Mashitah Mohd Yusoff

Abstract:

We prepared highly active Khaya cellulose supported poly(hydroxamic acid) copper nanoparticles by surface modification of Khaya cellulose through graft co-polymerization and subsequent amidoximation. The Cu nanoparticles (0.05 mol% to 50 mol ppm) selectively promoted the aza-Michael reaction of aliphatic amines to give the corresponding alkylated products at room temperature in methanol. The supported nanoparticles were easy to recover and were reused seven times without significant loss of activity.

Keywords: Aza-Michael, copper, cellulose, nanoparticles, poly(hydroxamic acid)

Procedia PDF Downloads 346
26010 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes are quite important for maintaining good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The method is effective in representing fault patterns in process data and is robust to measurement noise, so reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in detection and identification was tested with different simulation data sets. The use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.
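The abstract does not spell out its representation method, but a common baseline for this kind of multivariate monitoring is a Hotelling-style T² score, with per-variable contributions used for fault identification. A minimal sketch under that assumption (diagonal covariance, invented data):

```python
import statistics

def fit_monitor(train_rows):
    """Per-variable mean/std from normal-operation data (diagonal model)."""
    cols = list(zip(*train_rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def t2_score(row, model):
    """Sum of squared standardized deviations (diagonal Hotelling T^2)."""
    return sum(((x - m) / s) ** 2 for x, (m, s) in zip(row, model))

def identify(row, model):
    """Index of the variable contributing most to the score
    (a simple stand-in for fault identification)."""
    contrib = [((x - m) / s) ** 2 for x, (m, s) in zip(row, model)]
    return max(range(len(contrib)), key=contrib.__getitem__)

# Normal-operation training data: two correlated-looking process variables.
train = [(0.0, 10.0), (1.0, 11.0), (-1.0, 9.0), (0.5, 10.5), (-0.5, 9.5)]
model = fit_monitor(train)
```

A nonlinear variant, as the paper advocates, would replace the per-variable model with a kernel or manifold representation while keeping the same detect-then-identify flow.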

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 241
26009 Examination of Teacher Candidates' Attitudes Towards Disabled Individuals' Employment in Terms of Various Variables

Authors: Tuna Şahsuvaroğlu

Abstract:

The concept of disability has been the subject of many studies in the national and international literature, addressing its social, sociological, political, anthropological, and economic dimensions as well as its individual and social consequences. A disabled person is defined as a person who has difficulties adapting to social life and meeting daily needs due to varying degrees of loss of physical, mental, spiritual, sensory, or social abilities, whether from birth or acquired later for any reason, and who is in need of protection, care, rehabilitation, counseling, and support services. The industrial revolution and the rapid industrialization it brought led to an increase in disabilities resulting from work accidents, in addition to congenital disabilities. As a result, disabled people have been included in national employment policies as a disadvantaged group. Although the participation of disabled individuals in the workforce is of great importance, both for increasing their quality of life and for their integration into society, and although disabled individuals are willing to participate in the workforce, they encounter many problems. One of these problems is the negative attitudes and prejudices that develop in society towards the employment of disabled individuals. One of the most powerful ways to turn these negative attitudes and prejudices into positive ones is education. Education guides societies and transfers existing social characteristics to future generations. This is maintained by teachers, who are among the most dynamic parts of society and act as the locomotive of education. For this reason, there is a strong relationship between the teaching profession and societal attitudes towards the employment of disabled individuals, as the two can influence each other.
Therefore, the purpose of this study is to examine teacher candidates' attitudes towards the employment of disabled individuals in terms of various variables. The participants consist of 665 teacher candidates studying in various departments of the Marmara University Faculty of Education in the 2022-2023 academic year. The descriptive survey model, a form of the general survey model, was used, as the study aims to determine teacher candidates' attitudes towards the employment of disabled individuals in terms of different variables. The Attitude Scale Towards the Employment of Disabled People was used to collect data. The data were analyzed according to the variables of age, gender, marital status, department, and whether there is a disabled relative in the family, and the findings are discussed in the context of further research.

Keywords: teacher candidates, disabled, attitudes towards the employment of disabled people, attitude scale towards the employment of disabled people

Procedia PDF Downloads 71
26008 Imputation of Urban Movement Patterns Using Big Data

Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson

Abstract:

Big data typically refers to consumer datasets that reveal detailed heterogeneity in human behavior, which, if harnessed appropriately, could revolutionize our understanding of collective phenomena in the physical world. Inadvertent missing values skew these datasets and compromise the validity of any conclusions drawn. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with the available big data, to plug the gaps and create the rich, comprehensive dataset required for subsequent analysis. Specifically, the emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and of the drivers of mobility on the railways. The methodologies can predict the influence of changes within the network (such as a timetable change or the impact of a new station) and explain local phenomena outside the network (such as rail-heading) and other impacts of urban morphology. Our analysis also reveals that the new imputation data model provides for more equitable revenue sharing amongst the network operators who manage different parts of the integrated UK railways.
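As a toy illustration of the gap-plugging idea, a missing value in one dataset can be filled from an auxiliary dataset keyed on the same identifier. All station codes and numbers below are invented for illustration and are not the paper's data:

```python
# Sketch: plug gaps in a ticketing dataset using an auxiliary dataset
# (e.g. census or count data) keyed on the same station identifier.

tickets = [
    {"station": "LDS", "entries": 120},
    {"station": "MAN", "entries": None},   # inadvertent missing value
]
aux_flows = {"LDS": 130, "MAN": 95}        # auxiliary source

def impute(rows, aux):
    """Fill missing 'entries' from the auxiliary source where available."""
    out = []
    for r in rows:
        v = r["entries"] if r["entries"] is not None else aux.get(r["station"])
        out.append({**r, "entries": v})
    return out
```

The paper's actual strategy is richer (micro-simulation and synthetic populations), but the principle is the same: a second dataset sharing a key with the big data supplies the missing records.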

Keywords: big data, micro-simulation, mobility, ticketing data, commuters, transport, synthetic population

Procedia PDF Downloads 232
26007 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of rights holders with the prosperity and long-term development of public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, granting data property rights within the framework of the intellectual property system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, it establishes a specific three-level structure of data rights. The paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Ltd, Campbell v. MGN, and Imerman v. Tchenguiz. It concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property will help to establish the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 45
26006 The Influence of Housing Choice Vouchers on the Private Rental Market

Authors: Randy D. Colon

Abstract:

Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households has been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected in the Chicago Englewood neighborhood through participation in community meetings and informal interviews with residents and community leaders, and will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.

Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market

Procedia PDF Downloads 123
26005 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
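One way to picture a dynamic metadata schema is a base schema that each experiment extends at runtime with technique-specific fields, with validation performed against the merged result. The sketch below uses invented XPCS-style field names as an illustration, not an actual community standard:

```python
# Sketch of a "dynamic" metadata schema: a fixed base that an experiment
# extends at runtime, so the schema tracks the evolving demands of the
# measurement without changing the validation machinery.

BASE_SCHEMA = {"sample_id": str, "facility": str, "timestamp": str}

def extend_schema(base, extra):
    """Merge technique-specific fields into the base schema."""
    merged = dict(base)
    merged.update(extra)
    return merged

def validate(record, schema):
    """Return a list of problems; an empty list means the record conforms."""
    problems = [f"missing field: {k}" for k in schema if k not in record]
    problems += [f"bad type for {k}" for k, t in schema.items()
                 if k in record and not isinstance(record[k], t)]
    return problems

# Hypothetical XPCS extension of the base schema.
xpcs_schema = extend_schema(BASE_SCHEMA, {
    "exposure_time_s": float,
    "q_range_inv_nm": list,
})
```

In practice such schemas would be expressed in a serializable form (e.g. JSON Schema) so that acquisition software, archives, and analysis pipelines can all validate against the same evolving definition.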

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 67
26004 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

Whether the data has been well parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelism is affected by the allocation scheme, which is directly connected to SSD performance. Representative allocation schemes include dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. It is therefore hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated a number of mixed data patterns and analyzed the results to support the right choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and in continuous read-intensive environments, while dynamic allocation performs best for write performance and random data patterns.
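The contrast between the two schemes can be illustrated with a toy channel-load model. The 4-channel layout and workload below are illustrative assumptions, not the simulator used in the paper:

```python
# Toy sketch of the two allocation schemes on a 4-channel SSD model.
# Static: the channel is fixed by the logical page number (LPN), so read
#         locations are predictable but skewed workloads pile onto one channel.
# Dynamic: each write goes to the currently least-loaded channel, balancing
#          writes at the cost of less predictable read parallelism.

CHANNELS = 4

def static_alloc(writes):
    load = [0] * CHANNELS
    for lpn in writes:
        load[lpn % CHANNELS] += 1
    return load

def dynamic_alloc(writes):
    load = [0] * CHANNELS
    for _ in writes:
        load[load.index(min(load))] += 1
    return load

# A skewed workload: every LPN maps to channel 0 under static allocation.
skewed = [4 * i for i in range(16)]
```

Under this workload static allocation serializes all 16 writes on one channel, while dynamic allocation spreads them evenly, which mirrors the paper's finding that dynamic allocation wins for write-heavy and random patterns.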

Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation

Procedia PDF Downloads 342
26003 Social Data Aggregator and Locator of Knowledge (STALK)

Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat

Abstract:

Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of the various profiles; instead, it provides consolidated information derived from the subject's posts and other activities. It also allows analysis across multiple profiles and analytics based on several profiles. We strive to provide a query system that returns a natural-language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the different use cases.

Keywords: social network, analysis, Facebook, LinkedIn, Git, big data

Procedia PDF Downloads 446
26002 Rasagiline Improves Metabolic Function and Reduces Tissue Injury in the Substantia Nigra in Parkinson's Disease: A Longitudinal In-Vivo Advanced MRI Study

Authors: Omar Khan, Shana Krstevska, Edwin George, Veronica Gorden, Fen Bao, Christina Caon, NP-C, Carla Santiago, Imad Zak, Navid Seraji-Bozorgzad

Abstract:

Objective: To quantify cellular injury in the substantia nigra (SN) in patients with Parkinson's disease (PD) and to examine the effect of rasagiline on tissue injury in the SN in patients with PD. Background: N-acetylaspartate (NAA), quantified with MRS, is a reliable marker of neuronal metabolic function. Fractional anisotropy (FA) and mean diffusivity (MD), obtained with DTI, characterize tissue alignment and integrity. Rasagiline has been shown to exert an anti-apoptotic effect. We applied these advanced MRI techniques to examine (i) the effect of rasagiline on cellular injury and metabolism in patients with early PD, and (ii) longitudinal changes seen over time in PD. Methods: We conducted a prospective longitudinal study in patients with mild PD naive to dopaminergic treatment. The imaging protocol included multi-voxel proton MRS and DTI of the SN, acquired on a 3T scanner. Scans were performed at baseline and month 3, during which period the patients were on no treatment. At that point, rasagiline 1 mg orally daily was initiated, and MRI scans were obtained at 6 and 12 months after starting rasagiline. The primary objective was to compare changes during the 3-month period of "no treatment" with the changes observed "on treatment" with rasagiline at month 12. Age-matched healthy controls were also imaged. Image analysis was performed blinded to treatment allocation and period. Results: 25 patients were enrolled in this study. Compared to the "no treatment" period, there was a significant increase in NAA during the "on treatment" period (-3.04% vs +10.95%, p=0.0006) and a significant increase in FA over the following 12 months "on treatment" (-4.8% vs +15.3%, p<0.0001). MD increased during "no treatment" and decreased "on treatment" (+2.8% vs -7.5%, p=0.0056). Further analysis and clinical correlation are ongoing.
Conclusions: Quantifying cellular injury in the SN in PD with advanced MRI techniques is a feasible approach to investigating dopaminergic neuronal injury and could be developed as an outcome measure in exploratory studies. Rasagiline appears to have a stabilizing effect on dopaminergic cell loss and metabolism in the SN in PD, which warrants further investigation in long-term studies.

Keywords: substantia nigra, Parkinson's disease, MRI, neuronal loss, biomarker

Procedia PDF Downloads 320
26001 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all stakeholders in the educational sector. Efficient data collection, flow, processing, storage, and retrieval are vital to delivering successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented a Student Information System (SIS) to manage and monitor students' data and the information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools, clarifies the data integrity issues, and indicates the challenges that private schools in the UAE face.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), knowledge and human development authority (KHDA), Abu Dhabi educational council (ADEC)

Procedia PDF Downloads 226
26000 Bias in the Estimation of Covariance Matrices and Optimality Criteria

Authors: Juan M. Rodriguez-Diaz

Abstract:

The precision of parameter estimators in the Gaussian linear model is traditionally accounted for by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Optimal design theory traditionally addresses this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise, the loss in efficiency of designs obtained with the traditional approach may be substantial.
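A one-dimensional analogue of this underestimation is the small-sample bias of the maximum-likelihood variance estimator, which can be checked exactly by enumerating all samples of size n from a fair 0/1 coin. This is only an illustration of the phenomenon, not the paper's linear-model setting:

```python
# The ML variance estimator divides by n and systematically underestimates
# the true variance: its exact expectation over all samples of size n is
# (n-1)/n times the true variance.

from itertools import product
from statistics import mean

def ml_variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

n = 3
true_var = 0.25  # variance of a fair 0/1 coin
# Exact expectation: average the estimator over all 2**n equally likely samples.
avg_ml = mean(ml_variance(s) for s in product([0, 1], repeat=n))
```

The same effect motivates the paper's point: a variance measure justified only asymptotically can be too optimistic for the sample sizes actually used, so the optimality criterion should target the true variance structure.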

Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix

Procedia PDF Downloads 448