Search results for: electronic data interchange
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26299

24739 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential to building smarter cities. Machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects of the network visible. In this context, this research builds a prototype model that predicts the state of a transportation network using big data and machine learning technology. Among urban transportation systems, the research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic traffic conditions, so bus delays occur frequently. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is built from real-time bus data gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is designed with a machine learning tool (RapidMiner Studio), and tests for bus delay prediction were conducted. This research presents experiments to increase the accuracy of bus headway prediction by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Based on this analysis method, the research therefore demonstrates an effective use of machine learning and urban big data to understand urban dynamics.
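The paper's prototype was built in RapidMiner Studio rather than in code, but the kind of supervised delay prediction it describes can be sketched in a few lines. The following is a minimal, hypothetical illustration (the feature names, training values, and the k-nearest-neighbours model are assumptions, not the paper's actual pipeline): it estimates bus delay from road speed, rainfall, and stop dwell time.

```python
import math

# Hypothetical training records: (road speed km/h, rainfall mm/h, stop dwell s)
# mapped to observed delay in minutes. All values are illustrative, not the
# paper's actual data.
TRAIN = [
    ((45.0, 0.0, 20.0), 1.0),
    ((30.0, 2.0, 35.0), 4.0),
    ((20.0, 5.0, 50.0), 8.0),
    ((50.0, 0.0, 15.0), 0.5),
    ((25.0, 3.0, 40.0), 6.0),
]

def predict_delay(features, k=3):
    """k-nearest-neighbours estimate of bus delay (minutes)."""
    dists = sorted((math.dist(features, x), y) for x, y in TRAIN)
    nearest = dists[:k]                      # k most similar past situations
    return sum(y for _, y in nearest) / k    # average their observed delays

print(predict_delay((28.0, 2.5, 38.0)))
```

In practice the model would be trained on the traffic, weather, and bus-status feeds the abstract lists; the design choice illustrated here is simply that delay is predicted from similar past operating conditions.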

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 260
24738 Integrated Model for Enhancing Data Security Performance in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. It allows sharing resources, services, and information among people across the whole world. Although the advantages of using clouds are great, there are many risks, and data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users, and save the user's time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a username and password are used; the password is protected with a one-way SHA-2 hash. The proposed system shows an improvement in the processing time of uploading and downloading files to and from the cloud in secure form.
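As a hedged illustration of the authentication and integrity pieces described above, the sketch below uses Python's standard hashlib for salted one-way SHA-2 password hashing (via PBKDF2-HMAC-SHA256, a common hardening of a plain SHA-2 hash) and a SHA-256 digest as an integrity tag. The Blowfish data-exchange step is omitted because it needs a third-party library; all function names and parameters are illustrative, not the paper's implementation.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Salted one-way SHA-2 password hash (PBKDF2-HMAC-SHA256 sketch)."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Constant-time comparison against the stored hash."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

def integrity_tag(data):
    """SHA-256 digest used as a data-integrity check for stored files."""
    return hashlib.sha256(data).hexdigest()

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))   # True
print(verify_password("wrong", salt, digest))    # False
print(integrity_tag(b"file contents")[:16])
```

The salt and the iterated hash are standard defences against rainbow-table and brute-force attacks on stored credentials.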

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 477
24737 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as selecting members for a game and setting game strategy based on analysis of accumulated sports data, has been widely attempted. In the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data, such as ball tracking or player motion, because the situation of a game changes rapidly and the structure of the data is complicated. An analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a decision that is considered difficult for all coaches. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether the lineup should be changed and whether a Small Ball lineup should be adopted. We therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data indicating each player's contribution can be accumulated for every play, and these scoring data can be treated as a time series. To compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model. This model can identify the current optimal lineup for different situations. We collected all accumulated NBA data from the 2019-2020 season and apply the method to actual basketball play data to verify the reliability of the proposed model.
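The paper combines trained RNN and NN models; purely as a toy illustration, the sketch below runs a single Elman-style recurrent cell over a per-play scoring sequence with hand-picked constant weights (all weights and inputs are assumptions, not trained NBA parameters). It shows the mechanism the abstract relies on: a hidden state that carries time-series context forward into a prediction.

```python
import math

# Toy Elman RNN step over a scoring time series (one feature per play).
# Weights are illustrative constants, not trained parameters.
W_X, W_H, BIAS = 0.8, 0.5, 0.0   # input, recurrent, and bias weights
W_OUT = 2.0                      # linear readout weight for the score estimate

def rnn_predict(plays):
    """Run the recurrent cell over per-play scores; return a next-value estimate."""
    h = 0.0
    for x in plays:
        h = math.tanh(W_X * x + W_H * h + BIAS)  # hidden-state update
    return W_OUT * h                             # linear readout

print(round(rnn_predict([1.0, 0.0, 2.0, 3.0]), 3))
```

A real implementation would learn the weights from play-by-play data and concatenate the RNN state with situational NN features before the readout.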

Keywords: recurrent neural network, player lineup, basketball data, decision-making model

Procedia PDF Downloads 133
24736 Dynamic Contrast-Enhanced Breast MRI Examinations: Clinical Use and Technical Challenges

Authors: Janet Wing-Chong Wai, Alex Chiu-Wing Lee, Hailey Hoi-Ching Tsang, Jeffrey Chiu, Kwok-Wing Tang

Abstract:

Background: Although mammography is the primary imaging technique for detection of early breast cancer, it has limited sensitivity and specificity. Ultrasound imaging and contrast-enhanced MRI are useful adjunct tools to mammography. The advantage of breast MRI is its high sensitivity for invasive breast cancer. Indications for and use of breast magnetic resonance imaging have therefore increased over the past decade. Objectives: 1. To demonstrate cases covering the different indications for breast MR imaging. 2. To review the common artifacts and pitfalls in breast MR imaging. Materials and Methods: This is a retrospective study including all patients who underwent a dynamic contrast-enhanced breast MRI examination in our centre from Jan 2011 to Dec 2017. The clinical data and radiological images were retrieved from the EPR (electronic patient record), RIS (Radiology Information System) and PACS (Picture Archiving and Communication System). Results and Discussion: Cases include (1) screening of the contralateral breast in patients with a new breast malignancy; (2) breast augmentation with free injection of unknown foreign materials; (3) axillary adenopathy with an unknown site of primary malignancy; (4) neo-adjuvant chemotherapy: imaging before, during, and after chemotherapy to evaluate treatment response and the extent of residual disease prior to operation. Relevant images will be included and illustrated in the presentation. As with other types of MR imaging, different artifacts and pitfalls can potentially limit interpretation of the images. Because of the coils and software specific to breast MR imaging, there are additional technical considerations unique to imaging of the breast region. Case demonstration images will be available in the presentation. Conclusion: Breast MR imaging is a highly sensitive and reasonably specific method for the detection of breast cancer. Adherence to appropriate clinical indications and technical optimization are crucial for achieving satisfactory images for interpretation.

Keywords: MRI, breast, clinical, cancer

Procedia PDF Downloads 241
24735 Challenges in Multi-Cloud Storage Systems for Mobile Devices

Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta

Abstract:

The demand for cloud storage is increasing because users want continuous access to their data. Cloud storage has revolutionized the way users access their data. Many cloud storage service providers are available, such as Dropbox and Google Drive, providing limited free storage; for extra storage, users have to pay, which becomes a burden on users. To avoid the issue of limited free storage, the concept of multi-cloud storage was introduced. In this paper, we discuss the limitations of existing multi-cloud storage systems for mobile devices.

Keywords: cloud storage, data privacy, data security, multi cloud storage, mobile devices

Procedia PDF Downloads 699
24734 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches

Authors: Wuttigrai Ngamsirijit

Abstract:

Talent management in modern organizations has become data-driven owing to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human-capital measures and metrics; lack of capability for strategic data modeling; and the time consumed in aggregating numbers to make decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses the key data, measures, and metrics for strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data are recommended to help managers understand the state of talent, the gaps in managing talent and the organization, and the ways to develop optimized talent strategies.

Keywords: decision making, human capital analytics, talent management, talent value chain

Procedia PDF Downloads 187
24733 X-Ray Detector Technology Optimization in CT Imaging

Authors: Aziz Ikhlef

Abstract:

Most multi-slice CT scanners are built with detectors composed of scintillator-photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for systems under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the higher number of traces and connections required by front-illuminated diodes. In back-lit diodes, the electronic noise is improved by the reduction of the load capacitance that follows from the reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging across a large patient population. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, both the medical and regulatory communities have raised significant concerns about the radiation dose received by patients. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP) that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is implementing strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral or dual-energy CT, in which projection data at two different tube potentials are collected. One approach uses a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce cross-contamination of the signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples.
In addition, this paper will present an overview of the detector technologies and image-chain improvements investigated in recent years to improve the signal-to-noise ratio and dose efficiency of CT scanners, both in regular examinations and in energy-discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We will go through the properties of the post-patient collimation that improve the scatter-to-primary ratio; the scintillator material properties, such as light output, afterglow, primary speed, and crosstalk, that improve spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), optimized for crosstalk, noise, and temporal/spatial resolution.

Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts

Procedia PDF Downloads 271
24732 Empowering Learners: From Augmented Reality to Shared Leadership

Authors: Vilma Zydziunaite, Monika Kelpsiene

Abstract:

In early childhood and preschool education, play has an important role in learning and cognitive processes. In the context of a changing world, personal autonomy and the use of technology are becoming increasingly important for the development of a wide range of learner competencies. By integrating technology into learning environments, the educational reality is changed, promoting novel learning experiences for children through play-based activities. Alongside this, teachers are challenged to develop encouragement and motivation strategies that empower children to act independently. The aim of the study was to reveal the changes in the roles and experiences of teachers in applying AR technology to enrich the learning process. A quantitative research approach was used, and the data were collected through an electronic questionnaire. Participants: 319 teachers of 5-6-year-old children using AR technology tools in their educational process. Methods of data analysis: Cronbach's alpha, descriptive statistics, normal distribution analysis, correlation analysis, and regression analysis (SPSS software). Results: The results show a significant relationship between children's learning and the educational process modeled by the teacher. The strongest predictor of child learning was the role of the educator. Other predictors, such as pedagogical strategies, the concept of AR technology, and areas of children's education, had no significant relationship with child learning. The role of the educator was thus found to be a strong determinant of the child's learning process. Conclusions: The greatest potential for integrating AR technology into the teaching-learning process lies in collaborative learning.
Teachers reported that when integrating AR technology into the educational process, they encourage children to learn from each other, develop problem-solving skills, and create inclusive learning contexts. A significant relationship emerged between the changing role of the teacher, the child's learning style, and the child's aspiration for personal leadership and responsibility for their own learning. Teachers identified the following key roles: observer of the learning process, proactive moderator, and creator of the educational context. All of these roles enable the learner to become an autonomous and active participant in the learning process. This helps explain why it is crucial to empower the learner to experiment, explore, discover, actively create, and engage in collaborative learning in the design and implementation of educational content, and for teachers to integrate AR technologies and apply the principles of shared leadership. No statistically significant relationship was found between teachers' understanding of the definition of AR technology and their choice of role in the learning process. However, teachers reported that their understanding of the definition of AR technology influences their choice of role, which in turn affects children's learning.

Keywords: teacher, learner, augmented reality, collaboration, shared leadership, preschool education

Procedia PDF Downloads 40
24731 A Comparative, Epidemiological Study of Acute Renal Colic Presentations to Major Academic Emergency Departments in Doha, Qatar and Melbourne, Australia

Authors: Sameer A. Pathan, Biswadev Mitra, Zain A. Bhutta, Isma Qureshi, Elle Spencer, Asmaa A. Hameed, Sana Nadeem, Ramsha Tahir, Shahzad Anjum, Peter A. Cameron

Abstract:

Background: This study aimed to compare the epidemiology, clinical presentation, management, and outcomes of renal colic presentations in two major academic centers and to discuss potential implications of these results for the applicability of current evidence in the management of renal colic. Methods: We undertook a retrospective cohort study of patients with renal colic who presented to the Hamad General Hospital Emergency Department (HGH-ED), Qatar, and The Alfred ED, Melbourne, Australia, during a period of one year from August 1, 2012, to July 3, 2013. Cases were identified using ICD-9-CM codes, and an electronic template was used to record the data on predefined clinical variables. Results: A total of 12,223 presentations at the HGH-ED and 384 at The Alfred ED were identified as renal colic during the study period. The rate of renal colic presentations at the HGH-ED was 27.9 per 1,000 ED visits, compared to 6.7 per 1,000 ED visits at The Alfred ED. Patients presenting to the HGH-ED were significantly younger [34.9 years (29.0-43.4)] than those at The Alfred ED [48 years (37-60); P < 0.001]. The median stone size was larger in the HGH-ED group [6 (4-8) mm] than in The Alfred ED group [4 (3-6) mm, P < 0.001]. The intervention rate in the stone-positive population was significantly higher in the HGH-ED group than in The Alfred ED group (38.7% versus 11.9%, P < 0.001). At the time of discharge, The Alfred ED group received fewer analgesic prescriptions (55.8% versus 83.5%, P < 0.001) and more tamsulosin prescriptions (25.3% versus 11.7%, P < 0.001). Conclusions: Renal colic presentations to the HGH-ED, Qatar, were younger and had larger stone sizes than those to The Alfred ED, whereas medical expulsion therapy use was higher at The Alfred ED. Differences in epidemiology should be considered when tailoring strategies for effective management of patients with renal colic in a given setting.
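The presentation rates above are simple ratios. As a sketch, the code below reproduces them from the reported case counts and assumed annual ED census figures; the census denominators are illustrative back-calculations chosen to match the reported rates, not data from the study.

```python
# Rate of renal colic presentations per 1,000 ED visits.
def rate_per_1000(cases, total_visits):
    return 1000 * cases / total_visits

# Case counts from the abstract; the total-visit denominators are assumptions.
hgh = rate_per_1000(12_223, 438_100)    # Hamad General Hospital ED
alfred = rate_per_1000(384, 57_300)     # The Alfred ED

print(round(hgh, 1), round(alfred, 1))  # per-1,000 rates for each centre
```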

Keywords: kidney stones, urolithiasis, nephrolithiasis, renal colic, epidemiology

Procedia PDF Downloads 241
24730 OMTHD Strategy in Asymmetrical Seven-Level Inverter for High Power Induction Motor

Authors: Rachid Taleb, M’hamed Helaimi, Djilali Benyoucef, Ahmed Derrouazin

Abstract:

Multilevel inverters are widely used in high-power electronic applications because of their ability to generate very good waveform quality, reduce the switching frequency, and impose low voltage stress across the power devices. This paper presents the Optimal Minimization of the Total Harmonic Distortion (OMTHD) strategy for a uniform-step asymmetrical seven-level inverter (USA7LI). The OMTHD approach is compared to the well-known sinusoidal pulse-width modulation (SPWM) strategy. Simulation results demonstrate the better performance and technical advantages of the OMTHD controller in feeding a high-power induction motor (HPIM).
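The quantity minimized by the OMTHD strategy can be illustrated directly. The sketch below computes THD from a list of harmonic amplitudes using the standard definition THD = sqrt(sum of squared higher harmonics) / fundamental; the seven-level spectrum shown is invented for illustration, not a simulation result from the paper.

```python
import math

def thd(harmonics):
    """Total harmonic distortion given RMS harmonic amplitudes
    [V1, V2, V3, ...], where V1 is the fundamental."""
    fundamental, higher = harmonics[0], harmonics[1:]
    return math.sqrt(sum(v * v for v in higher)) / fundamental

# Illustrative stepped-waveform spectrum (amplitudes are made up):
spectrum = [100.0, 0.0, 4.0, 0.0, 3.0, 0.0, 2.0]
print(round(100 * thd(spectrum), 2), "%")
```

An OMTHD controller would search over the inverter switching angles to drive this figure as low as possible.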

Keywords: uniform step asymmetrical seven-level inverter (USA7LI), optimal minimization of the THD (OMTHD), sinusoidal PWM (SPWM), high power induction motor (HPIM)

Procedia PDF Downloads 589
24729 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree to which a data point belongs to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have been applied to fuzzy c-means clustering; they introduce additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem still aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because appropriate membership degrees lead to accurate clustering results. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
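The membership computation described above can be sketched concretely. The code below performs one standard FCM membership update, u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)); it shows only the baseline (un-regularized) update, with made-up points and centers, not the paper's relative-entropy variant.

```python
import math

def fcm_memberships(points, centers, m=2.0):
    """One standard FCM membership update for fuzzifier m > 1."""
    memberships = []
    for p in points:
        # Clamp distances to avoid division by zero when a point sits on a center.
        dists = [max(math.dist(p, c), 1e-12) for c in centers]
        row = [
            1.0 / sum((d_i / d_j) ** (2.0 / (m - 1.0)) for d_j in dists)
            for d_i in dists
        ]
        memberships.append(row)  # each row sums to one (the FCM constraint)
    return memberships

pts = [(0.0, 0.0), (1.0, 0.0), (10.0, 10.0)]
ctrs = [(0.5, 0.0), (10.0, 10.0)]
u = fcm_memberships(pts, ctrs)
print([round(x, 3) for x in u[0]])  # first point belongs mostly to cluster 0
```

The sum-to-one constraint visible in each row is exactly the constraint the paper's relative-entropy regularization is designed to make more robust in noisy data.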

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 259
24728 Reduction of High-Frequency Planar Transformer Conduction Losses Using a Planar Litz Wire Structure

Authors: Hamed Belloumi, Amira Zouaoui, Ferid Kourda

Abstract:

A recent trend in power converters is to design planar transformers that aim for a low profile. However, at high frequency, the AC losses of a planar transformer become significant due to the proximity and skin effects. In this paper, the design and implementation of a novel planar Litz conductor is presented in order to equalize the flux linkage and improve the current distribution. The developed PCB Litz wire structure minimizes the losses in a similar way to conventional multi-stranded Litz wires. To further illustrate the eddy current effect in different arrangements, a finite-element analysis (FEA) tool is used to analyze the current distribution inside the conductors. Finally, the proposed planar transformer has been integrated into an electronic stage for testing at high signal levels.
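The skin effect that motivates Litz construction is governed by the skin depth, delta = sqrt(2*rho / (omega*mu)); strands or PCB traces thinner than delta carry current nearly uniformly. The sketch below evaluates this textbook formula for copper at 1 MHz (the formula and material constants are standard; the operating frequency is an assumed example, not a figure from the paper).

```python
import math

def skin_depth(resistivity, freq, mu_r=1.0):
    """Skin depth delta = sqrt(2*rho / (omega * mu)), in metres."""
    mu0 = 4e-7 * math.pi             # vacuum permeability (H/m)
    omega = 2 * math.pi * freq       # angular frequency (rad/s)
    return math.sqrt(2 * resistivity / (omega * mu_r * mu0))

# Copper: rho ~ 1.68e-8 ohm*m; assumed switching frequency of 1 MHz.
delta = skin_depth(1.68e-8, 1e6)
print(f"{delta * 1e6:.1f} um")  # ~65 um: strands must be thinner than this
```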

Keywords: planar transformer, finite-element analysis, winding losses, planar Litz wire

Procedia PDF Downloads 400
24727 Study of Education Learning Techniques and Game Genres

Authors: Khadija Al Farei, Prakash Kumar, Vikas Rao Naidu

Abstract:

Games with different genres have been developed for different age groups for many decades. In many places, educational games play a vital role in creating an active classroom environment and better learning among students. Educational games have now assumed an important place in the lives of children and teenagers. The role of educational games is important for improving learning capability among students, especially in this generation, which truly lives among electronic gadgets. Hence, it is now important to make sure that our educational system is updated with all such advancements in technology. Much research is already under way in this area of edutainment. This paper reviews around ten research papers to find the relation between learning techniques and games. The result of this review provides guidelines for enhanced teaching and learning solutions in education. In-house developed educational games proved to be more effective than those readily available in the market.

Keywords: education, education game, educational technology, edutainment, game genres, gaming in education

Procedia PDF Downloads 415
24726 Semiconductor Nanofilm Based Schottky-Barrier Solar Cells

Authors: Mariyappan Shanmugam, Bin Yu

Abstract:

Schottky-barrier solar cells are demonstrated employing 2D-layered MoS2 and WS2 semiconductor nanofilms, synthesized by the chemical vapor deposition method, as photo-active material candidates. Large-area MoS2 and WS2 nanofilms are stacked by a layer transfer process to achieve a thicker photo-active material, studied by atomic force microscopy and showing a thickness in the range of ~200 nm. The two major vibrational modes associated with 2D-layered MoS2 and WS2 are studied by Raman spectroscopy to estimate the quality of the nanofilms. The Schottky-barrier solar cells employing the MoS2 and WS2 active materials exhibited photoconversion efficiencies of 1.8% and 1.7%, respectively. Fermi-level pinning at the metal/semiconductor interface, electronic transport, and possible recombination mechanisms are studied in the Schottky-barrier solar cells.

Keywords: two-dimensional nanosheet, graphene, hexagonal boron nitride, solar cell, Schottky barrier

Procedia PDF Downloads 330
24725 Sampled-Data Model Predictive Tracking Control for Mobile Robot

Authors: Wookyong Kwon, Sangmoon Lee

Abstract:

In this paper, a sampled-data model predictive tracking control method is presented for mobile robots modeled as constrained continuous-time linear parameter-varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.

Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV

Procedia PDF Downloads 309
24724 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data

Authors: Nasser A. Al-Azri

Abstract:

The effectiveness of passive cooling techniques is assessed based on bioclimatic charts, which require the typical meteorological year (TMY) of a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, an essential component in developing common TMYs intended for general use. Since solar radiation is not required in the development of the bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes the development of the TMY more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
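Standard TMY construction selects, for each calendar month, the historical month whose empirical distribution best matches the long-term record, typically via the Finkelstein-Schafer (FS) statistic. The sketch below computes an FS statistic for one candidate month against a long-term record; the temperature values are invented, and whether this paper uses exactly this statistic is an assumption based on common TMY practice.

```python
def fs_statistic(candidate, long_term):
    """Finkelstein-Schafer statistic: mean absolute difference between the
    empirical CDFs of a candidate month and the long-term record."""
    def ecdf(sample, grid):
        n = len(sample)
        s = sorted(sample)
        return [sum(1 for v in s if v <= g) / n for g in grid]
    grid = sorted(set(candidate) | set(long_term))
    fc, fl = ecdf(candidate, grid), ecdf(long_term, grid)
    return sum(abs(a - b) for a, b in zip(fc, fl)) / len(grid)

# Hypothetical daily-mean dry-bulb temperatures (degC): one month vs. record.
month_1990 = [24, 25, 26, 27, 28]
record = [23, 24, 25, 26, 27, 28, 29, 30]
print(round(fs_statistic(month_1990, record), 3))
```

Repeating this for each candidate year and each relevant parameter (here, only the parameters needed for the bioclimatic chart), and picking the month with the smallest weighted FS sum, yields the TMY.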

Keywords: bioclimatic charts, passive cooling, TMY, weather data

Procedia PDF Downloads 240
24723 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach

Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon

Abstract:

Defensive Modeling and Simulation (M&S) is a system that enables otherwise impracticable training by reducing the constraints of time, space, and financial resources. The necessity of defensive M&S has been increasing not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not yet been managed and utilized, owing to the nature of military organizations: confidentiality and frequent turnover of members. Therefore, this study aims to develop a management system for the experience of defensive M&S based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S is both quantitative and qualitative, a data mining approach is appropriate for dealing with it. This research is expected to be helpful for soldiers and military policy makers.

Keywords: data mining, defensive M&S, management system, knowledge management

Procedia PDF Downloads 255
24722 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes are quite important for maintaining good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The method is effective in representing the fault patterns of process data and is robust to measurement noise, so reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods for detection and identification was tested with different simulation data. The use of a nonlinear technique produced more satisfactory and more robust results on the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.
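As a minimal illustration of detection-and-identification on multivariate measurements (a deliberately simple univariate-limit scheme, not the paper's method), the sketch below standardizes each variable against normal-operation statistics, flags out-of-control samples, and identifies which variables are responsible. All numeric values are invented.

```python
import statistics

# Normal-operation training data for two process variables (illustrative).
NORMAL = [(50.1, 1.20), (49.8, 1.18), (50.3, 1.22), (49.9, 1.19),
          (50.0, 1.21), (50.2, 1.20), (49.7, 1.18), (50.1, 1.22)]

MEANS = [statistics.mean(col) for col in zip(*NORMAL)]
SDEVS = [statistics.stdev(col) for col in zip(*NORMAL)]

def detect_and_identify(sample, limit=3.0):
    """Flag the sample if any standardized variable exceeds the control limit,
    and report which variable indices are responsible."""
    z = [(x - m) / s for x, m, s in zip(sample, MEANS, SDEVS)]
    faulty = [i for i, zi in enumerate(z) if abs(zi) > limit]
    return (len(faulty) > 0), faulty

print(detect_and_identify((50.0, 1.20)))  # in-control sample
print(detect_and_identify((53.0, 1.20)))  # variable 0 has drifted
```

Multivariate schemes such as Hotelling's T-squared, or the nonlinear representations the paper studies, generalize this by accounting for correlation between variables.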

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 236
24721 Imputation of Urban Movement Patterns Using Big Data

Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson

Abstract:

Big data typically refers to consumer datasets revealing detailed heterogeneity in human behavior which, if harnessed appropriately, could potentially revolutionize our understanding of the collective phenomena of the physical world. Inadvertent missing values skew these datasets and compromise the validity of any analysis. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with the available big data, to plug the gaps and create the rich, comprehensive dataset required for subsequent analysis. The emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and the drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (such as a change of timetable or the impact of a new station), explain local phenomena outside the network (such as rail-heading), and capture the other impacts of urban morphology. Our analysis also reveals that the new imputation data model provides for more equitable revenue sharing amongst the network operators who manage different parts of the integrated UK railways.
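One simple way to plug gaps by borrowing from a related dataset is donor-based (hot-deck) imputation: fill each missing value from the most similar complete record. The sketch below is a hypothetical illustration with invented ticket fields, not the paper's imputation model.

```python
# Hot-deck style imputation: fill a missing ticket attribute from the most
# similar complete record (the "donor"). Field names and values are invented.
RECORDS = [
    {"origin": "Leeds", "hour": 8, "fare": 12.5},
    {"origin": "Leeds", "hour": 9, "fare": 11.0},
    {"origin": "York",  "hour": 8, "fare": 9.0},
    {"origin": "Leeds", "hour": 8, "fare": None},   # missing value to impute
]

def impute(records):
    complete = [r for r in records if r["fare"] is not None]
    for r in records:
        if r["fare"] is None:
            # Similarity here = number of matching categorical/ordinal fields.
            donor = max(complete,
                        key=lambda d: (d["origin"] == r["origin"])
                                      + (d["hour"] == r["hour"]))
            r["fare"] = donor["fare"]
    return records

filled = impute(RECORDS)
print(filled[3]["fare"])
```

A production pipeline would use richer similarity measures and draw donors from the auxiliary datasets the abstract describes, but the borrow-from-similar-records principle is the same.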

Keywords: big-data, micro-simulation, mobility, ticketing-data, commuters, transport, synthetic, population

Procedia PDF Downloads 231
24720 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders against the prosperity and long-term development of public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes a specific three-level structure of data rights. This paper analyzes the cases Google v Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello! Ltd, Campbell v MGN and Imerman v Tchenguiz. This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property would help to establish the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, Big data

Procedia PDF Downloads 39
24719 The Influence of Housing Choice Vouchers on the Private Rental Market

Authors: Randy D. Colon

Abstract:

Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households has been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. The quantitative data will be supplemented by qualitative data collected at community meetings in Chicago's Englewood neighborhood, through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, the qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community central to its findings.

Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market

Procedia PDF Downloads 118
24718 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
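The extensibility the abstract describes can be sketched as a base schema that individual experiments extend at runtime. This is only an illustrative sketch: the field names below are assumptions for the example, not an actual XPCS or facility metadata standard.

```python
# Base metadata fields shared by all experiments (illustrative names).
BASE_SCHEMA = {
    "sample_id": str,
    "beamline": str,
    "exposure_time_s": float,
}

def extend_schema(base, extra_fields):
    """Return a new schema with experiment-specific fields added,
    leaving the shared base schema untouched."""
    schema = dict(base)
    schema.update(extra_fields)
    return schema

def validate(record, schema):
    """Check that a metadata record carries every field the (possibly
    extended) schema requires, with the declared type."""
    return all(
        key in record and isinstance(record[key], typ)
        for key, typ in schema.items()
    )

# An XPCS experiment adds fields the base schema never anticipated.
xpcs_schema = extend_schema(BASE_SCHEMA, {"q_range_nm": float, "frame_rate_hz": float})
record = {"sample_id": "S1", "beamline": "P10", "exposure_time_s": 0.1,
          "q_range_nm": 0.05, "frame_rate_hz": 2000.0}
print(validate(record, xpcs_schema))  # True
```

Because the extension is data, not code, downstream tools can read the schema alongside the records, which is what makes the traceability and interoperability claims above tractable.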

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 62
24717 Clinical Advice Services: Using Lean Chassis to Optimize Nurse-Driven Telephonic Triage of After-Hour Calls from Patients

Authors: Eric Lee G. Escobedo-Wu, Nidhi Rohatgi, Fouzel Dhebar

Abstract:

It is challenging for patients to navigate healthcare systems after hours. This leads to delays in care, patient and provider dissatisfaction, inappropriate resource utilization, readmissions, and higher costs. It is important to provide patients and providers with effective clinical decision-making tools that allow seamless connectivity and coordinated care. In August 2015, Stanford Health Care established the patient-centric Clinical Advice Services (CAS) to provide clinical decision support after hours. CAS is founded on key Lean principles: value stream mapping, empathy mapping, waste walks, takt time calculations, standard work, plan-do-check-act cycles, and active daily management. At CAS, Clinical Assistants take the initial call and manage all non-clinical calls (e.g., appointments, directions, general information). If the patient has a clinical symptom, the CAS nurses take the call and use standardized clinical algorithms to triage the patient to home, clinic, urgent care, the emergency department, or 911. Nurses may also contact the on-call physician, based on the clinical algorithm, for further direction and consultation. Since August 2015, CAS has managed 228,990 calls from 26 clinical specialties. Reporting is built into the electronic health record for analysis and data collection. 65.3% of the after-hours calls are clinically related, and the average clinical algorithm adherence rate has been 92%. On average, 9% of calls were escalated by CAS nurses to the physician on call, and 5% of patients were triaged to the Emergency Department. Key learnings indicate that a seamless connectivity vision, cascading multidisciplinary ownership of the problem, and synergistic enterprise improvements have contributed to this success while striving for continuous improvement.

Keywords: after hours phone calls, clinical advice services, nurse triage, Stanford Health Care

Procedia PDF Downloads 174
24716 Tourist Behavior Towards Blockchain-Based Payments

Authors: A. Šapkauskienė, A. Mačerinskienė, R. Andrulienė, R. Bruzgė, S. Masteika, K. Driaunys

Abstract:

The COVID-19 pandemic has affected not only world markets and economies but also the daily lives of customers and their payment habits. The pandemic has accelerated the digital transformation, so the role of technology will become even more important post-COVID. Although the popularity of cryptocurrencies has reached unprecedented heights, obstacles remain, such as a lack of consumer experience and distrust of these technologies, so exploring the role of cryptocurrency and blockchain in the context of international travel becomes extremely important. Research on tourists' intentions to use cryptocurrencies for payment purposes is limited due to the small number of research studies. To fill this research gap, an exploratory study based on the analysis of survey data was conducted. The purpose of the research is to explore how tourists' behavior in making financial transactions has changed when paying for tourism services, in order to determine their intention to pay in cryptocurrencies. Behavioral intention can be examined as a dependent variable useful for studying the acceptance of blockchain as a cutting-edge technology. Therefore, this study examines the intention of travelers to use cryptocurrencies in electronic payments for tourism services. Several studies have shown that the intention to accept payments in a cryptocurrency is affected by the perceived usefulness of these payments and the perceived ease of use. The findings deepen our understanding of the readiness of service users to adopt blockchain-based payment in the tourism sector. The tourism industry has to focus not only on the technology but also on the consumers who can use cryptocurrencies, creating new possibilities and increasing business competitiveness. Based on the research results, suggestions are made to guide future research on the use of cryptocurrencies by tourists in the tourism industry.
Therefore, in line with the rapid expansion of virtual currency users, market capitalization, and payments in cryptocurrencies, it is necessary to explore the possibilities of implementing a blockchain-based system to promote the use of services in the tourism sector, the sector most affected by the pandemic.

Keywords: behavioral intention, blockchain-based payment, cryptocurrency, tourism

Procedia PDF Downloads 105
24715 Execution Time Optimization of Workflow Network with Activity Lead-Time

Authors: Xiaoping Qiu, Binci You, Yue Hu

Abstract:

The execution time of a workflow network has an important effect on the efficiency of the business process. In this paper, the activity execution time is divided into service time and waiting time, and the lead time is then extracted from the waiting time. The execution time formulas of the three basic structures in the workflow network are derived based on the activity lead time. Taking the process of e-commerce logistics as an example, appropriate lead times are inserted for key activities using a Petri net, and an execution time optimization model is built to minimize the waiting time under time-cost constraints. A solution program written in VC++ 6.0 is then compiled to obtain the optimal solution, which reduces the waiting time of key activities in the workflow and verifies the role of lead time in the timeliness of e-commerce logistics.
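A rough sketch of how lead time can offset waiting time across the three basic workflow structures (sequence, AND-split/join, XOR-split). The formulas below are illustrative assumptions for the sketch, not the paper's derived ones, and the activity values are invented.

```python
def activity_time(service, waiting, lead=0.0):
    """Execution time of one activity: service time plus whatever waiting
    time the pre-issued lead time does not absorb."""
    return service + max(0.0, waiting - lead)

def sequence(activities):
    """Sequential structure: activity times add up."""
    return sum(activity_time(*a) for a in activities)

def parallel(activities):
    """AND-split/join: the slowest branch dominates."""
    return max(activity_time(*a) for a in activities)

def choice(activities, probs):
    """XOR-split: expected time over the branch probabilities."""
    return sum(p * activity_time(*a) for a, p in zip(activities, probs))

# (service, waiting, lead) triples for three hypothetical logistics activities.
acts = [(2.0, 3.0, 1.0), (1.0, 2.0, 0.0), (4.0, 1.0, 1.0)]
print(sequence(acts))  # 11.0
print(parallel(acts))  # 4.0
```

An optimization model in this spirit would then choose the lead values per key activity, subject to the cost of issuing lead time, to minimize total waiting time.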

Keywords: electronic business, execution time, lead time, optimization model, petri net, time workflow network

Procedia PDF Downloads 176
24714 Exploring Suitable SSD Allocation Schemes in Compliance with Workload Patterns

Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim

Abstract:

Whether the data has been well parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelism is affected by the allocation scheme, which is therefore directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. It is therefore hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated several mixed data patterns and analyzed the results to support the right choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive, while dynamic allocation performs best for write performance and random data patterns.
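The contrast between the two schemes can be sketched as two channel-selection policies. The channel count and the queue-depth model below are assumptions for the sketch, not parameters from the paper's simulation.

```python
N_CHANNELS = 4  # assumed number of parallel flash channels

def static_channel(lba):
    """Static allocation: the logical block address fixes the channel, so a
    later read of the same data hits a known channel, which stripes
    sequential reads evenly (good read parallelism)."""
    return lba % N_CHANNELS

def dynamic_channel(queue_depths):
    """Dynamic allocation: send the write to the least busy channel,
    maximizing write parallelism under bursty or random patterns."""
    return min(range(N_CHANNELS), key=lambda c: queue_depths[c])

# Static: sequential LBAs stripe evenly across the four channels.
print([static_channel(lba) for lba in range(8)])  # [0, 1, 2, 3, 0, 1, 2, 3]

# Dynamic: with per-channel queue depths [3, 0, 2, 1], channel 1 is chosen.
print(dynamic_channel([3, 0, 2, 1]))  # 1
```

The trade-off in the abstract falls out of these policies: static placement makes read locations predictable, while dynamic placement adapts to whatever channel is idle at write time.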

Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation

Procedia PDF Downloads 340
24713 Social Data Aggregator and Locator of Knowledge (STALK)

Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat

Abstract:

Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not merely provide a consolidated view of various profiles; instead, it provides consolidated information derived from the subject's posts and other activities. It also allows analysis and analytics across multiple profiles. We strive to provide a query system that returns a natural-language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the use case at hand.

Keywords: social network, analysis, Facebook, LinkedIn, git, big data

Procedia PDF Downloads 444
24712 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates

Authors: Rima Shishakly, Mervyn Misajon

Abstract:

Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all the stakeholders in the educational sector. Efficient data collection, flow, processing, storage and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the flow of students' data and information between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools, clarifies the data integrity issues, and indicates the challenges that face private schools in the UAE.

Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), ministry of education (MOE), (KHDA) the knowledge and human development authority, Abu Dhabi educational counsel (ADEC)

Procedia PDF Downloads 222
24711 X-Ray Detector Technology Optimization in Computed Tomography

Authors: Aziz Ikhlef

Abstract:

Most multi-slice Computed Tomography (CT) scanners are built with detectors composed of scintillator-photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of routing runs and connections required by front-illuminated diodes. In backlit diodes, the electronic noise is already improved because of the reduction of the load capacitance due to the reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging across a large patient population. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, both the medical and regulatory communities have raised significant concerns about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggests that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral or dual-energy CT, in which projection data at two different tube potentials are collected. One approach uses a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples.
In addition, this paper presents an overview of detector technologies and image chain improvements investigated in the last few years to improve the signal-to-noise ratio and the dose efficiency of CT scanners in regular examinations and in energy discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We will go through the properties of the post-patient collimation to improve the scatter-to-primary ratio; the scintillator material properties such as light output, afterglow, primary speed, and crosstalk to improve spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), optimized for crosstalk, noise and temporal/spatial resolution.

Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts

Procedia PDF Downloads 194
24710 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm

Authors: Kamel Belammi, Houria Fatrim

Abstract:

Imbalanced data sets, a problem often found in real-world applications, can have a seriously negative effect on the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, with too few samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, together with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN) and multilayer neural networks (MNN)) to the balanced data set. We also compare the results obtained before and after the balancing method.
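A minimal sketch of a cost-sensitive LMS update in the spirit described above, assuming inverse class frequency as the rule-of-thumb weight; the toy data, learning rate, and epoch count are illustrative, not the paper's.

```python
def cost_sensitive_lms(X, y, costs, lr=0.01, epochs=100):
    """LMS with per-class error weights: errors on the rare class are
    scaled up, so the fit is not dominated by the majority class.
    `costs` maps class label -> error weight."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = sum(wj * xj for wj, xj in zip(w, xi)) + b
            scale = lr * costs[yi] * (yi - pred)  # cost-weighted LMS update
            w = [wj + scale * xj for wj, xj in zip(w, xi)]
            b += scale
    return w, b

# Toy imbalanced 1-D set: four majority-class samples, two minority-class ones.
X = [[0.0], [0.1], [0.2], [0.1], [0.9], [1.0]]
y = [0, 0, 0, 0, 1, 1]
counts = {0: 4, 1: 2}
# Rule-of-thumb weights: inverse class frequency.
costs = {c: len(y) / (2 * n) for c, n in counts.items()}
w, b = cost_sensitive_lms(X, y, costs, lr=0.1, epochs=200)
```

After this weighted fit, thresholding the linear output separates the two classes on the toy data; in the paper's pipeline the rebalanced data would then feed the SVM, KNN and MNN classifiers.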

Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes

Procedia PDF Downloads 532