Search results for: violation data discovery
24457 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning
Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul
Abstract:
In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries of prototypes, or atoms, which avoids the limitations of pre-defined dictionaries. The data-driven dictionaries take the raw ECG signal as input rather than extracted features, and the set of training parameters that yields the most descriptive dictionary is studied. The approach inherently learns the complicated morphological changes in the ECG waveform, which are then used to improve classification. The classification performance was evaluated on ECG data under two preprocessing conditions. In the first, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power-line noise. In the second, the data are further filtered with a fast moving-average smoother. The experimental results on the QT database confirm that the proposed algorithm achieves a classification accuracy of 92%.
Keywords: electrocardiogram, dictionary learning, sparse coding, classification
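As an illustration of the kind of pipeline described above, the sketch below learns one data-driven dictionary per rhythm class and classifies a beat by its smallest sparse-reconstruction error. It is a hedged sketch, not the authors' implementation: the class labels, the 128-sample beat length, the atom count, and the synthetic data are assumptions.

```python
# Hedged sketch: per-class dictionary learning on raw ECG beat segments,
# with classification by smallest sparse-reconstruction error.
import numpy as np
from sklearn.decomposition import DictionaryLearning, SparseCoder

def learn_class_dictionaries(beats_by_class, n_atoms=64):
    """Learn one data-driven dictionary per rhythm class from raw beats."""
    dictionaries = {}
    for label, beats in beats_by_class.items():          # beats: (n_beats, beat_len)
        dl = DictionaryLearning(n_components=n_atoms, transform_algorithm="omp",
                                max_iter=30, random_state=0)
        dictionaries[label] = dl.fit(beats).components_  # (n_atoms, beat_len)
    return dictionaries

def classify_beat(beat, dictionaries, n_nonzero=8):
    """Assign the class whose dictionary reconstructs the beat best."""
    best_label, best_err = None, np.inf
    for label, atoms in dictionaries.items():
        coder = SparseCoder(dictionary=atoms, transform_algorithm="omp",
                            transform_n_nonzero_coefs=n_nonzero)
        code = coder.transform(beat.reshape(1, -1))      # sparse coefficients
        err = np.linalg.norm(beat - (code @ atoms).ravel())
        if err < best_err:
            best_label, best_err = label, err
    return best_label

# Toy usage with synthetic "beats" standing in for filtered QT-database segments
rng = np.random.default_rng(0)
beats_by_class = {"normal": rng.normal(size=(200, 128)),
                  "abnormal": rng.normal(loc=0.5, size=(200, 128))}
dicts = learn_class_dictionaries(beats_by_class)
print(classify_beat(rng.normal(size=128), dicts))
```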
Procedia PDF Downloads 386
24456 Place of Radiotherapy in the Treatment of Intracranial Meningiomas: Experience of the Cancer Center Emir Abdelkader of Oran Algeria
Authors: Taleb L., Benarbia M., Boutira F. M., Allam H., Boukerche A.
Abstract:
Introduction and purpose of the study: Meningiomas are the most common non-glial intracranial tumors in adults, accounting for approximately 30% of all central nervous system tumors. The aim of our study is to determine the epidemiological, clinical, therapeutic, and evolutionary characteristics of a cohort of patients with intracranial meningioma treated with radiotherapy at the Emir Abdelkader Cancer Center in Oran. Material and methods: This is a retrospective study of 44 patients covering the period from 2014 to 2020. The overall survival and relapse-free survival curves were calculated using the Kaplan-Meier method. Results and statistical analysis: The median age of the patients was 49 years [21-76 years], with a clear female predominance (sex ratio = 2.4). The average diagnostic delay was seven months [2 to 24 months], and the circumstances of discovery were dominated by headaches in 54.5% of cases (n=24), visual disturbances in 40.9% (n=18), and motor disorders in 15.9% (n=7). The tumor was located mainly at the base of the skull in 52.3% of patients (n=23), including 29.5% (n=13) in the cavernous sinus, with 27.3% (n=12) at the parasagittal level and 20.5% (n=9) at the convexity. The diagnosis was confirmed surgically in 36 patients (81.8%), whose anatomopathological study was in favor of grades I, II, and III in 40.9%, 29.5%, and 11.4% of cases, respectively. Radiotherapy was indicated postoperatively in 45.5% of patients (n=20), exclusively in 27.3% (n=12), and after tumor recurrence in 27.3% of cases (n=18). The irradiation doses delivered were as follows: 50 Gy (20.5%), 54 Gy (65.9%), and 60 Gy (13.6%). With a median follow-up of 69 months, the probabilities of relapse-free survival and overall survival at three years were 93.2% and 95.4%, respectively, versus 71.2% and 80.7% at five years. Conclusion: Meningiomas are common primary brain tumors. They are most often benign but can also progress aggressively. Their treatment is essentially surgical, but radiotherapy retains its place in specific situations, allowing good tumor control and overall survival.
Keywords: diagnosis, meningioma, surgery, radiotherapy, survival
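For readers unfamiliar with the survival analysis used here, the snippet below shows a minimal Kaplan-Meier estimate with the lifelines library; the follow-up durations and relapse flags are invented placeholders, not the study's patient data.

```python
# Minimal sketch of a Kaplan-Meier relapse-free survival estimate.
from lifelines import KaplanMeierFitter

follow_up_months = [12, 24, 36, 36, 48, 60, 60, 69, 69, 69]   # time to event or censoring
relapse_observed = [0, 0, 1, 0, 0, 1, 0, 0, 0, 0]             # 1 = relapse, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(follow_up_months, event_observed=relapse_observed, label="relapse-free survival")
print(kmf.survival_function_)            # survival probability at each time point
print(kmf.predict(36), kmf.predict(60))  # estimates at 3 and 5 years
```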
Procedia PDF Downloads 101
24455 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data
Authors: Qiuxiao Chen, Yan Hou, Ning Wu
Abstract:
As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, and relatively slow execution, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data is proposed. For each curve (the basic element of linear vector data), the deletion costs of all its middle nodes are calculated, and the minimum deletion cost is compared with a pre-defined threshold. If the former is greater than or equal to the latter, all remaining nodes are retained and the curve's compression is finished. Otherwise, the node with the minimal deletion cost is deleted, the deletion costs of its two neighbors are updated, and the same loop is repeated on the compressed curve until termination. Through several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The experimental results show that DCA outperforms DPA in both compression accuracy and execution efficiency.
Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost
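A minimal sketch of the described loop follows, assuming the deletion cost of a middle node is its perpendicular distance to the chord joining its two current neighbours (the abstract does not fix the exact cost function), and recomputing all costs each pass for brevity where the paper only updates the two neighbours.

```python
import math

def deletion_cost(prev, node, nxt):
    (x1, y1), (x0, y0), (x2, y2) = prev, node, nxt
    dx, dy = x2 - x1, y2 - y1
    chord = math.hypot(dx, dy)
    if chord == 0:
        return math.hypot(x0 - x1, y0 - y1)
    # distance from the node to the line through its two neighbours
    return abs(dy * (x0 - x1) - dx * (y0 - y1)) / chord

def compress_curve(points, threshold):
    pts = list(points)
    while len(pts) > 2:
        costs = [deletion_cost(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        i_min = min(range(len(costs)), key=costs.__getitem__)
        if costs[i_min] >= threshold:      # cheapest deletion is too costly: stop
            break
        del pts[i_min + 1]                 # remove the node with minimal cost
    return pts

curve = [(0, 0), (1, 0.05), (2, 0.1), (3, 2.0), (4, 0.1), (5, 0)]
print(compress_curve(curve, threshold=0.5))   # keeps the significant vertex at (3, 2.0)
```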
Procedia PDF Downloads 252
24454 Multimedia Container for Autonomous Car
Authors: Janusz Bobulski, Mariusz Kubanek
Abstract:
The main goal of the research is to develop a multimedia container structure holding three types of images (RGB, lidar, and infrared), properly calibrated to each other. An additional goal is to develop program libraries for creating, saving, and restoring this type of file, together with a method for synchronizing the data from the lidar, RGB, and infrared cameras. Such a file could be used in autonomous vehicles and would facilitate data processing by the intelligent autonomous vehicle management system. Autonomous cars are increasingly entering public awareness, and few doubt that self-driving cars are the future of motoring; manufacturers promise to move the first of them into showrooms within the next few years. Many experts believe that a network of communicating autonomous cars could eliminate accidents entirely. To make this possible, however, effective methods of detecting objects around the moving vehicle must be developed. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone, so in such situations the system should be supported by information from other sources, such as lidar or infrared cameras. The problem is that these types of devices return different data formats, and there are further problems with synchronizing and formatting these data. The goal of the project is therefore to develop a file structure that can contain different types of data. Such a file is called a multimedia container: a container holding many data streams, which allows complete multimedia material to be stored in one file. The data streams in such a container include images, video, sound, and subtitles, as well as additional information, i.e., metadata. As preliminary studies show, combining RGB and infrared images with lidar data allows for easier data analysis; for example, the distance to an object can be displayed in a color photo. Such information can be very useful for drivers and for systems in autonomous cars.
Keywords: an autonomous car, image processing, lidar, obstacle detection
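A hedged sketch of such a container is given below: one compressed NumPy archive holding a synchronised RGB image, infrared image, lidar point cloud, and shared metadata. The field names, the .npz format, and the calibration entry are illustrative choices, not the authors' file layout.

```python
import json
import numpy as np

def write_container(path, rgb, infrared, lidar_points, timestamp_s, calibration):
    """Pack one synchronised sensor frame plus metadata into a single file."""
    meta = {"timestamp_s": timestamp_s, "calibration": calibration}
    np.savez_compressed(path, rgb=rgb, infrared=infrared,
                        lidar=lidar_points, meta=json.dumps(meta))

def read_container(path):
    """Restore the streams and metadata from the container file."""
    data = np.load(path, allow_pickle=False)
    meta = json.loads(str(data["meta"]))
    return data["rgb"], data["infrared"], data["lidar"], meta

# Toy synchronised frame
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
ir = np.zeros((480, 640), dtype=np.uint16)
cloud = np.random.rand(1000, 4).astype(np.float32)     # x, y, z, intensity
write_container("frame_0001.npz", rgb, ir, cloud, 17.042,
                calibration={"lidar_to_rgb": np.eye(4).tolist()})
print(read_container("frame_0001.npz")[3]["timestamp_s"])
```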
Procedia PDF Downloads 227
24453 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm
Authors: Monojit Manna, Arpan Adhikary
Abstract:
Mobile crowdsensing is an emerging paradigm in which sensing tasks are carried out by users across the globe. In today's urban areas, mobile vehicles are well suited to data sensing and collection because of their ubiquity and mobility. In this work, we focus on selecting the mobile nodes that can collect the maximum amount of data from urban areas and fulfill the required data demand for a future period within a couple of minutes. We formulate this requirement as a data-maximization optimization problem under a recruitment budget. The implementation is set up as a realistic online platform in which vehicles move continuously in real time, and the data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. To look ahead in time, an offline deep learning-based algorithm is proposed to predict vehicle mobility. We then propose a greedy approach that applies an online algorithm to select a subset of vehicles for this NP-complete problem under a limited budget. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The experimental results not only demonstrate the efficiency of the proposed solution but also prove the validity of DLMV and show an improvement in the quantity of sensing data collected compared with other algorithms.
Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection
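The budgeted recruitment step can be illustrated with a simple greedy rule: repeatedly pick the vehicle with the best ratio of newly covered cells to cost. This is an illustrative sketch only; the predicted coverage sets would come from the mobility-prediction model, and the vehicle names, cells, and costs below are made up.

```python
def greedy_recruit(predicted_cells, costs, budget):
    """predicted_cells: {vehicle: set of city cells it is predicted to sense}."""
    chosen, covered, spent = [], set(), 0.0
    remaining = set(predicted_cells)
    while remaining:
        def gain_per_cost(v):
            gain = len(predicted_cells[v] - covered)       # newly covered cells
            return gain / costs[v] if costs[v] > 0 else float("inf")
        best = max(remaining, key=gain_per_cost)
        if spent + costs[best] > budget or not (predicted_cells[best] - covered):
            break                                           # budget exhausted or no gain
        chosen.append(best)
        covered |= predicted_cells[best]
        spent += costs[best]
        remaining.remove(best)
    return chosen, covered, spent

cells = {"taxi_1": {1, 2, 3}, "taxi_2": {3, 4}, "taxi_3": {5, 6, 7, 8}}
print(greedy_recruit(cells, {"taxi_1": 2.0, "taxi_2": 1.0, "taxi_3": 3.0}, budget=5.0))
```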
Procedia PDF Downloads 78
24452 A Strategic Sustainability Analysis of Electric Vehicles in EU Today and Towards 2050
Authors: Sven Borén, Henrik Ny
Abstract:
Ambitions within the EU for moving towards sustainable transport include major emission reductions for fossil fuel road vehicles, especially buses, trucks, and cars, and the electric driveline seems to be an attractive solution for such development. This study first applied the Framework for Strategic Sustainable Development to compare the sustainability effects of today's fossil fuel vehicles with those of electric vehicles powered by batteries or hydrogen fuel cells. The study then addressed a scenario where electric vehicles might be in the majority in Europe by 2050. The Strategic Lifecycle Assessment methodology was used first, in which each life cycle phase was assessed for violations against sustainability principles. This indicates where further analysis could be done in order to quantify the magnitude of each violation, and later to create alternative strategies and actions that lead towards sustainability. A Life Cycle Assessment of combustion engine cars, plug-in hybrid cars, battery electric cars, and hydrogen fuel cell cars was then conducted to compare and quantify environmental impacts. The authors found major violations of sustainability principles, such as the use of fossil fuels, which contributes to emission-related impacts such as climate change, acidification, eutrophication, ozone depletion, and particulate matter. Other violations were found, such as the use of scarce materials for batteries and fuel cells, and, for all vehicles, violations in most life cycle phases when fossil fuel vehicles are used for mining, production, and transport. Still, the current battery and hydrogen fuel cell cars studied have less severe violations than fossil fuel cars. The life cycle assessment revealed that fossil fuel cars have considerably higher overall environmental impacts than electric cars as long as the latter are powered by renewable electricity. By 2050, there will likely be even more sustainable alternatives than the studied electric vehicles, when the EU electricity mix should stem mainly from renewable sources, batteries should be recycled, fuel cells should be a mature technology for use in vehicles (containing no scarce materials), and electric drivelines should have replaced combustion engines in other sectors. An uncertainty for fuel cells in 2050 is whether the production of hydrogen will have had time to switch to renewable resources; if so, that would contribute even more to sustainable development. Besides being adopted in the GreenCharge roadmap, the authors suggest that the results can contribute to planning for a sustainable increase of EVs in Europe in the upcoming decades, and potentially serve as an inspiration for other smaller or larger regions. Further studies could map the environmental effects in the LCA in more detail and include other road vehicles to obtain a more precise picture of how much they could affect sustainable development.
Keywords: strategic, electric vehicles, sustainability, LCA
Procedia PDF Downloads 387
24451 In silico Subtractive Genomics Approach for Identification of Strain-Specific Putative Drug Targets among Hypothetical Proteins of Drug-Resistant Klebsiella pneumoniae Strain 825795-1
Authors: Umairah Natasya Binti Mohd Omeershffudin, Suresh Kumar
Abstract:
Klebsiella pneumoniae is a Gram-negative enteric bacterium that causes nosocomial and urinary tract infections. Of particular concern is the global emergence of multidrug-resistant (MDR) strains of Klebsiella pneumoniae. Characterization of antibiotic resistance determinants at the genomic level plays a critical role in understanding, and potentially controlling, the spread of MDR pathogens. In this study, the drug-resistant Klebsiella pneumoniae strain 825795-1 was investigated with extensive computational approaches aimed at identifying novel drug targets among its hypothetical proteins. We analyzed the 1099 hypothetical proteins available in the genome and used an in-silico genome subtraction methodology to identify potential, pathogen-specific drug targets against Klebsiella pneumoniae. Bioinformatics tools were employed to subtract the strain-specific paralogous and host-specific homologous sequences from the bacterial proteome. The resulting 645 proteins were further refined to identify the essential genes in the pathogenic bacterium using the Database of Essential Genes (DEG). We found 135 unique essential proteins in the target proteome that could be utilized as novel targets for designing new drugs. Through sub-cellular localization prediction, 49 of these were identified as cytoplasmic proteins and retained as potential drug targets. Finally, we investigated these proteins in the DrugBank database, and 11 of the unique essential proteins showed druggability according to the FDA-approved DrugBank entries, with diverse broad-spectrum properties. The results of this study will facilitate the discovery of new drugs against Klebsiella pneumoniae.
Keywords: pneumonia, drug target, hypothetical protein, subtractive genomics
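The subtraction chain itself reduces to set operations once the individual screens have been run; the hedged sketch below assumes the paralog, human-homolog, essential-gene, localisation, and DrugBank hits are precomputed with the usual tools (e.g. CD-HIT, BLASTp, DEG, PSORTb), and uses made-up protein identifiers.

```python
def subtractive_filter(hypothetical, paralogs, human_homologs,
                       essential_hits, cytoplasmic, druggable):
    non_paralogous   = hypothetical - paralogs          # drop strain-specific paralogs
    non_host_homolog = non_paralogous - human_homologs  # drop host-homologous proteins
    essential        = non_host_homolog & essential_hits
    cytoplasmic_set  = essential & cytoplasmic
    drug_targets     = cytoplasmic_set & druggable
    return {"after paralog/host subtraction": non_host_homolog,
            "essential": essential,
            "cytoplasmic": cytoplasmic_set,
            "druggable": drug_targets}

proteins = {f"KP_{i:04d}" for i in range(1, 11)}
result = subtractive_filter(
    hypothetical=proteins,
    paralogs={"KP_0001"},
    human_homologs={"KP_0002", "KP_0003"},
    essential_hits={"KP_0004", "KP_0005", "KP_0006"},
    cytoplasmic={"KP_0004", "KP_0005"},
    druggable={"KP_0005"})
for stage, ids in result.items():
    print(stage, sorted(ids))
```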
Procedia PDF Downloads 177
24450 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations
Authors: Ramon Santana
Abstract:
The use of biometric identifiers in information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. In the general architecture of a biometric system, eight vulnerabilities have been identified, six of which allow a minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models for protecting minutiae templates have been proposed, yet vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of the data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; this transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating the known vulnerabilities of previously proposed models, basing its security on the impossibility of polynomial reconstruction.
Keywords: fingerprint, template protection, bio-cryptography, minutiae protection
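The two levels can be illustrated as follows; this is not the author's exact construction. Level 1 derives rotation- and translation-invariant (and irreversible) features from minutiae pairs, and level 2 evaluates a key-dependent polynomial over them. The feature choice, the hashing of the key into coefficients, and the polynomial degree are assumptions made for the sketch.

```python
import hashlib
import itertools
import math

def invariant_features(minutiae):
    """minutiae: list of (x, y, theta). Pairwise distance and angle difference
    are unchanged by rotating or translating the whole fingerprint."""
    feats = []
    for (x1, y1, t1), (x2, y2, t2) in itertools.combinations(minutiae, 2):
        dist = math.hypot(x2 - x1, y2 - y1)
        dang = (t2 - t1) % (2 * math.pi)
        feats.append((round(dist, 1), round(dang, 2)))
    return feats

def protect(features, key, prime=2_147_483_647):
    """Level 2: evaluate a polynomial whose coefficients are derived from the key."""
    coeffs = [int.from_bytes(hashlib.sha256(key + bytes([i])).digest()[:4], "big")
              for i in range(4)]
    protected = []
    for dist, dang in features:
        x = int(dist * 10) * 1000 + int(dang * 100)        # quantise the feature pair
        y = sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
        protected.append(y)
    return sorted(protected)

minutiae = [(10.0, 20.0, 0.3), (45.5, 12.0, 1.1), (30.2, 60.8, 2.7)]
template = protect(invariant_features(minutiae), key=b"user-secret-key")
print(template)
```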
Procedia PDF Downloads 170
24449 Improving Digital Data Security Awareness among Teacher Candidates with Digital Storytelling Technique
Authors: Veysel Çelik, Aynur Aker, Ebru Güç
Abstract:
Developments in information and communication technologies have increased both the speed of producing information and the speed of accessing new information, and the daily lives of individuals have started to change accordingly. New concepts such as e-mail, e-government, e-school, and e-signature have emerged. For this reason, prospective teachers, who will be future teachers or school administrators, are expected to have a high awareness of digital data security. The aim of this study is to reveal the effect of the digital storytelling technique on the data security awareness of pre-service teachers in computer and instructional technology education departments. For this purpose, participants were selected on the principle of volunteering among third-grade students studying in the Computer and Instructional Technologies Department of the Faculty of Education at Siirt University. The pretest/posttest quasi-experimental research model, one of the experimental research designs, was used. Within this framework, a 6-week lesson plan on digital data security awareness was prepared in accordance with the digital storytelling technique. Students in the experimental group formed groups of 3-6 people among themselves, and each group was asked to prepare short videos or animations on digital data security awareness. The completed videos were watched and evaluated together with the prospective teachers during the evaluation process, which lasted approximately 2 hours. Both quantitative and qualitative data collection tools were used: the digital data security awareness scale and a semi-structured interview form consisting of open-ended questions developed by the researchers. According to the data obtained, the digital storytelling technique was effective in creating data security awareness and producing lasting behavior changes in computer and instructional technology students.
Keywords: digital storytelling, self-regulation, digital data security, teacher candidates, self-efficacy
Procedia PDF Downloads 127
24448 A Remote Sensing Approach to Calculate Population Using Roads Network Data in Lebanon
Authors: Kamel Allaw, Jocelyne Adjizian Gerard, Makram Chehayeb, Nada Badaro Saliba
Abstract:
In developing countries such as Lebanon, demographic data are hardly available due to the absence of a mechanized population registration system. The aim of this study is to evaluate, using only remote sensing data, the correlations between population counts and the characteristics of the road network (length of primary roads, length of secondary roads, total length of roads, road density, percentage of roads, and number of intersections). In order to find the influence of the different factors on the demographic data, we studied the degree of correlation between each factor and the population count. The results of this study show a strong correlation between population and both road density and the number of intersections.
Keywords: population, road network, statistical correlations, remote sensing
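The analysis reduces to computing, for each road-network characteristic, its correlation with the population count, as in the minimal sketch below (the numbers are placeholders, not the Lebanese data).

```python
import pandas as pd

df = pd.DataFrame({
    "population":        [12000, 35000, 8000, 60000, 22000, 41000],
    "primary_road_km":   [4.1, 10.2, 2.5, 18.0, 6.3, 12.8],
    "secondary_road_km": [9.0, 24.5, 5.1, 40.2, 14.8, 27.3],
    "road_density":      [1.2, 3.1, 0.8, 4.9, 2.0, 3.5],
    "intersections":     [35, 110, 20, 190, 65, 130],
})

# Pearson correlation of every road-network factor with population
print(df.corr()["population"].drop("population").round(3))
```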
Procedia PDF Downloads 163
24447 A Multicopy Strategy for Improved Security Wireless Sensor Network
Authors: Tuğçe Yücel
Abstract:
A Wireless Sensor Network (WSN) is a collection of sensor nodes deployed randomly in an area for surveillance. Efficient utilization of the sensors' limited battery energy for increased network lifetime, as well as data security, are major design objectives for WSNs; moreover, the sensed data must be transmitted securely to a base station for further processing. Producing multiple copies of data packets and sending them on different paths is one strategy for this purpose, but it leads to redundant energy consumption and hence reduced network lifetime. In this work, we develop a restricted multi-copy multipath strategy in which data moving through 'frequently' or 'heavily' used sensors are copied by the sensors incident to such central nodes and sent on node-disjoint paths. We develop a mixed integer programming (MIP) model and a heuristic approach, and present some preliminary test results.
Keywords: MIP, sensor, telecommunications, WSN
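A much-simplified MIP in this spirit can be written with PuLP: choose a minimum-energy set of candidate paths that delivers at least K copies while keeping the selected paths node-disjoint. The path sets, energies, and K below are invented, and the paper's actual model is richer.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

paths = {                       # path id -> (intermediate nodes, energy cost)
    "p1": ({"s2", "s3"}, 5.0),
    "p2": ({"s4"}, 3.0),
    "p3": ({"s3", "s5"}, 4.0),
    "p4": ({"s6", "s7"}, 6.0),
}
K = 2                           # required number of copies at the base station

prob = LpProblem("multicopy_multipath", LpMinimize)
x = {p: LpVariable(f"use_{p}", cat=LpBinary) for p in paths}

prob += lpSum(paths[p][1] * x[p] for p in paths)                 # total energy
prob += lpSum(x[p] for p in paths) >= K                          # deliver K copies
all_nodes = set().union(*(nodes for nodes, _ in paths.values()))
for n in all_nodes:                                              # node-disjointness
    prob += lpSum(x[p] for p in paths if n in paths[p][0]) <= 1

prob.solve(PULP_CBC_CMD(msg=False))
print([p for p in paths if x[p].value() == 1])                   # e.g. ['p2', 'p3']
```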
Procedia PDF Downloads 512
24446 The Nexus between Migration and Human Security: The Case of Ethiopian Female Migration to Sudan
Authors: Anwar Hassen Tsega
Abstract:
International labor migration is an integral part of the modern globalized world, although the phenomenon has its roots in earlier periods of human history. This paper discusses the relatively new phenomenon of female migration in Africa. In the past, African women migrants were only spouses or dependent family members, but as modernity swept most African societies, with rising unemployment rates, there is evidence everywhere in Africa that female labor migration is a growing phenomenon that deserves to be understood in the context of human security research. This work explores these issues further, focusing on the experience of Ethiopian women labor migrants in Sudan. The migration of Ethiopians to Sudan is historical; nevertheless, labor migration mainly started with the discovery and subsequent exploration of oil in Sudan. While the paper is concerned with the human security aspects of the migrant workers, we need to be certain that the migration process provides a decent wage, good working conditions, the necessary social security coverage, and labor protection as a whole. However, migration to Sudan is not always safe, and female migrants become subject to violence at the hands of brokers, employers, and migration officials. The paper therefore argues that identifying the vulnerable stages and the major problems facing female migrant workers at various stages of migration is a prerequisite for combating the problem and securing the lives of the migrant workers. The major problems female migrants face include heightened gender-based violence, underpayment, various forms of verbal, physical, and sexual abuse, and other forms of torture, including beatings and slaps. This peculiar situation can be attributed to the fact that most of these women are irregular migrants and fall under the category of unskilled and/or illiterate migrants.
Keywords: Ethiopia, human security, labor migration, Sudan
Procedia PDF Downloads 251
24445 Wikipedia World: A Computerized Process for Cultural Heritage Data Dissemination
Authors: L. Rajaonarivo, M. N. Bessagnet, C. Sallaberry, A. Le Parc Lacayrelle, L. Leveque
Abstract:
TCVPYR is a European FEDER (European Regional Development Fund) project which aims to promote tourism in the French Pyrenees region by leveraging its cultural heritage. It involves scientists from various domains (geographers, historians, anthropologists, computer scientists...). This paper presents a fully automated process to publish any dataset as Wikipedia articles as well as the corresponding linked information on Wikidata and Wikimedia Commons. We validate this process on a sample of geo-referenced cultural heritage data collected by TCVPYR researchers in different regions of the Pyrenees. The main result concerns the technological prerequisites, which are now in place. Moreover, we demonstrated that we can automatically publish cultural heritage data on Wikimedia.
Keywords: cultural heritage dissemination, digital humanities, open data, Wikimedia automated publishing
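A hedged sketch of the article-publishing step with pywikibot is shown below; the page title, wikitext template, and target wiki are illustrative assumptions, and the real TCVPYR pipeline additionally pushes structured statements to Wikidata and media files to Wikimedia Commons.

```python
import pywikibot

def publish_record(record, site_code="en"):
    """Turn one geo-referenced heritage record into a wiki page."""
    site = pywikibot.Site(site_code, "wikipedia")
    page = pywikibot.Page(site, record["name"])
    page.text = (
        f"'''{record['name']}''' is a cultural heritage site in "
        f"{record['region']}, located at {record['lat']}, {record['lon']}.\n\n"
        f"== Description ==\n{record['description']}\n"
    )
    page.save(summary="Automated publication of cultural heritage data")

record = {"name": "Example heritage chapel (sandbox)", "region": "Pyrenees",
          "lat": 42.95, "lon": 0.14, "description": "Example description."}
publish_record(record)
```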
Procedia PDF Downloads 127
24444 Increasing Creativity in Virtual Learning Space for Developing Creative Cities
Authors: Elham Fariborzi, Hoda Anvari Kazemabad
Abstract:
Today, ICT plays an important role in all matters and affects the development of creative cities. The virtual spaces offered by this technology are used in particular for concepts such as smart schools, virtual universities, web-based training, and virtual classrooms that run in parallel with traditional teaching. Nowadays, educational systems in different countries, such as Iran, are changing and starting to increase creativity in the learning environment, which will contribute to the development of innovative ideas and thinking among the people in this environment; such opportunities might lead to scientific discovery and development. Creativity here means the ability to generate numerous new and suitable ideas and solutions for solving the problems of individuals and society, real or virtual, which can play a significant role in developing today's creative physical cities or tomorrow's virtual ones. The purpose of this paper is to study strategies for increasing creativity in virtual learning in order to develop a creative city. A citation/library study was used. The full description given in the text includes how to create and enhance learning creativity in a virtual classroom by reflecting on performance and progress, attending to self-directed learning guidelines, and making efficient use of social networks, systematic discussion groups, and targeted controls by the factors involved, all of which may be effective in the teaching process with regard to creativity. Moreover, when a virtual classroom is created, the style of the class should formally recognize creativity, and the use of a common model of creative thinking between student and teacher is effective for solving the problems of the virtual classroom. It is recommended that virtual education authorities in Iran give special consideration to the virtual curriculum in order to increase creativity in educational content and in such classes, so that Iran's cities become more creative.
Keywords: virtual learning, creativity, e-learning, bioinformatics, biomedicine
Procedia PDF Downloads 363
24443 Data-Driven Market Segmentation in Hospitality Using Unsupervised Machine Learning
Authors: Rik van Leeuwen, Ger Koole
Abstract:
Within hospitality, marketing departments use segmentation to create tailored strategies that ensure personalized marketing. This study provides a data-driven approach by segmenting guest profiles via hierarchical clustering based on an extensive set of features. The industry requires understandable outcomes that give marketing departments the adaptability to make data-driven decisions and ultimately drive profit. A marketing department specified a business question that guides the unsupervised machine learning algorithm. Features of guests change over time; therefore, there is a probability that guests transition from one segment to another. The purpose of the study is to provide the steps in the process from raw data to actionable insights, which serve as a guideline for how hospitality companies can adopt an algorithmic approach.
Keywords: hierarchical cluster analysis, hospitality, market segmentation
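A minimal sketch of the clustering step, assuming standardised guest-profile features and four segments (both assumptions, not the study's configuration), could look like this:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
guests = np.column_stack([
    rng.gamma(2.0, 120, 500),    # average daily rate paid
    rng.poisson(2.5, 500),       # nights per stay
    rng.integers(0, 30, 500),    # booking lead time (days)
    rng.integers(0, 2, 500),     # leisure (0) vs business (1)
]).astype(float)

X = StandardScaler().fit_transform(guests)
Z = linkage(X, method="ward")                      # build the dendrogram
segments = fcluster(Z, t=4, criterion="maxclust")  # cut it into 4 segments

for s in np.unique(segments):
    print(f"segment {s}: {np.sum(segments == s)} guests, "
          f"mean rate {guests[segments == s, 0].mean():.0f}")
```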
Procedia PDF Downloads 108
24442 Geographic Information System for Simulating Air Traffic By Applying Different Multi-Radar Positioning Techniques
Authors: Amara Rafik, Mostefa Belhadj Aissa
Abstract:
Radar data are one of the many data sources used by Air Traffic Management (ATM) systems. These data come from air navigation radar antennas, which intercept the signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit them to the ATM system. For greater reliability, these radars are positioned so that their coverage areas overlap; an aircraft will therefore be detected by at least one of these radars. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. The ATM system must therefore calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e., a geographical database combined with geographical processing. The objective of this work is to propose a GIS for traffic simulation that reconstructs the evolution of aircraft positions over time from a multi-source radar data set by applying these different techniques.
Keywords: ATM, GIS, radar data, simulation
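One simple way to merge the overlapping radars' plots into a single track point is an inverse-variance weighted average, sketched below with made-up positions and accuracies; operational systems would also align timestamps and coordinate frames first, and the paper compares several such techniques.

```python
import numpy as np

def fuse_plots(plots):
    """plots: list of (position_xy, variance) reported by different radars."""
    positions = np.array([p for p, _ in plots], dtype=float)
    weights = np.array([1.0 / v for _, v in plots])     # more accurate radar weighs more
    return (weights[:, None] * positions).sum(axis=0) / weights.sum()

plots = [((102.40, 37.15), 0.09),   # radar A: most accurate
         ((102.55, 37.02), 0.25),   # radar B
         ((102.31, 37.22), 0.40)]   # radar C: least accurate
print(fuse_plots(plots))            # single radar-track position
```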
Procedia PDF Downloads 119
24441 Exploring Gaming-Learning Interaction in MMOG Using Data Mining Methods
Authors: Meng-Tzu Cheng, Louisa Rosenheck, Chen-Yen Lin, Eric Klopfer
Abstract:
The purpose of the research is to explore some of the ways in which gameplay data can be analyzed to yield results that feed back into the learning ecosystem. Back-end data for all users of an MMOG, The Radix Endeavor, were collected as they played, and this study reports the analyses of a specific genetics quest using data mining techniques, including the decision tree method. The study revealed different reasons for quest failure between participants who eventually succeeded and those who never succeeded. Regarding in-game tool use, the trait examiner was a key tool in the quest completion process. Subsequently, the results of the decision tree showed that a lack of trait examiner usage can be made up for with additional Punnett square use, displaying multiple pathways to success in this quest. The methods of analysis used in this study and the resulting usage patterns indicate some useful ways in which gameplay data can provide insights in two main areas. The first is for game designers, to learn how players are interacting with and learning from their game. The second is for the players themselves as well as their teachers, to get information on how they are progressing through the game and to provide any help they may need based on strategies and misconceptions identified in the data.
Keywords: MMOG, decision tree, genetics, gaming-learning interaction
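A hedged sketch of the decision-tree step is shown below: it predicts quest success from per-player tool-usage counts, with synthetic data wired to mirror the reported trade-off between trait examiner and Punnett square use; it is not the study's actual analysis.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
trait_examiner_uses = rng.integers(0, 6, 300)
punnett_square_uses = rng.integers(0, 6, 300)
# Synthetic rule mirroring the reported finding: either tool can lead to success
succeeded = ((trait_examiner_uses >= 2) | (punnett_square_uses >= 4)).astype(int)

X = np.column_stack([trait_examiner_uses, punnett_square_uses])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, succeeded)
print(export_text(tree, feature_names=["trait_examiner", "punnett_square"]))
```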
Procedia PDF Downloads 358
24440 From Two-Way to Multi-Way: A Comparative Study for Map-Reduce Join Algorithms
Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy
Abstract:
Map-Reduce is a programming model widely used to extract valuable information from enormous volumes of data, and it is designed to support heterogeneous datasets. Apache Hadoop MapReduce is used extensively to uncover hidden patterns in tasks such as data mining and SQL-style processing. The most important operation for data analysis is the join operation, but the MapReduce framework does not directly support join algorithms. This paper explains and compares two-way and multi-way MapReduce join algorithms; we also implement the MR join algorithms and show the performance of each phase. Our experimental results show that, among the two-way join algorithms, map-side join and map-merge join take the longest time owing to the preprocessing step of sorting the data, while among the multi-way join algorithms, the reduce-side cascade join takes the longest time.
Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu
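To make the join mechanics concrete, the plain-Python simulation below walks through a two-way reduce-side join: the map phase tags each record with its source table, the shuffle groups records by join key, and the reduce phase emits the joined pairs. The toy tables are illustrative, not the benchmark data.

```python
from collections import defaultdict
from itertools import product

orders = [("u1", "order_100"), ("u2", "order_101"), ("u1", "order_102")]
users  = [("u1", "Alice"), ("u2", "Bob"), ("u3", "Carol")]

# Map phase: emit (join_key, (table_tag, value))
mapped = [(k, ("orders", v)) for k, v in orders] + \
         [(k, ("users", v)) for k, v in users]

# Shuffle phase: group tagged records by join key
groups = defaultdict(list)
for key, tagged in mapped:
    groups[key].append(tagged)

# Reduce phase: cross-product of the two tagged lists per key
for key, tagged in sorted(groups.items()):
    left  = [v for t, v in tagged if t == "orders"]
    right = [v for t, v in tagged if t == "users"]
    for o, u in product(left, right):
        print(key, o, u)            # u3 never prints: no matching order
```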
Procedia PDF Downloads 490
24439 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems
Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas
Abstract:
This research aims at developing an approach for more effective management of freight delivery and transportation processes. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly monitoring driver work have become problematic. In some situations, the number of vehicles passing a given point per time unit can be evaluated for drivers. The collected data are mainly used to establish new trips, and the data flow is more complex in urban areas, where the movement of freight is reported in detail, including information at street level. When traffic density is extremely high, as in congestion, and traffic speed is very low, data transmission reaches its peak. Different data sets are generated depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks, and mode-based delivery networks; the last includes different modes, in particular railways and other networks. When freight delivery is switched from one of the above network types to another, more data may be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers that includes the assessment of the multi-component infrastructure needed for freight delivery according to the network type. Such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting. The results obtained show the potential of the proposed methodological approach to support management and decision-making processes by incorporating networking specifics and helping to minimize overloads in data reporting.
Keywords: transportation networks, freight delivery, data flow, monitoring, e-services
Procedia PDF Downloads 129
24438 Regulating Issues concerning Data Protection in Cloud Computing: Developing a Saudi Approach
Authors: Jumana Majdi Qutub
Abstract:
Rationale: Cloud computing has developed rapidly over the past few years. Because of the importance of protecting personal data used in cloud computing, the role of data protection in promoting trust and confidence in users' data has become an important policy priority. This research examines key regulatory challenges raised by the growing use and importance of cloud computing, focusing on the protection of individuals' personal data. Methodology: The study describes and analyzes governance challenges facing policymakers and industry in Saudi Arabia, with an account of anticipated governance responses. The aim of the research is to describe and define the regulatory challenges of cloud computing for policymaking in Saudi Arabia and to compare them with the compliance issues raised in respect of data transferred to EU member states. In addition, it discusses information privacy issues. Finally, the research proposes policy recommendations that would resolve concerns surrounding the privacy and effectiveness of cloud computing frameworks for data protection. Results: There is still no clear regulation in Saudi Arabia that specifically addresses cloud computing, nor are there specialized regulations on transferring data internationally and locally. Decision makers need to review the applicable laws in Saudi Arabia that protect information in cloud computing, from both an international and a local perspective, in order to identify all the requirements surrounding this area. It is important to educate cloud computing users about the value of their information and their rights before putting it in the cloud, in order to avoid further legal complications; for example, an educational program could discourage giving personal information to a bank employee. With the many kinds of cloud computing services available, it is therefore important to have them covered by the law in all aspects.
Keywords: cloud computing, cyber crime, data protection, privacy
Procedia PDF Downloads 262
24437 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran
Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh
Abstract:
Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use existing data and the results of comparing the units under investigation to obtain an estimate of their performance. The Malmquist Productivity Index (MPI) is an important index in the evaluation of overall productivity, as it considers technological development and technical efficiency at the same time. This article proposes a model based on the multistage MPI that accounts for limited data using Grey's theory. The model can evaluate the performance of units using limited and uncertain data in a multistage process. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which contains uncertain data, to evaluate the performance of its actors. The results of solving the model showed an improvement in the accuracy of estimating the future performance of the units under investigation when Grey's system theory was used. This model can be used in any case study in which the MPI is applied and the data are limited or uncertain.
Keywords: Malmquist Index, Grey's Theory, CCR Model, network data envelopment analysis, Iran electricity power chain
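For reference, the standard two-period, output-oriented Malmquist Productivity Index on which such models build can be written as below, with D^t denoting the DEA distance functions; the paper's multistage, grey-data formulation extends this definition.

```latex
\[
  \mathrm{MPI}_{t}^{\,t+1}
  = \left[
      \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2}
\]
```

A value greater than one indicates productivity growth, and the index factors into an efficiency-change term and a technical-change term.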
Procedia PDF Downloads 166
24436 Cloud Shield: Model to Secure User Data While Using Content Delivery Network Services
Authors: Rachna Jain, Sushila Madan, Bindu Garg
Abstract:
Cloud computing has become a key powerhouse in numerous organizations as they shift their data to the cloud environment. In recent years, cloud-based services have been used on a large scale for content storage, distribution, and processing. Various issues observed in the cloud computing environment need to be addressed, and security and privacy are the topmost areas of concern. In this paper, a novel security model is proposed to secure data while utilizing CDN services such as image-to-icon conversion. A CDN service is a content delivery service that converts, for example, an image to an icon, Word to PDF, or LaTeX to PDF. The presented model converts an image into an icon while keeping the image secret. Security is imparted to the image so that it can be encrypted and decrypted by the data owners only. The paper also discusses how the server performs multiplication and selection on encrypted data without decryption. The data can be an image file, a word file, or an audio or video file. Moreover, the proposed model is capable of multiplying images, encrypting them, and sending them to a server application for conversion. The prime objective is thus to encrypt an image and convert the encrypted image to an image icon by utilizing homomorphic encryption.
Keywords: cloud computing, user data security, homomorphic encryption, image multiplication, CDN service
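How a server can multiply values it cannot read is easiest to see with textbook RSA, which is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The toy key and pixel values below are for illustration only and are not the paper's scheme; a deployment would use a proper homomorphic-encryption library and padding.

```python
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)          # n = 3233, phi = 3120
e, d = 17, 2753                             # e*d = 1 (mod phi)

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

pixel_a, pixel_b = 12, 25                   # two secret values (product < n)
ct_a, ct_b = encrypt(pixel_a), encrypt(pixel_b)

ct_product = (ct_a * ct_b) % n              # server multiplies ciphertexts only
assert decrypt(ct_product) == pixel_a * pixel_b
print(decrypt(ct_product))                  # 300, recovered by the data owner
```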
Procedia PDF Downloads 336
24435 A Proteomic Approach for Discovery of Microbial Cellulolytic Enzymes
Authors: M. S. Matlala, I. Ignatious
Abstract:
Environmental sustainability has taken center stage in human life all over the world. Energy is the most essential component of our lives, but conventional sources of energy are non-renewable and have a detrimental environmental impact. Therefore, there is a need to move from conventional to non-conventional, renewable energy sources to satisfy the world's energy demands. The study aimed at screening for microbial cellulolytic enzymes using a proteomic approach. The objectives were to screen for microbial cellulases with high specific activity and to separate the cellulolytic enzymes using a combination of zymography and two-dimensional (2-D) gel electrophoresis, followed by tryptic digestion, Matrix-Assisted Laser Desorption Ionisation-Time of Flight (MALDI-TOF) analysis, and bioinformatics analysis. Fungal and bacterial isolates were cultured in M9 minimal and Mandel media for a period of 168 hours at 60°C and 30°C, with cellobiose and Avicel as carbon sources. Microbial cells were separated from the supernatants through centrifugation, and the crude enzyme from the cultures was used for the determination of cellulase activity, zymography, SDS-PAGE, and two-dimensional gel electrophoresis. Five isolates with lytic action on the carbon sources studied were a bacterial strain (BARK) and fungal strains (VCFF1, VCFF14, VCFF17, and VCFF18). Peak cellulase production by the selected isolates was found to be 3.8 U/ml, 2.09 U/ml, 3.38 U/ml, 3.18 U/ml, and 1.95 U/ml, respectively. Two-dimensional gel protein maps allowed the separation and quantitative assessment of the different proteins expressed by the microbial isolates. MALDI-TOF analysis and database searching showed that the proteins expressed in this study are closely related to glycoside hydrolases produced by other microbial species, with an acceptable confidence level of 100%.
Keywords: cellulases, energy, two-dimensional gel electrophoresis, matrix-assisted laser desorption ionisation-time of flight, MALDI-TOF MS
Procedia PDF Downloads 134
24434 Data Mining Approach: Classification Model Evaluation
Authors: Lubabatu Sada Sodangi
Abstract:
The rapid growth in the exchange and accessibility of information via the internet leads many organisations to acquire data on their own operations. The aim of data mining is to analyse the different behaviours within a dataset through observation, although the subset of the dataset being analysed may not display all the behaviours and relationships of the entire dataset and, therefore, may not represent other parts of it. A range of techniques is used in data mining to determine the hidden or unknown information in datasets. In this paper, the performance of two algorithms, Chi-Square Automatic Interaction Detection (CHAID) and the multilayer perceptron (MLP), is compared using the Adult dataset to find out the percentage of adults who earn > 50K and of those who earn <= 50K per year. The two algorithms were studied and compared using IBM SPSS Statistics software. The result for CHAID shows that the most important predictors are relationship and education: those who are married (husband), hold a Bachelor's, Master's, Doctorate, or Prof-school qualification, and are aged between 41 and 57 earn > 50K. The multilayer perceptron identifies marital status and capital gain as the most important predictors of income: individuals whose capital gain is less than 6,849 and who are single, separated, or widowed earn <= 50K, whereas individuals whose capital gain is greater than 6,849, who work more than 35 hours per week, and who are older than 27 years earn > 50K. Comparing the two algorithms shows that both are reliable, but CHAID is the more reliable of the two, clearly showing that relationship and education contribute to the prediction, as displayed in the data visualisation.
Keywords: data mining, CHAID, multi-layer perceptron, SPSS, Adult dataset
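A sketch of such a comparison is given below. scikit-learn has no CHAID implementation, so a CART decision tree stands in for it, and the records are synthetic stand-ins for the Adult dataset; only the overall workflow is meant to carry over, not the reported results.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 2000
age           = rng.integers(17, 80, n)
education_num = rng.integers(1, 17, n)
capital_gain  = rng.choice([0, 0, 0, 5000, 8000, 15000], n)
hours_week    = rng.integers(10, 80, n)
X = np.column_stack([age, education_num, capital_gain, hours_week]).astype(float)
y = ((capital_gain > 6849) | ((education_num >= 13) & (age > 41))).astype(int)  # >50K flag

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                                  random_state=0)).fit(X_tr, y_tr)
print("tree accuracy:", round(tree.score(X_te, y_te), 3))
print("MLP accuracy: ", round(mlp.score(X_te, y_te), 3))
```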
Procedia PDF Downloads 378
24433 Developing an Information Model of Manufacturing Process for Sustainability
Authors: Jae Hyun Lee
Abstract:
Manufacturing companies use life-cycle inventory databases to analyze the sustainability of their manufacturing processes. Life-cycle inventory data provide reference values that may not be accurate for a specific company, yet collecting accurate data on a specific company's manufacturing processes requires enormous time and effort. An information model of typical manufacturing processes can reduce the time and effort needed to obtain appropriate reference data for a specific company. This paper presents an attempt to build an abstract information model that can be used to develop information models for specific manufacturing processes.
Keywords: process information model, sustainability, OWL, manufacturing
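As a hedged illustration of what such an abstract model might contain, the sketch below expresses a manufacturing process with input, output, and emission flows as Python dataclasses rather than the OWL ontology used in the paper; all class and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ResourceFlow:
    name: str            # e.g. "electricity", "cutting fluid"
    quantity: float
    unit: str            # e.g. "kWh", "kg"

@dataclass
class ManufacturingProcess:
    name: str                                  # e.g. "face milling"
    inputs: List[ResourceFlow] = field(default_factory=list)
    outputs: List[ResourceFlow] = field(default_factory=list)
    emissions: List[ResourceFlow] = field(default_factory=list)

    def total_energy_kwh(self) -> float:
        return sum(f.quantity for f in self.inputs if f.unit == "kWh")

milling = ManufacturingProcess(
    name="face milling",
    inputs=[ResourceFlow("electricity", 1.8, "kWh"),
            ResourceFlow("cutting fluid", 0.05, "kg")],
    outputs=[ResourceFlow("machined part", 1, "piece")],
    emissions=[ResourceFlow("CO2-eq", 0.9, "kg")])
print(milling.total_energy_kwh())
```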
Procedia PDF Downloads 430
24432 An Interpretable Data-Driven Approach for the Stratification of the Cardiorespiratory Fitness
Authors: D.Mendes, J. Henriques, P. Carvalho, T. Rocha, S. Paredes, R. Cabiddu, R. Trimer, R. Mendes, A. Borghi-Silva, L. Kaminsky, E. Ashley, R. Arena, J. Myers
Abstract:
The exploration of clinically relevant predictive models continues to be an important pursuit. Cardiorespiratory fitness (CRF) conveys vital clinical information, and as such its accurate prediction is of high importance. Therefore, the aim of the current study was to develop a data-driven model, based on computational intelligence techniques and, in particular, clustering approaches, to predict CRF. Two prediction models were implemented and compared: 1) the traditional Wasserman/Hansen equations; and 2) an interpretable clustering approach. Data used for this analysis were from the 'FRIEND - Fitness Registry and the Importance of Exercise: The National Data Base'; in the present study, a subset of 10690 apparently healthy individuals was utilized. The accuracy of the models was assessed through the computation of sensitivity, specificity, and geometric mean values. The results show the superiority of the clustering approach in the accurate estimation of CRF (i.e., maximal oxygen consumption).
Keywords: cardiorespiratory fitness, data-driven models, knowledge extraction, machine learning
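An illustrative sketch of a clustering-based CRF estimator follows: cluster the training records on simple predictors and return the cluster's mean measured VO2max for a new individual. The synthetic data, feature set, and number of clusters are assumptions, not the FRIEND registry or the study's final model.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 1000
age  = rng.uniform(20, 75, n)
male = rng.integers(0, 2, n)
bmi  = rng.uniform(18, 35, n)
# Synthetic measured CRF (ml/kg/min) with a plausible dependence on the predictors
vo2max = 50 - 0.3 * age + 6 * male - 0.4 * (bmi - 22) + rng.normal(0, 3, n)

X = np.column_stack([age, male, bmi])
scaler = StandardScaler().fit(X)
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(scaler.transform(X))
cluster_mean_crf = {c: vo2max[km.labels_ == c].mean() for c in range(8)}

new_person = scaler.transform([[45, 1, 27]])                 # age, sex, BMI
print(round(cluster_mean_crf[int(km.predict(new_person)[0])], 1))  # predicted CRF
```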
Procedia PDF Downloads 286
24431 Autoimmune Diseases Associated with Primary Biliary Cirrhosis: A Retrospective Study of 51 Patients
Authors: Soumaya Mrabet, Imen Akkari, Amira Atig, Elhem Ben Jazia
Abstract:
Introduction: Primary biliary cirrhosis (PBC) is a cholestatic cholangitis of unknown etiology. It is frequently associated with autoimmune diseases, which justifies their systematic screening. The aim of our study was to determine the prevalence and the type of autoimmune disorders associated with PBC and to assess their impact on the prognosis of the disease. Material and methods: This is a retrospective study over a period of 16 years (2000-2015) including all patients followed for PBC. In all these patients, we systematically screened for dysthyroidism (thyroid function tests, antithyroid autoantibodies), type 1 diabetes, dry syndrome (ophthalmologic examination, Schirmer test, and lip biopsy in the presence of suggestive clinical signs), celiac disease (celiac disease serology and duodenal biopsies), and dermatological involvement (clinical examination). Results: Fifty-one patients (50 women and one man) followed for PBC were included. The mean age was 54 years (37-77 years). Among these patients, 30 (58.8%) had at least one autoimmune disease associated with PBC. The discovery of these autoimmune diseases preceded the diagnosis of PBC in 8 cases (26.6%) and was concomitant, through systematic screening, in the remaining cases. Autoimmune hepatitis was found in 12 patients (40%), thus defining an overlap syndrome. The other diseases were Hashimoto's thyroiditis (n=10), dry syndrome (n=7), Gougerot-Sjogren syndrome (n=6), celiac disease (n=3), insulin-dependent diabetes (n=1), scleroderma (n=1), rheumatoid arthritis (n=1), Biermer anemia (n=1), and systemic lupus erythematosus (n=1). The two groups of patients with PBC, with or without associated autoimmune disorders, were comparable in terms of bilirubin levels, Child-Pugh score, and response to treatment. Conclusion: In our series, the prevalence of autoimmune diseases in PBC was 58.8%, dominated by autoimmune hepatitis and Hashimoto's thyroiditis. Even if their association does not seem to alter the prognosis, screening should be systematic in order to institute early and adequate management.
Keywords: autoimmune diseases, autoimmune hepatitis, primary biliary cirrhosis, prognosis
Procedia PDF Downloads 276
24430 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency
Authors: Rania Alshikhe, Vinita Jindal
Abstract:
Digital innovation has played a crucial role in managing smart transportation. To this end, big trajectory data collected from traveling vehicles, such as taxis, through installed global positioning system (GPS)-enabled devices can be utilized, offering an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks across an entire city using the proposed statistical travel efficiency measure (STEM). Further, it identifies the causes of low travel efficiency using the proposed least-squares approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to a point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of a road network.
Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE
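The paper's exact STEM definition is not reproduced here, but a travel-efficiency measure in the same spirit can be sketched as the ratio of observed speed to free-flow speed per road segment, aggregated from GPS taxi trips; the segment lengths, times, and speeds below are illustrative.

```python
import pandas as pd

trips = pd.DataFrame({
    "segment_id":      ["s1", "s1", "s2", "s2", "s3"],
    "length_km":       [1.2, 1.2, 0.8, 0.8, 2.0],
    "travel_time_min": [2.5, 3.1, 4.0, 3.6, 3.3],
    "free_flow_kmh":   [60, 60, 40, 40, 80],
})

trips["observed_kmh"] = 60 * trips["length_km"] / trips["travel_time_min"]
trips["efficiency"] = trips["observed_kmh"] / trips["free_flow_kmh"]   # 1.0 = free flow

segment_efficiency = trips.groupby("segment_id")["efficiency"].mean().round(2)
print(segment_efficiency.sort_values())     # the lowest values flag congested segments
```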
Procedia PDF Downloads 158
24429 Understanding the Social Movements around the ‘Rohingya Crisis’ within the Political Process Model
Authors: Aklima Jesmin, Ubaidur Rob, M. Ashrafur Rahman
Abstract:
The Rohingya population of Arakan state in Myanmar is one of the most persecuted ethnic minorities of the 21st century. According to the Universal Declaration of Human Rights (UDHR), all human beings are born free and equal in dignity and rights. However, this population is systematically excluded from this universal proclamation of human rights because they are Rohingya, a label that signifies 'other'. Based on the accessible and available literature on the Rohingya issue, this study first found a chronological pattern of human rights violations against the ethnic Rohingya that follows the pathology of the Holocaust in this 21st century of human civilization. These violations have been made possible by modern technology and bureaucracy, operating through authorization, routinization, and dehumanization, not only in formal institutions but in society as a whole. This apparently never-ending situation confronts any author with the problem of the limited availability of scientific articles; the most important sources are therefore international daily newspapers, social media, and the official web pages of non-state actors for day-to-day updates. Although this challenges the validity and objectivity of the information, addressing the critical, ongoing human rights violations against the Rohingya population can become a base for further work on this issue. One aspect of this paper is to document all the social movements from August 2017 to date. The paper finds that even though, historically, only human rights violations against the Rohingya seemed to occur, the process of social movements had also started simultaneously and can be traced more clearly after the military campaign of 2017. Therefore, the Rohingya crisis can be conceptualized as one 'campaign' movement for justice, not as episodic events, and it fits the Political Process Model better than other social movement theories. This model identifies the roles of international political movements and of non-state actors as more powerful than the episodes of violence conducted against the Rohingya in reframing the issue, blaming and shaming the Myanmar government, and creating strategic opportunities for social change. The lack of empowerment of the affected Rohingya population has been found to be the gap preventing the use of this strategic opportunity, as it also affects their capacity to reframe their rights and to manage the campaign for their justice. This should therefore be placed at the heart of the international policy agenda within the broader socio-political movement for the justice of the Rohingya population. Without ensuring the human rights of the Rohingya population, achieving the promise of the United Nations' Sustainable Development Goals, that no one will be excluded, will be impossible.
Keywords: civilization, holocaust, human rights violation, military campaign, political process model, Rohingya population, sustainable development goal, social justice, social movement, strategic opportunity
Procedia PDF Downloads 284
24428 Mitigating Supply Chain Risk for Sustainability Using Big Data Knowledge: Evidence from the Manufacturing Supply Chain
Authors: Mani Venkatesh, Catarina Delgado, Purvishkumar Patel
Abstract:
The sustainable supply chain is gaining popularity among practitioners because of increased environmental degradation and stakeholder awareness. On the other hand, supply chain risk management is crucial for practitioners, as risk can potentially disrupt supply chain operations. Predicting and addressing the risk caused by social issues in the supply chain is of paramount importance to the sustainable enterprise. More recently, the use of big data analytics for forecasting business trends has been gaining momentum among professionals. The aim of the research is to explore the application of big data predictive analytics in successfully mitigating supply chain social risk and to demonstrate how such mitigation can help in achieving sustainability (environmental, economic, and social). The method involves the identification and validation of social issues in the supply chain by an expert panel and a survey. We then use a case study to illustrate the application of big data in the successful identification and mitigation of social issues in the supply chain. Our results show that a company can predict various social issues through big data predictive analytics and mitigate the social risk. We also discuss the implications of this research for the body of knowledge and practice.
Keywords: big data, sustainability, supply chain social sustainability, social risk, case study
Procedia PDF Downloads 410