Search results for: decentralized data management
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30731

26921 Automated Multisensory Data Collection System for Continuous Monitoring of Refrigerating Appliances Recycling Plants

Authors: Georgii Emelianov, Mikhail Polikarpov, Fabian Hübner, Jochen Deuse, Jochen Schiemann

Abstract:

Recycling refrigerating appliances plays a major role in protecting the Earth's atmosphere from ozone depletion and emissions of greenhouse gases. The performance of refrigerator recycling plants in terms of material retention is the subject of strict environmental certifications and is reviewed periodically through specialized audits. The continuous collection of refrigerator data required for the input-output analysis is still mostly manual, error-prone, and not digitalized. In this paper, we propose an automated data collection system for recycling plants in order to deduce expected material contents in individual end-of-life refrigerating appliances. The system utilizes laser scanner measurements and optical data to extract attributes of individual refrigerators by applying transfer learning with pre-trained vision models and optical character recognition. Based on the recognized features, the system automatically provides material categories and target values of contained material masses, especially foaming and cooling agents. The presented data collection system paves the way for continuous performance monitoring and efficient control of refrigerator recycling plants.

Keywords: automation, data collection, performance monitoring, recycling, refrigerators

Procedia PDF Downloads 159
26920 Study and Improvement of the Quality of a Production Line

Authors: S. Bouchami, M.N. Lakhoua

Abstract:

The automotive market is a dynamic market that continues to grow. That is why several companies in this sector adopt a quality improvement approach. Wanting to be competitive and successful in the environment in which they operate, these companies are dedicated to establishing a quality management system to ensure the achievement of quality objectives, the improvement of products and processes, and the satisfaction of customers. In this paper, the management of quality and the improvement of a production line in an industrial company are presented. The project is divided into two essential parts: the creation of the technical line documentation and the quality assurance documentation, and the resolution of defects on the line as well as those claimed by the customer. The creation of the documents required a deep understanding of the manufacturing process. The analysis and problem solving were done through the implementation of PDCA (Plan Do Check Act) and FTA (Fault Tree Analysis). Looking ahead, in order to better optimize production and improve the efficiency of the production line, a study of the problems associated with the supply of raw materials should be carried out to solve the stock-outs that cause costly delays for the company.

Keywords: quality management, documentary system, Plan Do Check Act (PDCA), fault tree analysis (FTA) method

Procedia PDF Downloads 139
26919 Probability Sampling in Matched Case-Control Study in Drug Abuse

Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell

Abstract:

Background: Although random sampling is generally considered to be the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified “friend controls” and the other using a random sample of non-drug users (controls) who then identified “friend cases.” Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using a bootstrap method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed a wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when fitted to either data set (0.93 for random-sample data vs. 0.91 for snowball-sample data, p=0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be superior from a statistical perspective to snowball sampling and may represent a viable alternative to snowball sampling.
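The bootstrap precision comparison described in this abstract can be sketched in a few lines. This is an illustrative toy, not the authors' code: the data and the statistic are invented placeholders, and the original study bootstrapped conditional logistic regression coefficients rather than a simple mean.

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=100, seed=42):
    """Estimate the standard error of `stat` by resampling `data`
    with replacement (here 100 bootstrap replicates, as in the study)."""
    rng = random.Random(seed)
    replicates = [
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    ]
    return statistics.stdev(replicates)

# Hypothetical risk-factor measurements: a tightly clustered "random
# sample" versus a far more variable "snowball sample".
random_sample = [10, 11, 9, 10, 12, 10, 11, 9, 10, 11]
snowball_sample = [2, 25, 7, 30, 1, 22, 5, 28, 3, 26]

se_random = bootstrap_se(random_sample, statistics.mean)
se_snowball = bootstrap_se(snowball_sample, statistics.mean)
# The snowball sample yields a much larger bootstrap standard error,
# mirroring the wide variation in coefficient standard errors reported.
```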

Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling

Procedia PDF Downloads 488
26918 The Approach to Develop Value Chain to Enhance the Management Efficiency of Thai Tour Operators in Order to Support Free Trade within the Framework of ASEAN Cooperation

Authors: Yalisa Tonsorn

Abstract:

The objectives of this study are 1) to study the readiness of Thai tour operators in preparation for ASEAN membership, 2) to study the opportunities and obstacles in the management of Thai tour operators, and 3) to find an approach for developing a value chain in order to enhance the management efficiency of Thai tour operators in support of free trade within the framework of ASEAN cooperation. The research used mixed qualitative and quantitative methods. In-depth interviews were conducted with key informants, including management supervisors, middle managers, and officers of the travel agencies. A questionnaire was administered to a sample of 300 respondents. The study found that, to develop a value chain that enhances the management efficiency of Thai travel agencies in support of free trade within the ASEAN framework, tour operators must give priority to the customer and deliver service exceeding the customer’s expectations. There are two groups of customers: 1) external customers, meaning tourists, and 2) internal customers, meaning the staff who deliver the service to the customers, including supervisors, colleagues, and subordinates. Two issues need to be developed: 1) human resource development, to cultivate a working culture focused on the importance of customers and on excellent service provision, and 2) working system development, building value and innovation into the operational process, including the company's services, in order to deliver the most impressive service to both internal and external customers. With these improvements, tour operators could accommodate a significantly increased number of tourists. This could enhance the capacity of the business and increase the competitive capability of the country's economy.

Keywords: AEC (ASEAN Economic Community), core activities, support activities, value chain

Procedia PDF Downloads 351
26917 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require High-Performance Computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates how HPC is indispensable in meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. This article also indicates solutions to optimization problems and the benefits of Big Data for computational biology. The article surveys the current state of the art and future generations of HPC computing with Big Data in biology.
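The "all-to-all comparisons" workload mentioned in this abstract can be made concrete with a small sketch. The sequences and the identity score below are invented for illustration; a real HPC pipeline would distribute the quadratic number of pairs across many nodes.

```python
from itertools import combinations

def identity(a, b):
    """Fraction of positions at which two equal-length sequences agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Toy genomic fragments (hypothetical data).
seqs = {"s1": "ACGTACGT", "s2": "ACGTACGA", "s3": "TTTTACGT"}

# All-to-all comparison: the number of pairs grows as n*(n-1)/2, which
# is what makes this embarrassingly parallel workload HPC-bound for
# large sequence collections.
scores = {
    (i, j): identity(seqs[i], seqs[j])
    for i, j in combinations(sorted(seqs), 2)
}
```

Each pair is independent of the others, so the dictionary comprehension could be replaced by a parallel map with no change to the results.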

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 359
26916 Rubbish to Rupees: The Story of Bishanpur Tzeco Panchayat, Bhagalpur District, State- Bihar, India

Authors: Arvind Kumar

Abstract:

Bishanpur Tzecho Panchayat presents exemplary evidence of community efforts, backed by convergent action by the district water and sanitation mission, in the management of solid waste, enhancing prosperity in the area and improving the quality of life. Bishanpur Tzeco Panchayat faced a major waste management problem, with garbage and cow dung piling up in public places and leading to protests by residents. To address this problem, in collaboration with the Agriculture University and with the support of the district administration, PHED (Public Health & Engineering Department), and the district and block coordinators of SBM (Swachh Bharat Mission), communities decided to take up vermicomposting to get rid of the menace of cow dung and other solid home and farm waste. Today, Bishanpur is largely garbage free, as the people realize the value of waste and how it can contribute to their well-being and prosperity. The people of the Panchayat have demonstrated that waste is a resource. Bishanpur Tzecho is a panchayat of Goradih Block of Bhagalpur district, the silk city of Bihar, India.

Keywords: solid waste management in Bishanpur Tzeco Panchayat, Bhagalpur district, State- Bihar, India

Procedia PDF Downloads 408
26915 The Importance of Efficient and Sustainable Water Resources Management and the Role of Artificial Intelligence in Preventing Forced Migration

Authors: Fateme Aysin Anka, Farzad Kiani

Abstract:

Forced migration is a situation in which people are forced to leave their homes against their will due to political conflicts, wars, natural disasters, climate change, economic crises, or other emergencies. This type of migration takes place under conditions where people cannot lead a sustainable life because security, shelter, and basic needs cannot be ensured. It may occur in connection with different factors that affect people's living conditions. In addition to these general and widespread causes, water security and water resources are a cause that is emerging now and will be encountered more and more in the future. Forced migration may occur because the water resources in the areas where people live are insufficient or depleted. In this case, people's living conditions become unsustainable, and they may have to go elsewhere, as they cannot obtain basic necessities such as drinking water and the water used for agriculture and industry. To cope with these situations, it is important to minimize the causes, while international organizations and societies must provide assistance (for example, humanitarian aid, shelter, medical support, and education) and protection to address, or at least mitigate, this problem. From the international perspective, plans such as the Green New Deal (GND) and the European Green Deal (EGD) draw attention to the need for people to live equally in a cleaner and greener world. Especially recently, with the advancement of technology, scientific methods have become more efficient. In this regard, this article presents a multidisciplinary case model that addresses the water problem with an engineering approach within the framework of its social dimension. It is worth emphasizing that this problem is largely linked to climate change and the lack of a sustainable water management perspective. As a matter of fact, the United Nations Development Agency (UNDA) draws attention to this problem in its universally accepted sustainable development goals. Therefore, an artificial intelligence-based approach has been applied, focusing on the water management problem. The most general but also most important aspect of managing water resources is their correct consumption. In this context, the artificial intelligence-based system undertakes tasks such as water demand forecasting and distribution management, emergency and crisis management, water pollution detection and prevention, and maintenance and repair control and forecasting.
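Of the AI tasks listed in this abstract, water demand forecasting is the easiest to illustrate. The sketch below uses simple exponential smoothing as a baseline stand-in; the abstract does not specify the actual model, so the data, the smoothing factor, and the function name are all assumptions.

```python
def smooth_forecast(series, alpha=0.5):
    """One-step-ahead demand forecast via simple exponential smoothing:
    f_{t+1} = alpha * x_t + (1 - alpha) * f_t, seeded with the first value."""
    forecast = series[0]
    for x in series:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast

# Hypothetical daily water demand in cubic metres.
demand = [100.0, 110.0, 105.0]
next_day = smooth_forecast(demand)  # 105.0 with alpha=0.5
```

A production system would add seasonality and exogenous inputs (weather, population), but the same one-step-ahead structure underlies most demand forecasters.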

Keywords: water resource management, forced migration, multidisciplinary studies, artificial intelligence

Procedia PDF Downloads 81
26914 Evaluating the Effectiveness of Science Teacher Training Programme in National Colleges of Education: A Preliminary Study, Perceptions of Prospective Teachers

Authors: A. S. V Polgampala, F. Huang

Abstract:

This is an overview of what an evaluation entails and of issues to be aware of when class observation is being done. This study examined the effects of evaluating teaching practice during a 7-day ‘block teaching’ session in a pre-service science teacher training program at a reputed National College of Education in Sri Lanka. Effects were assessed in three areas: evaluation of the training process, evaluation of the training impact, and evaluation of the training procedure. Data for this study were collected by class observation of 18 teachers from 9 to 16 February 2017. Prospective science teachers, the participants of the study, were evaluated based on a format newly introduced by the NIE. The data collected were analyzed qualitatively using the Miles and Huberman procedure for analyzing qualitative data: data reduction, data display, and conclusion drawing/verification. It was observed that the trainees showed confidence in teaching those competencies and skills. Teacher educators’ dissatisfaction had a great impact on the evaluation process.

Keywords: evaluation, perceptions & perspectives, pre-service, science teaching

Procedia PDF Downloads 310
26913 Changing Governance and the Role of People's Involvement in Municipal Solid Waste Management: Study of Two Municipal Corporations in Kerala

Authors: Prathibha Ganesan

Abstract:

This paper discusses the discontents of inhabitants living near landfills and their culmination in resistance against centralised waste disposal during the last three decades in Kerala. The study is based on a sample survey of 175 households located at the landfill sites and within the city limits of two Municipal Corporations, viz. Thrissur and Cochin. The study found that waste is dumped in the periphery of the urban area, where economically and socially vulnerable people are densely settled. Moreover, landfill sites are unscientifically managed, causing severe socio-economic and health problems for the local people and finally leading to their mobilisation and persistent struggle. The struggles often culminate in the closure of landfills, forced relocation, or abandonment of the region by the community. The study concluded that persistent people’s struggles compel the local state either to find alternatives to the centralised solid waste management system or to use political power to subsume the local resistance. The persistence of the struggles determined the type of waste governance adopted by the local governments.

Keywords: solid waste management, municipal corporation, resistance movements, urban, Kerala

Procedia PDF Downloads 256
26912 Detecting Venomous Files in IDS Using an Approach Based on Data Mining Algorithm

Authors: Sukhleen Kaur

Abstract:

In security groundwork, the Intrusion Detection System (IDS) has become an important component and has received increasing attention in recent years. An IDS is one of the effective ways to detect different kinds of attacks and malicious code in a network and helps us to secure the network. Data mining techniques can be applied to IDS to analyse large amounts of data and give better results; data mining can contribute to improving intrusion detection by adding a level of focus to anomaly detection. So far, studies have concentrated on finding attacks, but this paper detects malicious files. Some intruders do not attack directly but hide harmful code inside files, or corrupt files and attack the system through them. These files are detected according to defined parameters, which form two lists of files: normal files and harmful files. After that, data mining is performed. In this paper, a hybrid classifier is used, combining the Naive Bayes and Ripper classification methods. The results show how a file uploaded to the database is tested against the parameters, characterised as either a normal or a harmful file, and then mined. Moreover, when a user tries to mine a harmful file, the system generates an exception stating that mining cannot be performed on corrupted or harmful files.
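The Naive Bayes half of the hybrid classifier can be sketched directly from the abstract's description of binary file parameters. The feature names and training examples below are invented for illustration; the paper's actual parameters, and the Ripper rule-learning half, are not reproduced here.

```python
import math
from collections import defaultdict

def train_nb(examples):
    """Fit a Bernoulli Naive Bayes model over 0/1 file features.
    `examples` is a list of (feature_dict, label) pairs."""
    counts = defaultdict(int)                      # examples per label
    ones = defaultdict(lambda: defaultdict(int))   # feature=1 counts per label
    for feats, label in examples:
        counts[label] += 1
        for f, v in feats.items():
            ones[label][f] += v
    return counts, ones

def classify(model, feats):
    counts, ones = model
    total = sum(counts.values())
    best, best_lp = None, -math.inf
    for label, n in counts.items():
        lp = math.log(n / total)                   # log prior
        for f, v in feats.items():
            p1 = (ones[label][f] + 1) / (n + 2)    # Laplace smoothing
            lp += math.log(p1 if v else 1 - p1)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Invented binary indicators of suspicious file content.
training = [
    ({"macro": 0, "high_entropy": 0}, "normal"),
    ({"macro": 0, "high_entropy": 1}, "normal"),
    ({"macro": 1, "high_entropy": 1}, "harmful"),
    ({"macro": 1, "high_entropy": 1}, "harmful"),
]
model = train_nb(training)
verdict = classify(model, {"macro": 1, "high_entropy": 1})  # "harmful"
```

The abstract does not specify how the Naive Bayes and Ripper votes are combined, so the hybrid step is omitted here.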

Keywords: data mining, association, classification, clustering, decision tree, intrusion detection system, misuse detection, anomaly detection, naive Bayes, ripper

Procedia PDF Downloads 410
26911 Generalized Approach to Linear Data Transformation

Authors: Abhijith Asok

Abstract:

This paper presents a generalized approach to the simple linear data transformation, Y = bX, through an integration of multidimensional coordinate geometry, vector space theory, and polygonal geometry. The scaling is performed by adding an additional 'Dummy Dimension' to the n-dimensional data, which makes it possible to plot two-dimensional, component-wise straight lines on pairs of dimensions. The end result is a set of scaled extensions of the observations in any of the 2n spatial divisions, where n is the total number of applicable dimensions/dataset variables, created by shifting the n-dimensional plane along the 'Dummy Axis'. The derived scaling factor was found to depend on the coordinates of the common point of origin of the diverging straight lines and on the plane of extension, chosen on and perpendicular to the 'Dummy Axis', respectively. This result indicates a geometrical interpretation of a linear data transformation and hence opportunities for a more informed choice of the factor b, based on a better choice of these coordinate values. The paper goes on to identify the effect of this transformation on certain popular distance metrics; for many of them, the distance metric retains the same scaling factor as the features.
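The closing claim, that many distance metrics retain the transformation's scaling factor, is easy to verify numerically. A minimal sketch, assuming the Euclidean metric and arbitrary example points (the paper's geometric construction itself is not reproduced):

```python
import math

def scale(point, b):
    """Apply the linear transformation Y = bX component-wise."""
    return [b * x for x in point]

def euclidean(p, q):
    return math.sqrt(sum((a - c) ** 2 for a, c in zip(p, q)))

p, q, b = [1.0, 2.0, 3.0], [4.0, 6.0, 3.0], 3.0
d_before = euclidean(p, q)                      # 5.0
d_after = euclidean(scale(p, b), scale(q, b))   # 15.0: distance scales by |b|
```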

Keywords: data transformation, dummy dimension, linear transformation, scaling

Procedia PDF Downloads 294
26910 Initiative Strategies on How to Increase Value Add of the Recycling Business

Authors: Yananda Siraphatthada

Abstract:

The current study is the successor of a previous study on the value added of recycling business management. Its aims are to 1) explore conditions for increasing the value added of the Thai recycling business, and 2) examine the implementation of the three-staged plan (short, medium, and long term) suggested by the former study to increase the value added of the recycling business, as an immediate mechanism to accelerate government operation. Quantitative and qualitative methods were utilized in this research. The qualitative research consisted of in-depth interviews and focus group discussions. Responses were obtained from owners of waste separation plants and recycle shops, as well as officers in relevant governmental agencies, randomly selected via quota sampling. Data were analyzed via content analysis. The sample used for the quantitative method consisted of 1,274 licensed recycling operators in eight provinces, selected via stratified random sampling. Data were analyzed via descriptive statistics: frequency, percentage, average (mean), and standard deviation. The study recommended a three-staged plan of short, medium, and long terms. The plan included the development of logistics, the provision of quality markets/plants, the amendment of recycling rules/regulations, the restructuring of the recycling business, the establishment of a green-purchasing recycling center, support for the campaigns run by the International Green Purchasing Network (IGPN), and conferences/workshops as a public forum to share insights among experts and concerned people.

Keywords: strategies, value added, recycle, business

Procedia PDF Downloads 235
26909 Analyzing Transit Network Design versus Urban Dispersion

Authors: Hugo Badia

Abstract:

This research answers which transit network structure is the most suitable to serve specific demand requirements in an increasing urban dispersion process. Two main approaches to network design are found in the literature. On the one hand, a traditional answer, widespread in our cities, develops a high number of lines to connect most origin-destination pairs by direct trips, an approach based on the idea that users are averse to transfers. On the other hand, some authors advocate an alternative design characterized by simple networks where a transfer is essential to complete most trips. To answer which of them is the best option, we use a two-step methodology. First, by means of an analytical model, three basic network structures are compared: a radial scheme, the starting point for the other two structures; a direct trip-based network; and a transfer-based one, the latter two representing the alternative transit network designs. The model optimizes the network configuration with regard to the total cost for each structure. For a given dispersion scenario, the best alternative is the structure with the minimum cost. The dispersion degree is defined in a simple way by assuming that only a central area attracts all trips: if this area is small, the mobility pattern is highly concentrated; if this area is very large, the city is highly decentralized. In this first step, we can determine the area of applicability of each structure as a function of the urban dispersion degree. The analytical results show that a radial structure is suitable when demand is highly centralized; however, when this demand starts to scatter, new transit lines should be implemented to avoid transfers. If urban dispersion advances further, introducing more lines is no longer a good alternative; in this case, the best solution is a change of structure, from direct trips to a network based on transfers. The area of applicability of each network strategy is not constant; it depends on the characteristics of demand, the city, and the transport technology. In the second step, we translate the analytical results to a real case study through the relationship between the dispersion parameters of the model and direct measures of dispersion in a real city. Two dimensions of the urban sprawl process are considered: concentration, measured by the Gini coefficient, and centralization, measured by an area-based centralization index. Once the real dispersion degree is estimated, we are able to identify in which area of applicability the city is located. In summary, from a strategic point of view, this methodology allows us to obtain the best network design approach for a city by comparing the theoretical results with the real dispersion degree.
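Of the two dispersion measures named in this abstract, the Gini coefficient is the simpler to reproduce. A minimal sketch, assuming trip-attraction counts per zone as input; the zone data are invented:

```python
def gini(values):
    """Gini coefficient via the mean absolute difference:
    G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean)."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(a - b) for a in values for b in values)
    return mad / (2 * n * n * mean)

# Hypothetical trips attracted by four city zones.
monocentric = [90, 4, 3, 3]   # almost everything goes to one centre
dispersed = [25, 25, 25, 25]  # perfectly even attraction

# gini(dispersed) is 0.0; gini(monocentric) is 0.655, approaching the
# four-zone maximum of (n-1)/n = 0.75, flagging a concentrated pattern.
```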

Keywords: analytical network design model, network structure, public transport, urban dispersion

Procedia PDF Downloads 228
26908 Comparison of the Effects of Alprazolam and Zaleplon on Anxiety Levels in Patients Undergoing Abdominal Gynecological Surgery

Authors: Shekoufeh Behdad, Amirhossein Yadegari, Leila Ghodrati, Saman Yadegari

Abstract:

Context: Preoperative anxiety is a common psychological reaction experienced by all patients undergoing surgery. It can have negative effects on the patient's well-being and even impact surgical outcomes. Therefore, finding effective interventions to reduce preoperative anxiety is important in improving patient care. Research Aim: The aim of this study is to compare the effects of oral administration of zaleplon (5 mg) and alprazolam (0.5 mg) on preoperative anxiety levels in women undergoing gynecological abdominal surgery. Methodology: This study is a double-blind, randomized clinical trial conducted after receiving approval from the university's ethics committee and obtaining written informed consent from the patients. The night before the surgery, patients were randomly assigned to receive either 0.5 mg of alprazolam or 5 mg of zaleplon orally. Anxiety levels, measured using a 10-cm visual analog scale, and hemodynamic variables (blood pressure and heart rate) were assessed before drug administration and on the morning of the operation after the patient entered the pre-operation room. Findings: The study found that there were no significant differences in mean anxiety levels or hemodynamic variables before and after administration of either drug in both groups (P value > 0.05). This suggests that both 0.5 mg of alprazolam and 5 mg of zaleplon effectively reduce preoperative anxiety in women undergoing abdominal surgery without serious side effects. Theoretical Importance: This study contributes to the understanding of the effectiveness of alprazolam and zaleplon in reducing preoperative anxiety. It adds to the existing literature on pharmacological interventions for anxiety management, specifically in the context of gynecological abdominal surgery. 
Data Collection: Data for this study were collected through the assessment of anxiety levels using a visual analog scale and measuring hemodynamic variables, including systolic, diastolic, and mean arterial blood pressures, as well as heart rate. These measurements were taken before drug administration and on the morning of the surgery. Analysis Procedures: Statistical analysis was performed to compare the mean anxiety levels and hemodynamic variables before and after drug administration in the two groups. The significance of the differences was determined using appropriate statistical tests. Questions Addressed: This study aimed to answer the question of whether there are differences in the effects of alprazolam and zaleplon on preoperative anxiety levels in women undergoing gynecological abdominal surgery. Conclusion: The oral administration of both 0.5 mg of alprazolam and 5 mg of zaleplon the night before surgery effectively reduces preoperative anxiety in women undergoing abdominal surgery. These findings have important implications for the management of preoperative anxiety and can contribute to improving the overall surgical experience for patients.

Keywords: zaleplon, alprazolam, premedication, abdominal surgery

Procedia PDF Downloads 75
26907 Carbon Sequestration in Spatio-Temporal Vegetation Dynamics

Authors: Nothando Gwazani, K. R. Marembo

Abstract:

An increase in the atmospheric concentration of carbon dioxide (CO₂) from fossil fuel and land use change necessitates the identification of strategies for mitigating the threats associated with global warming. Oceans are insufficient to offset the accelerating rate of carbon emission, but their limitations as a means of reducing the carbon footprint can be effectively overcome by storing carbon in terrestrial carbon sinks. The gases with special optical properties that are responsible for climate warming include carbon dioxide (CO₂), water vapor, methane (CH₄), nitrous oxide (N₂O), nitrogen oxides (NOₓ), stratospheric ozone (O₃), carbon monoxide (CO), and chlorofluorocarbons (CFCs). Amongst these, CO₂ plays a crucial role, as it contributes 50% of the total greenhouse effect and has been linked to climate change. Because plants act as carbon sinks, interest in terrestrial carbon sequestration has increased in an effort to explore opportunities for climate change mitigation. Removal of carbon from the atmosphere is a topical issue that addresses one important aspect of an overall strategy for carbon management, namely to help mitigate the increasing emissions of CO₂. Thus, terrestrial ecosystems have gained importance for their potential to sequester carbon and to reduce the carbon sink burden on the oceans, which has a substantial impact on ocean species. Field data and electromagnetic spectrum bands were analyzed using ArcGIS 10.2, QGIS 2.8, and ERDAS IMAGINE 2015 to examine the vegetation distribution. Satellite remote sensing data coupled with the Normalized Difference Vegetation Index (NDVI) were employed to assess future potential changes in vegetation distributions in the Eastern Cape Province of South Africa. The analysis, at 5-year intervals, examines the amount of carbon absorbed using vegetation distribution. For 2015, the numerical results showed low vegetation distribution, which increases the acidity of the oceans and gravely affects fish species and corals. The outcomes suggest that the study area could be effectively utilized for carbon sequestration so as to mitigate ocean acidification. The vegetation changes measured through this investigation suggest an environmental shift and a reduced vegetation carbon sink, which threatens biodiversity and the ecosystem. In order to sustain the amount of carbon in terrestrial ecosystems, the identified ecological factors should be enhanced through the application of good land and forest management practices. This will increase the carbon stock of terrestrial ecosystems, thereby reducing direct loss to the atmosphere.
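The NDVI used in the analysis above has a standard closed form, NDVI = (NIR − Red) / (NIR + Red). A minimal sketch with invented reflectance values; in practice the inputs would be the corresponding satellite bands pixel by pixel:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index, in [-1, 1];
    dense, healthy vegetation pushes the value toward 1."""
    return (nir - red) / (nir + red)

dense_canopy = ndvi(0.50, 0.08)  # high: strong near-infrared reflectance
bare_soil = ndvi(0.30, 0.25)     # low: similar NIR and red reflectance
```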

Keywords: remote sensing, vegetation dynamics, carbon sequestration, terrestrial carbon sink

Procedia PDF Downloads 149
26906 Using Learning Apps in the Classroom

Authors: Janet C. Read

Abstract:

UCLan set up a collaboration with Lingokids to assess the Lingokids learning app's impact on learning outcomes in UK classrooms for children aged 3 to 5 years. Data gathered during the controlled study with 69 children include attitudinal data, engagement, and learning scores. The data show that enjoyment while learning was higher among children using the game-based app than among children using more traditional methods. It is worth pointing out that, among older children, engagement when using the learning app was significantly higher than with traditional methods. According to the existing literature, there is a direct correlation between engagement, motivation, and learning. Therefore, this study provides relevant data points to conclude that the Lingokids learning app serves its purpose of encouraging learning through playful and interactive content. That being said, we believe that learning outcomes should be assessed with a wider range of methods in further studies. Likewise, it would be beneficial to assess the usability and playability of the app in order to evaluate it from other angles.

Keywords: learning app, learning outcomes, rapid test activity, Smileyometer, early childhood education, innovative pedagogy

Procedia PDF Downloads 66
26905 A Cloud-Based Federated Identity Management in Europe

Authors: Jesus Carretero, Mario Vasile, Guillermo Izquierdo, Javier Garcia-Blas

Abstract:

Currently, there is a so called ‘identity crisis’ in cybersecurity caused by the substantial security, privacy and usability shortcomings encountered in existing systems for identity management. Federated Identity Management (FIM) could be a solution to this crisis, as it is a method that facilitates the management of identity processes and policies among collaborating entities without enforcing global consistency, which is difficult to achieve when legacy ID systems are involved. To cope with this problem, the Connecting Europe Facility (CEF) initiative proposed in 2014 a federated solution in anticipation of the adoption of Regulation (EU) N°910/2014, the so-called eIDAS Regulation. At present, a network of eIDAS Nodes is being deployed at the European level so that every citizen recognized by a member state is also recognized within the trust network at the European level, enabling the consumption of services in other member states that until now were not available, or whose granting was tedious. This is a very ambitious approach, since it aims to enable cross-border authentication of member state citizens without the need to unify the authentication method (eID Scheme) of the member state in question. However, this federation is currently managed by member states, and it is initially applied only to citizens and public organizations. The goal of this paper is to present the results of a European project, named eID@Cloud, which focuses on the integration of eID in 5 cloud platforms belonging to authentication service providers of different EU member states, so that they can act as Service Providers (SP) for private entities. We propose an initiative based on a private eID Scheme for both natural and legal persons. The methodology followed in the eID@Cloud project is that each Identity Provider (IdP) subscribes to an eIDAS Node Connector, which requests authentication, and the Connector in turn subscribes to an eIDAS Node Proxy Service, which issues authentication assertions.
To cope with high loads, load balancing is supported in the eIDAS Node. The eID@Cloud project is still ongoing, but we already have some important outcomes. First, we have deployed the federated identity nodes and tested them from the security and performance points of view. The pilot prototype has shown the feasibility of deploying this kind of system, ensuring good performance thanks to the replication of the eIDAS nodes and the load balancing mechanism. Second, our solution avoids the propagation of identity data outside the native domain of the user or entity being identified, which avoids well-known cybersecurity problems due to network interception, man-in-the-middle attacks, etc. Last, but not least, this system allows any country or collectivity to connect easily, supporting incremental growth of the network and avoiding difficult political negotiations to agree on a single authentication format (which would be a major stopper).
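The assertion flow described above can be sketched in miniature. The sketch below is illustrative only and is not the eIDAS protocol: it assumes a hypothetical shared HMAC key between proxy service and connector to show how a signed authentication assertion, rather than the identity data itself, is what crosses the federation boundary.

```python
import hashlib
import hmac
import json
import time

# Hypothetical shared secret; eIDAS nodes actually use certificate-based
# signatures (SAML), not a symmetric key. This only illustrates the flow.
NODE_KEY = b"shared-secret-between-proxy-and-connector"

def issue_assertion(subject_id: str, member_state: str) -> dict:
    """Proxy-service side: assert 'this subject authenticated here'."""
    body = {"sub": subject_id, "ms": member_state, "iat": int(time.time())}
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(NODE_KEY, payload, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify_assertion(assertion: dict) -> bool:
    """Connector side: check the signature; no personal data crossed over
    beyond what the assertion body itself carries."""
    payload = json.dumps(assertion["body"], sort_keys=True).encode()
    expected = hmac.new(NODE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["sig"])

assertion = issue_assertion("citizen-42", "ES")
print(verify_assertion(assertion))
```

The point of the design, as the abstract notes, is that the relying side only ever verifies an assertion; the user's credentials never leave the native domain.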

Keywords: cybersecurity, identity federation, trust, user authentication

Procedia PDF Downloads 161
26904 Road Safety in the Great Britain: An Exploratory Data Analysis

Authors: Jatin Kumar Choudhary, Naren Rayala, Abbas Eslami Kiasari, Fahimeh Jafari

Abstract:

Great Britain has one of the safest road networks in the world. However, the consequences of any death or serious injury are devastating for loved ones, as well as for those who help the severely injured. This paper aims to analyse Great Britain's road safety situation and to show response measures for areas where the total damage caused by accidents can be reduced significantly and quickly. In this paper, we perform an exploratory data analysis using STATS19 data. For the past 30 years, the UK has had a good record in reducing fatalities, ranking third based on the number of road deaths per million inhabitants. Around 165,000 accidents were reported in Great Britain in 2009, and the figure decreased every year thereafter, falling below 120,000 in 2019. The government continues to work to reduce road deaths, empowering responsible road users by identifying and addressing the factors that make roads less safe.
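The yearly trend described above lends itself to a simple exploratory summary. The snippet below uses a small synthetic table with a STATS19-like shape (the real files use different column names and many more fields); only the 2009 and 2019 totals echo the approximate figures quoted in the abstract, and the severity breakdown is illustrative.

```python
import pandas as pd

# Synthetic stand-in for STATS19 accident counts (illustrative numbers only;
# yearly totals roughly match the ~165k (2009) and <120k (2019) in the text).
df = pd.DataFrame({
    "year": [2009] * 3 + [2014] * 3 + [2019] * 3,
    "severity": ["fatal", "serious", "slight"] * 3,
    "count": [2222, 24690, 138000,
              1775, 22807, 121000,
              1752, 25945, 90000],
})

# Yearly totals expose the downward trend the abstract describes.
totals = df.groupby("year")["count"].sum()
print(totals)
```

From here, a real analysis would join in road, weather, and location fields (e.g. via OpenStreetMap, as the keywords suggest) to locate where interventions pay off fastest.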

Keywords: road safety, data analysis, openstreetmap, feature expanding

Procedia PDF Downloads 132
26903 Intrusion Detection System Using Linear Discriminant Analysis

Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou

Abstract:

Most existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and inaccurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the sample dimension. Hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the dimension of the original data using principal component analysis (PCA) and then applies LDA. The second solution uses the pseudo-inverse to avoid the singularity of the within-class scatter matrix caused by the SSS problem. After that, the KNN algorithm is used for the classification process. We have chosen two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. Results showed that the classification accuracy of the (PCA+LDA) method clearly outperforms the pseudo-inverse LDA method when large training data are available.
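A minimal sketch of both solutions might look as follows, here on synthetic data that mimics the small-sample-size setting (n samples much smaller than dimension d) rather than on KDDcup99 itself; the dimensions and component counts are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for network traffic features (not KDDcup99 itself):
# n = 60 samples in d = 200 dimensions mimics the SSS setting (n << d).
rng = np.random.default_rng(0)
n, d = 60, 200
X = rng.normal(size=(n, d))
y = np.array([0] * (n // 2) + [1] * (n // 2))
X[y == 1] += 0.5  # shift one class so there is signal to recover

# Solution 1: PCA first brings the dimension below n, making LDA well-posed;
# LDA then projects onto its discriminant axis and KNN classifies there.
model = make_pipeline(
    PCA(n_components=20),
    LinearDiscriminantAnalysis(),
    KNeighborsClassifier(n_neighbors=3),
)
model.fit(X, y)

# Solution 2 (two-class Fisher sketch): replace the inverse of the singular
# within-class scatter matrix with its Moore-Penrose pseudo-inverse.
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sw = sum(np.cov(X[y == k].T, bias=True) * (y == k).sum() for k in (0, 1))
w = np.linalg.pinv(Sw) @ (mu1 - mu0)
proj = X @ w  # one-dimensional features for the subsequent KNN step
```

In the paper's setting, the projected features from either route feed the same KNN classifier, so the comparison isolates the effect of the two SSS workarounds.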

Keywords: LDA, pseudo-inverse, PCA, IDS, NSL-KDD, KDDcup99

Procedia PDF Downloads 222
26902 The Role of Information Technology in Supply Chain Management

Authors: V. Jagadeesh, K. Venkata Subbaiah, P. Govinda Rao

Abstract:

This paper explains the significance of information technology tools and software packages for supply chain management (SCM) in order to manage the entire supply chain. Managing material, financial, and information flows effectively and efficiently with the aid of information technology makes it possible to deliver the right quantity of goods, at the right quality, at the right time, using the right methods and technology. Information technology plays a vital role in streamlining sales forecasting, demand planning, inventory control, and transportation in supply networks, and finally supports production planning and scheduling. It achieves these objectives by streamlining business processes and integrating them within the enterprise and its extended enterprise. SCM starts with the customer and involves a sequence of activities spanning customer, retailer, distributor, manufacturer, and supplier within the supply chain framework. It is the process of integrating demand planning, supply network planning, and production planning and control. Forecasting indicates the direction for planning raw materials in order to meet production planning requirements. Inventory control and transportation planning allocate the optimal or economic order quantity and utilize the shortest possible routes to deliver goods to the customer. Production planning and control use the optimal resource mix to meet capacity requirements planning. The above operations can be achieved by using appropriate information technology tools and software packages for supply chain management.
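As one concrete instance of the inventory-control step mentioned above, the classic economic order quantity (EOQ) formula can be computed directly. The demand and cost figures below are illustrative assumptions, not drawn from the paper.

```python
import math

def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
    """Classic economic order quantity: sqrt(2 * D * S / H), where
    D = annual demand (units), S = cost per order, H = holding cost
    per unit per year."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

# Illustrative figures: 12,000 units/year demand, $50 per order,
# $2 per unit per year holding cost.
print(eoq(annual_demand=12000, order_cost=50.0, holding_cost=2.0))
```

In practice, an SCM package computes quantities like this per SKU and feeds them into the transportation planning step that chooses delivery routes.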

Keywords: supply chain management, information technology, business process, extended enterprise

Procedia PDF Downloads 371
26901 High-Frequency Cryptocurrency Portfolio Management Using Multi-Agent System Based on Federated Reinforcement Learning

Authors: Sirapop Nuannimnoi, Hojjat Baghban, Ching-Yao Huang

Abstract:

Over the past decade, with the rapid development of blockchain technology since the birth of Bitcoin, there has been a massive increase in the usage of cryptocurrencies. Cryptocurrencies are often not regarded as a sound investment opportunity due to the market’s erratic behavior and high price volatility. With the recent success of deep reinforcement learning (DRL), however, portfolio management can be modeled and automated. In this paper, we propose a novel DRL-based multi-agent system that automatically makes proper trading decisions on multiple cryptocurrencies and gains profits in this highly volatile market. We also extend the multi-agent system with horizontal federated transfer learning so that it can better adapt to the inclusion of new cryptocurrencies in our portfolio; through diversification, we can therefore maximize profits and minimize trading risks. Experimental results across multiple simulation scenarios reveal that the proposed algorithmic trading system offers three promising advantages over other systems: maximized profits, minimized risks, and adaptability.
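The abstract does not spell out the aggregation rule used in the federated step, but the horizontal-federation idea can be illustrated with standard federated averaging: each per-asset agent trains locally and only its model weights, never its raw trading data, are shared and averaged into a global policy. The weights below are hypothetical.

```python
import numpy as np

def fed_avg(agent_weights: list[np.ndarray]) -> np.ndarray:
    """Federated averaging (illustrative aggregation rule, not necessarily
    the paper's exact scheme): element-wise mean of local model weights."""
    return np.mean(agent_weights, axis=0)

# Hypothetical policy weights learned locally by three per-asset agents.
local = [np.array([0.2, -0.1]),
         np.array([0.4, 0.3]),
         np.array([0.0, 0.1])]

global_w = fed_avg(local)
print(global_w)  # element-wise mean of the agents' weights
```

A newly added cryptocurrency's agent can then be initialized from `global_w` instead of from scratch, which is the adaptability benefit the abstract claims.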

Keywords: cryptocurrency portfolio management, algorithmic trading, federated learning, multi-agent reinforcement learning

Procedia PDF Downloads 113
26900 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions

Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figeuiredo, Filipa Figueiredo, João Nunes

Abstract:

The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers, as it stimulates resource circularity in production and consumption systems. A large body of epistemological work has explored the principles of CE, but scant attention has been paid to analysing how CE is evaluated, consented to, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems, and its study requires the assessment of vast amounts of data to provide quantitative analysis for effective resource management. The present work focuses on regional flows in a pilot region of Portugal. By addressing this gap, this study aims to promote eco-innovation and sustainability in the Intermunicipal Communities of Região de Coimbra, Viseu Dão Lafões, and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and to give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction, and risk-sharing benefits. Our work places emphasis on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally grouped under the headings of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. We therefore present various definitions of AI and its differences from traditional statistics, introduce ML to identify its place in data science, and discuss topics such as big data analytics and the production problems for which AI and ML are suited.
A lifecycle-based approach is then taken to analyse the methods used in each phase, in order to identify the most useful technologies and the unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are aimed mainly at large metropolises, neglecting rural territories; within this project, a dynamic decision support model coupled with artificial intelligence tools and information platforms will therefore be developed, focused on the reality of these transition zones between the rural and the urban. A real decision support tool is thus under development, which will go beyond the scientific developments carried out to date and will overcome limitations related to the availability and reliability of data.

Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning

Procedia PDF Downloads 66
26899 Water Security and Transboundary Issues for Food Security of Ethiopia. The Case of Nile River

Authors: Kebron Asnake

Abstract:

Water security and transboundary issues are critical concerns for countries, particularly in regions where shared water resources are significant. This research explores the challenges and opportunities related to water security and transboundary issues in Ethiopia, using the case of the Nile River. Ethiopia, as a riparian country of the Nile, faces complex water security issues due to its dependence on this transboundary water resource. This abstract analyzes the various factors affecting water security in Ethiopia, including population growth, climate change, and competing water demands. The study examines the challenges linked to transboundary water management of the Nile River, delving into the complexities of negotiating water allocations and addressing potential conflicts among the downstream riparian countries. The paper also discusses the role of international agreements and cooperation in promoting sustainable water resource management. Additionally, it highlights the opportunities for collaboration and sustainable development that arise from transboundary water management, exploring the potential for joint investments in water infrastructure, hydropower generation, and irrigation systems that can contribute to regional economic growth and water security. Furthermore, the study emphasizes the need for integrated water management approaches in Ethiopia to ensure the equitable and sustainable use of the Nile's waters, highlighting the importance of involving stakeholders from diverse sectors, including agriculture, energy, and environmental conservation, in decision-making processes. By presenting the case of the Nile River in Ethiopia, this work contributes to the understanding of water security and transboundary issues and underscores the significance of regional cooperation and informed policy-making in addressing the challenges and opportunities presented by transboundary water resources.
The paper serves as a foundation for further research and policy on water management in Ethiopia and other regions facing similar challenges.

Keywords: water, health, agriculture, medicine

Procedia PDF Downloads 76
26898 Principles and Guidance for the Last Days of Life: Te Ara Whakapiri

Authors: Tania Chalton

Abstract:

In June 2013, an independent review of the Liverpool Care Pathway (LCP) identified a number of problems with its implementation in the UK and recommended that it be replaced by individual care plans for each patient. As a result of the UK findings, in November 2013 the Ministry of Health (MOH) commissioned the Palliative Care Council to initiate a programme of work investigating an appropriate approach to the care of people in their last days of life in New Zealand (NZ). The Last Days of Life Working Group (WG) commenced a process to develop national consensus on this care in April 2014. In order to develop its advice for the future provision of care, the WG established a comprehensive work programme and produced a series of working papers. Specific areas of focus included: an analysis of the UK Independent Review findings and an assessment of their applicability to the NZ context; a stocktake of services providing care to people in their last days of life, including aged residential care (ARC), hospices, hospitals, and primary care; international and NZ literature reviews of evidence and best practice; and a survey of families to understand the consumer perspective on the care of people in their last days of life. Key aspects of care requiring further consideration for NZ were: terminology (clarifying the terms used around the last days of life, death, and dying); evidence base (including a specific review of evidence on spiritual and culturally appropriate care as well as dementia care); diagnosis of dying (guidance on both the diagnosis of dying and communication with family); workforce issues (access to an appropriate workforce after hours); nutrition and hydration (guidance on appropriate approaches); symptom and pain management (guidance on symptom management);
documentation (records of the person's care robust enough for data collection and auditing requirements, rather than a 'tick box' approach to care); education and training (improved consistency and access to appropriate education and training); leadership (a dedicated team or person to support and coordinate the introduction and implementation of any last-days-of-life model of care); quality indicators and data collection (a model of care enabling auditing and regular reviews to ensure ongoing quality improvement); and cultural and spiritual aspects (addressed and incorporated throughout). A final document incorporating all the evidence was developed to provide guidance to the health sector on best practice for people at the end of life: "Principles and Guidance for the Last Days of Life: Te Ara Whakapiri".

Keywords: end of life, guidelines, New Zealand, palliative care

Procedia PDF Downloads 432
26897 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets whose attribute values are contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
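The statistical-test idea at the core of STRIM can be illustrated with a deliberately simplified two-sided z-test (the method's actual test statistics and thresholds may differ): a candidate if-then rule is retained only if the decision-class frequency among rows matching its condition departs significantly from the base rate.

```python
from math import sqrt
from statistics import NormalDist

def rule_is_significant(n_match: int, n_hit: int,
                        base_rate: float, alpha: float = 0.01) -> bool:
    """Simplified sketch of STRIM's significance check: does the class
    proportion among the n_match rows covered by the rule's condition
    differ significantly from the base rate of that class?"""
    p_hat = n_hit / n_match
    se = sqrt(base_rate * (1 - base_rate) / n_match)
    z = (p_hat - base_rate) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value < alpha

# 80 of 100 matching rows share the class against a 25% base rate:
print(rule_is_significant(100, 80, 0.25))
```

This also makes the paper's first question concrete: for a fixed effect size, `se` shrinks only as the covered sample count grows, so longer rules (which match fewer rows) need a larger decision table before true rules become detectable.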

Keywords: rule induction, decision table, missing data, noise

Procedia PDF Downloads 392
26896 Analysis of the Effect of Increased Self-Awareness on the Amount of Food Thrown Away

Authors: Agnieszka Dubiel, Artur Grabowski, Tomasz Przerywacz, Mateusz Roganowicz, Patrycja Zioty

Abstract:

Food waste is one of the most significant challenges humanity faces nowadays. Every year, reports from global organizations show the scale of the phenomenon, yet society's awareness is still insufficient. One-third of the food produced in the world is wasted at various points in the food supply chain: waste occurs from delivery through food preparation and distribution to the end of sale and consumption. The first step in understanding and countering the phenomenon is a thorough analysis of everyday human behavior, understood here as finding the correlation between the type of food and the reason for throwing it away. This analysis was identified as a critical first step in developing technology to prevent food waste. In this paper, the problem was analyzed with a focus on inhabitants of Central Europe, especially Poland, aged 20-30. The paper provides insight into data collection through dedicated software and an organized database. The proposed database contains information on the amount, type, and reasons for wasting food in households. A literature review supported the work in answering the research questions, comparing the situation in Poland with the problem as analyzed in other countries, and finding research gaps. The article examines the causes and quantity of food waste in detail, complementing previous reviews by emphasizing social and economic innovation in Poland's food waste management. The paper recommends a course of action for future research on food waste management and prevention related to the handling and disposal of food, emphasizing households, i.e., the last link in the supply chain.
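The household waste log described above can be sketched as a minimal relational schema; the table and column names below are hypothetical illustrations, not the project's actual schema.

```python
import sqlite3

# Hypothetical schema for a household food-waste log: one row per discarded
# item, recording amount, type, and the reason it was thrown away.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE waste_events (
        id        INTEGER PRIMARY KEY,
        food_type TEXT NOT NULL,
        amount_g  REAL NOT NULL,
        reason    TEXT NOT NULL,
        logged_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.executemany(
    "INSERT INTO waste_events (food_type, amount_g, reason) VALUES (?, ?, ?)",
    [("bread", 150, "expired"),
     ("vegetables", 300, "spoiled"),
     ("bread", 90, "leftover")],
)

# The analysis step named in the abstract: correlate food type with the
# reason for discarding it, aggregating the wasted amount.
rows = conn.execute("""
    SELECT food_type, reason, SUM(amount_g)
    FROM waste_events
    GROUP BY food_type, reason
""").fetchall()
print(rows)
```

Grouping by (type, reason) pairs is exactly the correlation the abstract names as the first step toward waste-prevention technology.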

Keywords: food waste, food waste reduction, consumer food waste, human-food interaction

Procedia PDF Downloads 114
26895 A Survey of Types and Causes of Medication Errors and Related Factors in Clinical Nurses

Authors: Kouorsh Zarea, Fatemeh Hassani, Samira Beiranvand, Akram Mohamadi

Abstract:

Background and objectives: Medication error in hospitals is a major cause of the errors that disrupt the health care system. The aim of this study was to assess nurses’ medication errors and related factors. Material and methods: This was a descriptive study of 225 nurses in various hospitals, selected through multistage random sampling. Data were collected with three researcher-made tools: demographic, medication error, and related factors questionnaires. Data were analyzed by descriptive statistics, Chi-square, Kruskal-Wallis, and one-way analysis of variance. Results: The most common types of medication error were giving drugs to patients later or earlier than scheduled (55.6%), administering multiple oral medications together regardless of their interactions (36%), and giving postoperative analgesics without a prescription (34.2%). In addition, factors such as a shortage of nurses relative to the number of patients (57.3%), high workload (51.1%), and fatigue caused by extra work (40.4%) were the most important factors affecting the incidence of medication errors. Fear of legal issues (40%) was the most important factor behind the failure to report medication errors. Conclusions: Based on the results, effective management and motivation of nurses are needed. Increasing scientific and clinical expertise in executing nursing medication orders is therefore recommended to prevent medication errors across the various settings of nursing intervention. Employing experienced staff in areas with a high risk of medication errors, and supervising less experienced staff through competent personnel, are also suggested.

Keywords: medication error, nurse, clinical care, drug errors

Procedia PDF Downloads 261
26894 Human Resource Information System: Role in HRM Practices and Organizational Performance

Authors: Ejaz Ali M. Phil

Abstract:

Enterprise Resource Planning (ERP) systems play a vital role in the effective management of business functions in large and complex organizations. The Human Resource Information System (HRIS) is a core module of ERP, providing concrete solutions to implement Human Resource Management (HRM) practices in an innovative and efficient manner. Over the last decade, there has been a considerable increase in studies on HRIS. Nevertheless, previous studies have largely failed to examine the moderating role of HRIS in the performance of HRM practices that may affect firm performance. The current study examines the impact of HRM practices (training, performance appraisal) on perceived organizational performance, with HRIS as a moderator where the system is in place. The study is based on the Resource Based View (RBV) and Ability Motivation Opportunity (AMO) theories, which hold that strengthening human capital enables an organization to achieve and sustain competitive advantage, leading to improved organizational performance. Data were collected through a structured questionnaire based on adopted instruments, after establishing reliability and validity. Structural equation modeling (SEM) was used to assess model fit, test the hypotheses, and establish the validity of the instruments through Confirmatory Factor Analysis (CFA). A total of 220 employees of 25 corporate-sector firms were sampled through a non-probability sampling technique. Path analysis revealed that HRM practices and HRIS have a significant positive impact on organizational performance. The results further showed that HRIS moderated the relationships between training, performance appraisal, and organizational performance. The interpretation of the findings, limitations, and theoretical and managerial implications are discussed.

Keywords: enterprise resource planning, human resource, information system, human capital

Procedia PDF Downloads 390
26893 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
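The schema-based record extraction step can be illustrated with a deliberately simplified sketch. Real pipelines of the kind described here would use learned models for table parsing and entity detection rather than a single regular expression; the pattern, record fields, and sample text below are all hypothetical.

```python
import re
from dataclasses import dataclass

@dataclass
class Record:
    """Hypothetical target schema for one extracted financial fact."""
    company: str
    amount_usd: float
    period: str

# Toy stand-in for the entity-detection + record-extraction steps:
# pull (company, amount, period) triples out of free text.
PATTERN = re.compile(
    r"(?P<company>[A-Z][\w&. ]+?) reported revenue of "
    r"\$(?P<amount>[\d.]+) (?P<unit>million|billion) "
    r"for (?P<period>Q[1-4] \d{4})"
)

UNIT = {"million": 1e6, "billion": 1e9}

def extract(text: str) -> list[Record]:
    """Turn non-machine-readable text into machine-readable records."""
    return [
        Record(m["company"], float(m["amount"]) * UNIT[m["unit"]], m["period"])
        for m in PATTERN.finditer(text)
    ]

doc = ("Acme Corp reported revenue of $1.2 billion for Q3 2021. "
       "Beta Ltd reported revenue of $340.5 million for Q3 2021.")
print(extract(doc))
```

The interesting engineering, per the abstract, lies in replacing each brittle step (the regex, the unit table) with a learned component while keeping the same record-shaped output that downstream databases require.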

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 104
26892 Regression Approach for Optimal Purchase of Hosts Cluster in Fixed Fund for Hadoop Big Data Platform

Authors: Haitao Yang, Jianming Lv, Fei Xu, Xintong Wang, Yilin Huang, Lanting Xia, Xuewu Zhu

Abstract:

Given a fixed fund, purchasing fewer hosts of higher capability or, conversely, more hosts of lower capability is a trade-off that must be made in practice when building a Hadoop big data platform. An exploratory study is presented for a Housing Big Data Platform project (HBDP), where typical big data computing consists of SQL queries with aggregates, joins, and space-time condition selections, executed on massive data from more than 10 million housing units. In HBDP, an empirical formula was introduced to predict the performance of candidate host clusters on the intended typical big data computing, and it was shaped via a regression approach. With this empirical formula, it is easy to suggest an optimal cluster configuration. The investigation was based on a typical Hadoop computing ecosystem, HDFS+Hive+Spark. A suitable metric was defined to measure the performance of Hadoop clusters in HBDP, which was tested and compared with its predicted counterpart on three kinds of typical SQL query tasks. Tests were conducted with respect to CPU benchmark, memory size, virtual host division, and the number of physical hosts in the cluster. The research has been applied to practical cluster procurement for housing big data computing.
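The regression-shaped empirical formula can be sketched as an ordinary least-squares fit over measured trial configurations. The feature set and all figures below are hypothetical stand-ins, not the paper's data or its actual formula.

```python
import numpy as np

# Hypothetical measurements: one row per trial cluster configuration.
# Columns: CPU benchmark score, memory (GB), number of physical hosts.
X = np.array([
    [1.0,  32,  4],
    [1.2,  64,  4],
    [1.0,  32,  8],
    [1.5,  64,  8],
    [1.2, 128, 12],
    [1.8, 128, 16],
], dtype=float)
# Measured time (s) to run the benchmark SQL workload on each configuration.
t = np.array([410.0, 340.0, 250.0, 190.0, 160.0, 110.0])

# Fit t ≈ c0 + c1*cpu + c2*mem + c3*hosts by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, t, rcond=None)

def predict_runtime(cpu: float, mem: float, hosts: float) -> float:
    """Predicted workload time for a candidate purchase plan."""
    return float(coef @ [1.0, cpu, mem, hosts])

# Under a fixed fund, enumerate affordable plans and pick the lowest
# predicted time; here we just compare two candidates.
print(predict_runtime(1.5, 64, 12), predict_runtime(1.2, 128, 10))
```

Given such a fitted formula, procurement reduces to enumerating the configurations the fund can buy and ranking them by predicted workload time.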

Keywords: Hadoop platform planning, optimal cluster scheme at fixed-fund, performance predicting formula, typical SQL query tasks

Procedia PDF Downloads 229