Search results for: process data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35364

34614 The Effects of Transformational Leadership on Process Innovation through Knowledge Sharing

Authors: Sawsan J. Al-Husseini, Talib A. Dosa

Abstract:

Transformational leadership has been identified as the most important factor affecting innovation and knowledge sharing; it leads to increased goal-directed behavior among followers and thus to enhanced performance and innovation for the organization. However, there is a lack of models linking transformational leadership, knowledge sharing, and process innovation within higher education (HE) institutions in developing countries in general, and in Iraq in particular. This research examines the mediating role of knowledge sharing in the relationship between transformational leadership and process innovation. A quantitative approach was taken, and 254 usable questionnaires were collected from public HE institutions in Iraq. Structural equation modelling with AMOS 22 was used to analyze the causal relationships among the factors. The research found that knowledge sharing plays a pivotal role in the relationship between transformational leadership and process innovation, and that transformational leadership is well suited to an educational context, promoting knowledge sharing activities and influencing process innovation in public HE institutions in Iraq. The research has developed guidelines for researchers as well as leaders and provides evidence to support the use of transformational leadership to increase process innovation within the HE environment in developing countries, particularly Iraq.
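The mediation logic described above (transformational leadership → knowledge sharing → process innovation) can be sketched numerically. The following is a minimal illustration, not the authors' AMOS model: it fits the two regression paths of a simple mediation model on synthetic data (all effect sizes are hypothetical) and reports the indirect effect a×b.

```python
import random

def ols(y, X):
    """Ordinary least squares with intercept via normal equations (Gaussian elimination)."""
    n, k = len(X), len(X[0]) + 1
    Xd = [[1.0] + list(row) for row in X]
    A = [[sum(Xd[i][p] * Xd[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(Xd[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):                      # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                          # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta  # [intercept, coefficients...]

random.seed(1)
n = 254  # same sample size as the study; the data themselves are simulated
tl = [random.gauss(0, 1) for _ in range(n)]                    # transformational leadership
ks = [0.6 * x + random.gauss(0, 1) for x in tl]                # knowledge sharing (mediator)
pi = [0.5 * m + 0.1 * x + random.gauss(0, 1) for m, x in zip(ks, tl)]  # process innovation

a = ols(ks, [[x] for x in tl])[1]                      # TL -> KS path
full = ols(pi, [[m, x] for m, x in zip(ks, tl)])
b_path, c_prime = full[1], full[2]                     # KS -> PI path (controlling TL), direct TL -> PI
indirect = a * b_path
print(round(a, 2), round(b_path, 2), round(indirect, 2))
```

With a large direct effect absent and a sizeable a×b product, the mediator carries most of the relationship, mirroring the "pivotal role" of knowledge sharing reported above.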

Keywords: transformational leadership, knowledge sharing, process innovation, structural equation modelling, developing countries

Procedia PDF Downloads 324
34613 E4D-MP: Time-Lapse Multiphysics Simulation and Joint Inversion Toolset for Large-Scale Subsurface Imaging

Authors: Zhuanfang Fred Zhang, Tim C. Johnson, Yilin Fang, Chris E. Strickland

Abstract:

A variety of geophysical techniques are available to image the opaque subsurface with little or no contact with the soil, and it is common to conduct time-lapse surveys of different types at a given site to improve the results of subsurface imaging. Regardless of the chosen survey methods, processing the massive amount of survey data is often a challenge. Currently available software applications are generally designed for desktop personal computers and based on one-dimensional assumptions. Hence, they are usually incapable of imaging three-dimensional (3D) processes/variables in the subsurface at reasonable spatial scales, and the maximum amount of data that can be inverted simultaneously is often very small due to the limited capability of personal computers. High-performance, integrative software that enables real-time integration of multi-process geophysical methods is therefore needed. E4D-MP enables the integration and inversion of time-lapse, large-scale data surveys from geophysical methods. Using supercomputing capability and parallel computation algorithms, E4D-MP is capable of processing data across vast spatiotemporal scales and in near real time. The main code and the modules of E4D-MP for inverting individual or combined data sets of time-lapse 3D electrical resistivity, spectral induced polarization, and gravity surveys have been developed and demonstrated for subsurface imaging. E4D-MP provides the capability to image the processes (e.g., liquid or gas flow, solute transport, cavity development) and subsurface properties (e.g., rock/soil density, conductivity) critical for the successful control of environmental engineering efforts such as environmental remediation, carbon sequestration, geothermal exploration, and mine land reclamation, among others.

Keywords: gravity survey, high-performance computing, sub-surface monitoring, electrical resistivity tomography

Procedia PDF Downloads 145
34612 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act

Authors: Maria Jędrzejczak, Patryk Pieniążek

Abstract:

The European legal reality is on the eve of significant change. In European Union law, there is talk of a “fourth industrial revolution”, driven by massive data resources linked to powerful algorithms and computing capacity. This is closely linked to technological developments in the area of artificial intelligence, which has prompted analysis covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is among the most serious currently held at both European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years. It would be impossible for artificial intelligence to function without processing large amounts of data (both personal and non-personal). The main driving force behind the current development of artificial intelligence is advances in computing, but also the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly when using techniques involving model training. The use of computers and artificial intelligence technology allows for an increase in the speed and efficiency of the actions taken, but also creates security risks of an unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence requires analysis in terms of its impact on the regulation of personal data protection. It is necessary to determine the mutual relationship between these regulations and which areas of personal data protection regulation are particularly important for the processing of personal data in artificial intelligence systems.
The adopted axis of consideration is a preliminary assessment of two issues: 1) which data protection principles should apply when personal data are processed in artificial intelligence systems, and 2) how liability for personal data breaches in such systems is regulated. The need to change the regulations regarding the rights and obligations of data subjects and entities processing personal data cannot be excluded. It is possible that changes will be required in the provisions regarding the assignment of liability for a breach of personal data protection in artificial intelligence systems. The research process in this case concerns the identification of areas of personal data protection that are particularly important (and may require re-regulation) due to the introduction of the proposed legal regulation of artificial intelligence. The main question the authors want to answer is how European Union regulation of data protection breaches in artificial intelligence systems is taking shape. The answer will include examples to illustrate the practical implications of these legal regulations.

Keywords: data protection law, personal data, AI law, personal data breach

Procedia PDF Downloads 48
34611 Mathematical Modeling for Continuous Reactive Extrusion of Poly Lactic Acid Formation by Ring Opening Polymerization Considering Metal/Organic Catalyst and Alternative Energies

Authors: Satya P. Dubey, Hrushikesh A Abhyankar, Veronica Marchante, James L. Brighton, Björn Bergmann

Abstract:

Aims: To develop a mathematical model that simulates the ROP of PLA, taking into account the effect of alternative energies, to be implemented in a continuous reactive extrusion production process of PLA. Introduction: The production of large amounts of waste is one of the major challenges of the present time, and polymers represent 70% of global waste. PLA has emerged as a promising polymer, as it is a compostable, biodegradable thermoplastic made from renewable sources. However, the main limitation for the application of PLA is the traces of toxic metal catalyst in the final product. Thus, a safe and efficient production process needs to be developed to avoid the potential hazards and toxicity. It has been found that alternative energy sources (laser, ultrasound, microwaves) could be a prominent option to facilitate the ROP of PLA via continuous reactive extrusion. This process may allow the complete removal of metal catalysts and facilitate less active organic catalysts. Methodology: Initial investigations were performed using data available in the literature for the reaction mechanism of ROP of PLA based on the conventional metal catalyst stannous octoate. A mathematical model has been developed considering significant parameters such as different initial concentration ratios of catalyst, co-catalyst, and impurity. The effects of temperature variation and alternative energies have been implemented in the model. Results: The mathematical model has been validated using data from the literature as well as actual experiments. Validation of the model including alternative energies is in progress, based on experimental data from partners of the InnoREX project consortium. Conclusion: The model reproduces the polymerisation reaction accurately when alternative energy is applied. Alternative energies have a strong positive effect, increasing the conversion and molecular weight of the PLA. This model could be a very useful tool to complement the Ludovic® software in predicting the large-scale production process when using reactive extrusion.
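As a toy illustration of the kind of kinetic model described in the methodology, the fragment below integrates a drastically simplified ROP rate law (propagation proportional to catalyst and monomer concentration, with slow catalyst deactivation) by explicit Euler stepping. All rate constants and concentrations are hypothetical placeholders, not InnoREX data, and a real model would also track co-catalyst, impurity, and molecular weight.

```python
# Simplified ROP kinetics: dM/dt = -kp*C*M, dC/dt = -kd*C (all values illustrative)
kp = 0.8             # propagation rate constant, L/(mol*min)
kd = 0.02            # catalyst deactivation rate constant, 1/min
M0, C0 = 8.7, 0.01   # initial lactide and catalyst concentrations, mol/L
dt, t_end = 0.01, 60.0   # Euler step and batch time, min

M, C, t = M0, C0, 0.0
while t < t_end:
    M += -kp * C * M * dt    # monomer consumed by ring-opening propagation
    C += -kd * C * dt        # slow loss of active catalyst
    t += dt

conversion = 1.0 - M / M0
print(f"monomer conversion after {t_end} min: {conversion:.1%}")
```

Temperature or alternative-energy effects would enter such a sketch by making `kp` a function of the local energy input (e.g., an Arrhenius term), which is the kind of extension the abstract describes.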

Keywords: polymer, poly-lactic acid (PLA), ring opening polymerization (ROP), metal-catalyst, bio-degradable, renewable source, alternative energy (AE)

Procedia PDF Downloads 353
34610 Teachers’ Conception of and Perception towards the New Curriculum of Ethiopian Higher Education: A Case of Debre Birhan University

Authors: Kassahun Tilahun Dessie

Abstract:

The purpose of this study was to explore teachers' awareness of, and attitudes towards, the curriculum they implement, as well as to assess the actual and desired extent of teachers' participation in the curriculum development process. It also aimed at investigating the factors that affect teachers' level of conception of, and perception towards, the new higher education curriculum. The study was carried out in Debre Birhan University. Teachers, course coordinators, team leaders and presidents were included in the study as research subjects. Teachers were proportionally selected from each department of the six faculties based on an available sampling technique. Accordingly, a total of 103 teachers were chosen as subjects of the study. In order to collect first-hand data from the teachers, a questionnaire with four parts was developed by the researcher. To this end, scales of 11 and 17 items respectively were designed for measuring the extent of teachers' awareness and attitude. An open-ended questionnaire was also attached for the purpose of obtaining elaborated data on the issue. Information was also obtained from interviews with presidents, team leaders and course coordinators. The data obtained were analyzed using descriptive statistical tools. The overall results of the analysis revealed that teachers' awareness of the curriculum was low; the meager participation of teachers in the process of curriculum development and the lack of training on the matter were major factors. Teachers' perception of the existence and implementation of the new curriculum also inclined to the negative, though it is difficult to generalize. Lack of awareness, administrators' poor approach, the absence of appropriate incentives, and the absence of room for evaluating the curriculum played a large role in damaging teachers' attitudes, while the up-to-date nature of the new curriculum, the involvement of teachers in the curriculum development process, and the wide-ranging quality of the new curriculum laid a better ground for boosting teachers' attitudes towards the curriculum. The implication for the university is that there is a need to facilitate workshops and awareness-creation trainings, to have positive and cooperative administrators, and to embrace committed teachers in order to implement the curriculum efficiently.

Keywords: conception, perception, curriculum, higher education, Ethiopia

Procedia PDF Downloads 489
34609 Information Tree: Establishment of Lifestyle-Based IT Visual Model

Authors: Chiung-Hui Chen

Abstract:

Traditional service channels are losing their edge to emerging service technologies. To establish interaction with clients, the service industry is using effective mechanisms to give clients direct access to services through emerging technologies. Thus, as service science receives attention, special and unique consumption patterns evolve, giving rise to new market mechanisms and influencing attitudes toward life and consumption. The market demand for customized services is valued due to the emphasis on personal value, and is gradually changing the demand-supply relationship in the traditional industry. In the traditional interior design process, a designer converts the concept generated from the ideas and needs dictated by a user (client) into a concrete form, using his or her professional knowledge and drawing tools. The final product is generated through iterations of communication and modification, which is a very time-consuming process. Although this process has been accelerated with the help of computer graphics software, repeated discussions and confirmations with users are still required to complete the task. In view of the above, a space user's life model is analyzed with visualization techniques to create an interaction system modeled after interior design knowledge. The space user intuitively documents personal life experience in a model requirement chart, allowing a researcher to analyze the interrelations between analysis documents and identify the logic and substance of the data conversion. The repeatedly documented data are then transformed into design information for reuse and sharing. A professional interior designer may sort out the correlation among the user's preferences, life pattern and design specification, thus deciding the critical design elements in the process of service design.

Keywords: information design, life model-based, aesthetic computing, communication

Procedia PDF Downloads 289
34608 Human Immunodeficiency Virus (HIV) Test Predictive Modeling and Identify Determinants of HIV Testing for People with Age above Fourteen Years in Ethiopia Using Data Mining Techniques: EDHS 2011

Authors: S. Abera, T. Gidey, W. Terefe

Abstract:

Introduction: Testing for HIV is the key entry point to HIV prevention, treatment, and care and support services. Hence, predictive data mining techniques can be of great benefit in analyzing and discovering new patterns in huge datasets such as the EDHS 2011 data. Objectives: The objective of this study is to build a predictive model for HIV testing and identify the determinants of HIV testing for adults above fourteen years of age using data mining techniques. Methods: The Cross-Industry Standard Process for Data Mining (CRISP-DM) was used to build the predictive model for HIV testing and explore association rules between HIV testing and the selected attributes among adult Ethiopians. Decision tree, Naïve Bayes, logistic regression and artificial neural network data mining techniques were used to build the predictive models. Results: The target dataset contained 30,625 study participants, of whom 16,515 (53.9%) were women. Nearly three-fifths, 17,719 (58%), had never been tested for HIV, while the remaining 12,906 (42%) had been tested. Ethiopians with a higher wealth index, a higher educational level, age between 20 and 29 years, no stigmatizing attitude towards HIV-positive persons, urban residence, HIV-related knowledge, exposure to family planning information in the mass media, and knowledge of a place to get tested for HIV were more likely to have been tested. Conclusion and Recommendation: Public health interventions should consider the identified determinants to encourage people to get tested for HIV.
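To illustrate the flavor of one of the classifiers mentioned (Naïve Bayes) on survey-style categorical attributes, here is a minimal, self-contained sketch. The attribute names, values, and records are invented for illustration and are not EDHS 2011 data.

```python
import math
from collections import defaultdict

def train_nb(rows, labels):
    """Categorical Naive Bayes with Laplace smoothing."""
    classes = sorted(set(labels))
    n = len(labels)
    prior = {c: labels.count(c) / n for c in classes}
    class_n = {c: labels.count(c) for c in classes}
    counts = {c: [defaultdict(int) for _ in rows[0]] for c in classes}
    vocab = [len({r[j] for r in rows}) for j in range(len(rows[0]))]  # distinct values per attribute
    for row, y in zip(rows, labels):
        for j, v in enumerate(row):
            counts[y][j][v] += 1
    return prior, counts, class_n, vocab

def predict_nb(model, row):
    prior, counts, class_n, vocab = model
    best, best_lp = None, -math.inf
    for c in prior:
        lp = math.log(prior[c])  # log prior + smoothed log likelihoods
        for j, v in enumerate(row):
            lp += math.log((counts[c][j][v] + 1) / (class_n[c] + vocab[j]))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# toy records: (residence, education, knows a testing site) -> tested or not
records = [
    (("urban", "higher", "yes"), "tested"),
    (("urban", "secondary", "yes"), "tested"),
    (("urban", "higher", "no"), "tested"),
    (("rural", "none", "no"), "untested"),
    (("rural", "primary", "no"), "untested"),
    (("rural", "none", "yes"), "untested"),
]
model = train_nb([r for r, _ in records], [y for _, y in records])
prediction = predict_nb(model, ("urban", "higher", "yes"))  # "tested"
```

In a CRISP-DM workflow this trivial classifier would be replaced by the tuned decision tree, logistic regression, or neural network models the abstract names, evaluated on held-out data.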

Keywords: data mining, HIV, testing, Ethiopia

Procedia PDF Downloads 479
34607 A Comparative Analysis of the Factors Determining Improvement and Effectiveness of Mediation in Family Matters Regarding Child Protection in Australia and Poland

Authors: Beata Anna Bronowicka

Abstract:

Purpose: The purpose of this paper is to improve the effectiveness of mediation in family matters regarding child protection in Australia and Poland. Design/methodology/approach: The methodological approach is phenomenology. Two phenomenological methods of data collection were used in this research: (1) doctrinal research and (2) interviews. The doctrinal research forms the basis for obtaining information on mediation and the date of introduction of this alternative dispute resolution method into the Australian and Polish legal systems. No less important was the analysis of the legislation and legal doctrine in the field of mediation in family matters, especially child protection. In the second method, the data was collected by semi-structured interview. The collected data was translated from Polish to English and analysed using a software program. Findings: The rights of children in the context of mediation in Australia and Poland differ from the recommendations of the UN Committee on the Rights of the Child, which require that children be included in all matters that concern them. There is room for improvement in the mediation process by increasing children's rights in mediation between parents in matters related to children. Children should have the right to express their opinion, similarly to the court process. Another challenge with mediation is better understanding the role of professionals in mediation, such as lawyers and mediators. Originality/value: The research is anticipated to be of particular benefit to parents, society as a whole, and professionals working in mediation. The results may also be helpful in further legislative initiatives in this area.

Keywords: mediation, family law, children's rights, Australian and Polish family law

Procedia PDF Downloads 66
34606 Change of Physicochemical Properties of Grain in the Germination of Chickpea Grain

Authors: Mira Zhonyssova, Nurlaym Ongarbayeva, Makpal Atykhanova

Abstract:

Indicators of the quality of chickpea grain and the absorption of water at different temperatures by chickpea grain were studied, together with the organoleptic and physicochemical changes during the germination of chickpeas. The total duration of germination of chickpea grain was determined. Analysis of the experimental data showed that the germination time at which the chickpea sprout length reaches 0.5-3 mm varies from 21 to 25 hours. The change in the volume of chickpea grain during germination was investigated. It was found that in the first 2 hours the volume of the chickpeas changes slightly, by 38%; this is due to the process of adsorption of water to a critical state. From 2 to 9 hours, swelling of the chickpea grain is observed: the vital activity of the cells increases, enzymatic systems become active, the respiratory coefficient increases, and gibberellin, which stimulates the formation of a number of enzymes, is released. During this period there is a sharp increase in the volume of the chickpea grains, up to 138%. From 9 to 19 hours, 'sprouting' of the chickpea grains is observed and no morphological changes occur in the corcule; the grain volume remains at 138%. From 19 hours, the grain growth process begins, and the grain volume increases to 143%.

Keywords: chickpea, seeds, legumes, germination, physicochemical properties

Procedia PDF Downloads 44
34605 Constructing the Density of States from the Parallel Wang Landau Algorithm Overlapping Data

Authors: Arman S. Kussainov, Altynbek K. Beisekov

Abstract:

This work focuses on building an efficient, universal procedure to construct a single density of states from the multiple pieces of data provided by a parallel implementation of the Wang-Landau Monte Carlo based algorithm. The Ising and Potts models were used as examples of two-dimensional spin lattices for which densities of states were constructed. The sampled energy space was distributed between the individual walkers with certain overlaps. This was done to accommodate the latest development of the algorithm, the density-of-states replica exchange technique. Several factors of immediate importance for a seamless stitching process have been considered. These include, but are not limited to, the speed and universality of the initial parallel algorithm implementation, as well as the data post-processing required to produce the expected smooth density of states.
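A minimal sketch of the stitching step (not the authors' implementation): Wang-Landau walkers return log densities of states known only up to an additive constant, so overlapping segments can be joined by shifting each new piece by the mean offset over the shared energy window. The energies and values below are invented toy numbers.

```python
def stitch_log_dos(pieces):
    """Join overlapping log-density-of-states segments from parallel walkers.

    Each piece is (energies, log_g) with energies ascending. Wang-Landau fixes
    log_g only up to an additive constant, so each new piece is shifted by the
    mean offset measured over the energies it shares with the running result.
    """
    energies, log_g = list(pieces[0][0]), list(pieces[0][1])
    for e_next, g_next in pieces[1:]:
        idx_a = {e: i for i, e in enumerate(energies)}
        idx_b = {e: i for i, e in enumerate(e_next)}
        shared = [e for e in e_next if e in idx_a]
        # average additive offset over the overlap window
        shift = sum(log_g[idx_a[e]] - g_next[idx_b[e]] for e in shared) / len(shared)
        for e, g in zip(e_next, g_next):
            if e not in idx_a:
                energies.append(e)
                log_g.append(g + shift)
    return energies, log_g

# two walkers covering overlapping energy windows (toy numbers)
piece_a = ([0, 1, 2, 3], [0.0, 1.1, 2.0, 3.2])
piece_b = ([2, 3, 4, 5], [7.0, 8.2, 9.5, 10.1])  # same shape, offset by ~5
energies, log_g = stitch_log_dos([piece_a, piece_b])
```

In practice one would average (rather than discard) the overlapping values and weight the offset by histogram flatness, which is where the speed and post-processing considerations above come in.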

Keywords: density of states, Monte Carlo, parallel algorithm, Wang Landau algorithm

Procedia PDF Downloads 394
34604 Adaptive Auth - Adaptive Authentication Based on User Attributes for Web Application

Authors: Senthuran Manoharan, Rathesan Sivagananalingam

Abstract:

One of the main issues in system security is authentication. Authentication can be defined as the process of recognizing a user's identity, and it is the most important step in the access control process for safeguarding data and resources from being accessed by unauthorized users. Static methods of authentication cannot ensure the genuineness of the user, and for this reason more innovative authentication mechanisms came into play: first two-factor authentication and later multi-factor authentication, introduced to enhance the security of the system. These too had shortcomings, which led to the introduction of adaptive authentication. In this research paper, the design of an adaptive authentication engine is put forward: a user risk profile is calculated based on user parameters, and the user is then challenged with a suitable authentication method.
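A minimal sketch of the idea (the attributes, weights, and thresholds are hypothetical, not the engine described in the paper): a risk score is accumulated from user attributes that deviate from a stored baseline, and the score selects the authentication challenge.

```python
def risk_score(ctx, baseline):
    """Additive risk model over user attributes (illustrative weights only)."""
    score = 0.0
    if ctx["country"] != baseline["country"]:
        score += 0.4                                  # login from unusual country
    if ctx["device_id"] not in baseline["known_devices"]:
        score += 0.3                                  # unrecognized device
    if not baseline["usual_hours"][0] <= ctx["hour"] <= baseline["usual_hours"][1]:
        score += 0.2                                  # outside usual activity hours
    if ctx["failed_attempts"] > 2:
        score += 0.3                                  # repeated recent failures
    return min(score, 1.0)

def choose_challenge(score):
    if score < 0.3:
        return "password"                     # low risk: single factor
    if score < 0.6:
        return "password+otp"                 # medium risk: step up to second factor
    return "password+otp+manual_review"       # high risk: strongest challenge

baseline = {"country": "GB", "known_devices": {"laptop-01"}, "usual_hours": (8, 18)}
ctx = {"country": "FR", "device_id": "phone-77", "hour": 3, "failed_attempts": 0}
print(choose_challenge(risk_score(ctx, baseline)))  # escalates: 0.9 -> manual review
```

A production engine would learn the weights from behavioral data (the machine learning aspect in the keywords) rather than hard-coding them as here.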

Keywords: authentication, adaptive authentication, machine learning, security

Procedia PDF Downloads 227
34603 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data

Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin

Abstract:

High-resolution inline inspection (ILI) tools are used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e., individual anomalies) or as clusters (i.e., colonies of corrosion anomalies). Although ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools, due to limitations of the tools and their sizing algorithms and to the detection threshold of the tools (i.e., the minimum detectable feature dimension). Quantifying the measurement error in ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy safety and economic constraints. The measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) has been scarcely reported in the literature and is investigated in the present study. Limitations in the ILI tool and the clustering process can sometimes cause clustering error, defined as the error introduced during the clustering process by including or excluding a single anomaly or group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributory factors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in ILI-reported anomaly length by comparing ILI data and corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada. The data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
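The comparison at the heart of the framework can be sketched as follows; the anomaly lengths are invented toy values, not the Alberta pipeline data. For each anomaly the ILI-minus-field length difference is computed, and the spread of this error is summarized separately for Type I (no clustering error) and Type II (clustering error) anomalies.

```python
import statistics

def length_error_stats(anomalies):
    """Summarize ILI-vs-field length error (mm) by anomaly type.

    Each anomaly is (ili_length, field_length, has_clustering_error).
    Returns (mean, stdev) of the ILI-minus-field error for Type I
    (no clustering error) and Type II (clustering error) anomalies.
    """
    type1 = [ili - fld for ili, fld, clustered in anomalies if not clustered]
    type2 = [ili - fld for ili, fld, clustered in anomalies if clustered]
    return {
        "type1": (statistics.mean(type1), statistics.stdev(type1)),
        "type2": (statistics.mean(type2), statistics.stdev(type2)),
    }

# toy (ILI length, field length, clustering error?) records, mm
data = [
    (52, 50, False), (48, 49, False), (61, 60, False),   # Type I: small errors
    (120, 70, True), (95, 140, True), (200, 130, True),  # Type II: large errors
]
stats = length_error_stats(data)
```

The much larger scatter of the Type II errors in this toy data mirrors the study's finding, and the boolean flag stands in for the data-mining classifier the abstract proposes.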

Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline

Procedia PDF Downloads 300
34602 Efficiency of Storehouse Management: Case Study of Faculty of Management Science, Suan Sunandha Rajabhat University

Authors: Thidarath Rungruangchaikongmi, Duangsamorn Rungsawanpho

Abstract:

This research aims to investigate the efficiency of storehouse management and to collect the problems in the storehouse work process of the Faculty of Management Science, Suan Sunandha Rajabhat University. The subjects, consisting of the head of the storehouse section and staff members, were sampled through the convenience sampling technique for a sample of 97, and the content analysis technique was used in the analysis of the data. The results of the study revealed that the management efficiency of the storehouse work, with respect to the work process, was found to be consistent with the university's rules and regulations. Delays in particular steps had occurred because additional rules, regulations, or practice guidelines were issued for the sake of work transparency and fast, easy inspection and control. The key problem in the management of storehouse work was the officers' lack of knowledge and understanding of the university's rules, regulations, and practice guidelines.

Keywords: efficiency of storehouse management, faculty of management science, process of storehouse work, Suan Sunandha Rajabhat University

Procedia PDF Downloads 289
34601 Empirical Study of Running Correlations in Exam Marks: Same Statistical Pattern as Chance

Authors: Weisi Guo

Abstract:

It is well established that there may be running correlations in sequential exam marks due to students sitting in the order of course registration patterns. As such, random, non-sequential sampling of exam marks is a standard recommended practice. Here, the paper examines a large body of exam data stretching several years across different modules to see the degree to which this is true. Using the real mark distribution as a generative process, it was found that randomly simulated data exhibited just as much sequential correlation as the real data; that is to say, the running correlations that one often observes are statistically identical to chance. Digging deeper, it was found that some high running correlations do involve students who share a common course history and make similar mistakes. However, at the statistical scale of a module question, the combined effect is statistically similar to a random shuffling of papers. As such, there may be no need to take random samples of marks, but it remains good practice to mark papers in a random sequence to reduce repetitive marking bias and errors.
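The chance baseline used in this kind of study can be sketched as follows: treat marks as i.i.d. draws (a stand-in for the real mark distribution) and ask how often a lag-1 running correlation at least as large as an observed one arises by chance alone. The observed correlation, mark distribution parameters, and script count below are all hypothetical.

```python
import random
import statistics

def lag1_autocorr(xs):
    """Lag-1 serial (running) correlation of a mark sequence."""
    n = len(xs)
    mu = statistics.fmean(xs)
    var = sum((x - mu) ** 2 for x in xs)
    return sum((xs[i] - mu) * (xs[i + 1] - mu) for i in range(n - 1)) / var

random.seed(7)
sims = []
for _ in range(2000):
    marks = [random.gauss(62, 12) for _ in range(40)]  # one module's scripts, i.i.d.
    sims.append(lag1_autocorr(marks))

observed = 0.25  # hypothetical running correlation seen in a real mark sheet
# two-sided empirical p-value: fraction of chance sequences at least this correlated
p_value = sum(1 for s in sims if abs(s) >= abs(observed)) / len(sims)
print(round(p_value, 3))
```

A non-small p-value for such an `observed` value is exactly the paper's point: running correlations of this size are statistically indistinguishable from shuffled papers.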

Keywords: data analysis, empirical study, exams, marking

Procedia PDF Downloads 169
34600 Application Potential of Forward Osmosis-Nanofiltration Hybrid Process for the Treatment of Mining Waste Water

Authors: Ketan Mahawer, Abeer Mutto, S. K. Gupta

Abstract:

Mining wastewater contains inorganic metal salts, which make it saline and additionally contribute to contaminating the surface and underground freshwater reserves near mineral processing industries. Therefore, treatment of the wastewater and water recovery by any available technology is obligatory before it is disposed of into the environment. Currently, reverse osmosis (RO) is the commercially accepted conventional membrane process for saline wastewater treatment, but it consumes an enormous amount of energy, which makes the process expensive. To solve this industrial problem with minimum energy consumption, we tested the feasibility of a forward osmosis-nanofiltration (FO-NF) hybrid process for mining wastewater treatment. The experimental results for 0.029 M saline wastewater treated with a 0.42 M sodium-sulfate-based draw solution show that the specific energy consumption of the FO-NF process was slightly higher (by 0.5-1 kWh/m³) than that of standalone NF. However, the average freshwater recovery was 30% higher than for standalone NF with the same feed and operating conditions. Hence, the FO-NF process in place of RO/NF offers a huge possibility for treating mining industry wastewater and concentrating the metals as by-products without consuming an excessive amount of energy; in addition, it mitigates fouling over long periods of treatment, which also decreases the maintenance and replacement costs of the separation process.

Keywords: forward osmosis, nanofiltration, mining, draw solution, divalent solute

Procedia PDF Downloads 109
34599 Electron Beam Melting Process Parameter Optimization Using Multi Objective Reinforcement Learning

Authors: Michael A. Sprayberry, Vincent C. Paquit

Abstract:

Process parameter optimization in metal powder bed electron beam melting (MPBEBM) is crucial to ensure the technology's repeatability and control and its continued industrial adoption. Despite continued efforts to address the challenges via the traditional design of experiments and process mapping techniques, there has been little success in producing an on-the-fly optimization framework that can be adapted to MPBEBM systems. Additionally, data-intensive physics-based modeling and simulation methods are difficult to support for a given metal AM alloy or system due to cost restrictions. To mitigate the challenge of resource-intensive experiments and models, this paper introduces a Multi-Objective Reinforcement Learning (MORL) methodology, defined as an optimization problem, for MPBEBM. An off-policy MORL framework based on policy gradient is proposed to discover optimal sets of beam power (P) - beam velocity (v) combinations that maintain a steady-state melt pool depth and phase transformation. For this, an experimentally validated Eagar-Tsai melt pool model is used to simulate the MPBEBM environment, where the beam acts as the agent across the P-v space to maximize returns for the uncertain powder bed environment, producing a melt pool and phase transformation closer to the optimum. The culmination of the training process yields a set of process parameters {power, speed, hatch spacing, layer depth, and preheat} in which the state (P, v) with the highest returns corresponds to a refined process parameter mapping. The resulting objects and mapping of returns to the P-v space show convergence with experimental observations. The framework therefore provides a model-free multi-objective approach to discovery without the need for trial-and-error experiments.
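As a drastically simplified stand-in for the MORL framework above (not the authors' policy-gradient method), the sketch below runs an epsilon-greedy bandit over a discrete P-v grid against a toy melt-pool-depth surrogate with noisy observations. The grid values, target depth, surrogate formula, and reward weights are all invented for illustration.

```python
import math
import random

random.seed(0)
powers = [i * 50 for i in range(4, 13)]    # 200..600 W, hypothetical grid
speeds = [i * 100 for i in range(2, 11)]   # 200..1000 mm/s, hypothetical grid

def melt_depth(p, v):
    """Toy surrogate: depth grows with power, shrinks with velocity."""
    return 0.9 * p / math.sqrt(v)

TARGET = 12.0  # desired steady-state melt pool depth (arbitrary units)

def reward(p, v):
    # scalarized multi-objective return: depth tracking plus a mild energy penalty
    return -abs(melt_depth(p, v) - TARGET) - 1e-4 * p

q = {(p, v): 0.0 for p in powers for v in speeds}  # value estimate per (P, v) arm
n = {a: 0 for a in q}
for step in range(5000):
    if random.random() < 0.1:
        a = random.choice(list(q))                 # explore
    else:
        a = max(q, key=q.get)                      # exploit current best estimate
    r = reward(*a) + random.gauss(0, 0.5)          # noisy process observation
    n[a] += 1
    q[a] += (r - q[a]) / n[a]                      # incremental sample-average update

best_p, best_v = max(q, key=q.get)
```

The learned argmax lands near the depth-matching ridge of the surrogate, which is the toy analogue of the refined P-v process map the abstract describes.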

Keywords: additive manufacturing, metal powder bed fusion, reinforcement learning, process parameter optimization

Procedia PDF Downloads 83
34598 A Case Study of an Artist Diagnosed with Schizophrenia Using the Graphic Rorschach (Digital Version) “GRD”

Authors: Maiko Kiyohara, Toshiki Ito

Abstract:

In this study, we report the psychotherapy process of a patient with dissociative disorder together with the Graphic Rorschach (Digital version) (GRD). A dissociative disorder is a type of dissociation characterized by multiple alternating personalities (also called alternate identities). 'Dissociation' is a state in which consciousness, memory, thinking, emotion, perception, behavior, body image, and so on are divided and experienced separately. Dissociation symptoms, such as gaps in memory, are seen, and the repetition of blanks in daily events causes serious problems in life. The pathological mechanism of dissociation has not yet been fully elucidated, but it is said to be caused by childhood abuse or shocking trauma. In Japan, no reliable data have been reported on the number of patients with, or the prevalence of, dissociative disorders; no drug addresses dissociation symptoms, and no clear treatment has been established. The GRD is a method the author devised in 2017 by revising the Graphic Rorschach, a special technique in which subjects draw their verbal responses when taking the Rorschach. By introducing a tablet computer for the drawing response, the GRD reduces the burden on both the subject and the examiner, reduces the complexity of organizing the data, improves the simplicity of organizing the data, and improves the accuracy of interpretation; we are conducting research for this purpose. The patient in this case is a woman in her 50s who has had multiple personalities since childhood. At present, about ten personalities have been identified, and the main personality has only just been grasped. The patient is raising her junior high school sons as a single parent, but personality changes often occur at home, which degrades the home environment, creates economic pressure, and has severely hindered daily life. In psychotherapy, personalities different from the main personality have appeared, and psychotherapy has also been conducted with her son. In this case, the psychotherapy process and the GRD were used to understand the patient's personality characteristics, and the possible therapeutic significance for personality integration is reported.

Keywords: GRD, dissociative disorder, psychotherapy process case study, dissociation

Procedia PDF Downloads 104
34597 Implementing Delivery Drones in Logistics Business Process: Case of Pharmaceutical Industry

Authors: Nikola Vlahovic, Blazenka Knezevic, Petra Batalic

Abstract:

In this paper, we present research on the feasibility of implementing unmanned aerial vehicles, also known as 'drones', in logistics. The research is based on available information about current incentives and experiments in the application of delivery drones for commercial use. An overview of current pilot projects and the literature, together with the challenges identified, is compiled and presented. Based on these findings, we present a conceptual model of a business process that implements delivery drones in business-to-business logistic operations. The business scenario is based on a pharmaceutical supply chain. Simulation modeling is used to create models for running experiments and collecting performance data. A comparative study of the presented conceptual model is given. The work outlines the main advantages and disadvantages of implementing unmanned aerial vehicles in delivery services as a supplementary distribution channel along the supply chain.
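The kind of simulation experiment described above can be illustrated with a minimal Monte Carlo sketch. All parameters below (speeds, handling times, distance range) are invented for illustration and are not values from the study:

```python
import random

def simulate_deliveries(n, speed_kmh, handling_min, seed=0):
    """Return the mean delivery time in minutes over n simulated runs.

    Straight-line distances are drawn uniformly from 1-10 km (an
    illustrative assumption, not data from the study).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        distance_km = rng.uniform(1.0, 10.0)
        total += distance_km / speed_kmh * 60 + handling_min
    return total / n

# Hypothetical parameters: a 60 km/h drone with 4 min of handling versus a
# van averaging 25 km/h that drives 1.4x the straight-line distance
# (modelled here as a lower effective speed) and needs 8 min for parking
# and handover.
drone_mean = simulate_deliveries(10_000, speed_kmh=60, handling_min=4)
van_mean = simulate_deliveries(10_000, speed_kmh=25 / 1.4, handling_min=8)
print(f"drone: {drone_mean:.1f} min, van: {van_mean:.1f} min")
```

Runs like this, repeated over different scenarios and cost assumptions, would produce the performance data used in the comparative study.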

Keywords: business process, delivery drones, logistics, simulation modelling, unmanned aerial vehicles

Procedia PDF Downloads 383
34596 The Lethal Autonomy and Military Targeting Process

Authors: Serdal Akyüz, Halit Turan, Mehmet Öztürk

Abstract:

The future security environment will bring new battlefields and new enemies; the boundaries of the battlefield and the identities of enemies will not be easy to discern. Politicians may not want to lose their soldiers in high-risk operations. This approach paves the way for smart machines such as war robots and new drones. These machines will have decision-making ability and will act autonomously, and this ability can change the military targeting process. The military targeting process (MTP) draws on a wide range of lethal and non-lethal weapons to reach an intended end-state. This process is currently managed by people, but in the future smart machines may carry it out themselves. At first sight, this development seems beneficial to humanity, as it may reduce casualties in war. However, using robots that can decide, detect, deliver, and assess without human support for homeland security and against terrorists carries crucial risks and threats: it may reduce havoc, but it may also increase collateral damage. This paper examines the current use of smart war machines and the military targeting process, and presents a new approach to the MTP from the point of view of the lethal autonomy concept.

Keywords: autonomous weapon systems, lethal autonomy, military targeting process (MTP)

Procedia PDF Downloads 418
34595 Contextual Sentiment Analysis with Untrained Annotators

Authors: Lucas A. Silva, Carla R. Aguiar

Abstract:

This work presents a proposal for performing contextual sentiment analysis using a supervised learning algorithm while dispensing with the extensive training of annotators. To achieve this goal, a web platform was developed to carry out the entire procedure outlined in this paper. The main contribution of the pipeline described in this article is to simplify and automate the annotation process through a system that analyzes the congruence between annotations. This ensured satisfactory results even without specialized annotators in the context of the research, avoiding the generation of biased training data for the classifiers. A case study was conducted on an entrepreneurship blog. The experimental results were consistent with those reported in the literature for annotation performed by experts following a formalized process.
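The pipeline described above can be sketched roughly as follows. This is a simplified illustration with an invented toy dataset and a strict unanimity rule for congruence; the platform's actual congruence analysis may differ:

```python
import math
from collections import Counter, defaultdict

def congruent(labels):
    """Keep an example only if all annotators agree (a strict congruence rule)."""
    return len(set(labels)) == 1

def train_nb(examples):
    """Train a multinomial Naive Bayes model from (text, label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in examples:
        class_counts[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def classify(model, text):
    """Return the most probable label under the Naive Bayes model."""
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)
        total_words = sum(word_counts[label].values())
        for word in text.lower().split():
            # Laplace smoothing over the shared vocabulary.
            score += math.log((word_counts[label][word] + 1)
                              / (total_words + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy data: each post was labelled by three untrained annotators.
annotated = [
    ("great advice for new founders", ["pos", "pos", "pos"]),
    ("this pitch was a disaster", ["neg", "neg", "neg"]),
    ("loved the funding tips", ["pos", "pos", "pos"]),
    ("terrible waste of time", ["neg", "neg", "pos"]),  # incongruent: dropped
]
training = [(t, labels[0]) for t, labels in annotated if congruent(labels)]
model = train_nb(training)
print(classify(model, "great tips"))  # -> "pos"
```

On this toy data the incongruent fourth example is discarded before training, which is the mechanism that keeps biased or careless annotations out of the training set.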

Keywords: sentiment analysis, untrained annotators, naive bayes, entrepreneurship, contextualized classifier

Procedia PDF Downloads 381
34594 Science Communication: A Possible Dialogue between Researchers and Agribusiness Farmers

Authors: Cristiane Hengler Corrêa Bernardo

Abstract:

Communication is an essential part of the process that characterizes scientific research and should be present in every stage of research in a systemic way. However, this process is not always efficient and effective. Reports from researchers focused on agribusiness point to difficulties in communicating with farmers that negatively impact research results and may cause distortions and even quite significant inconsistencies. This research aims to identify the main noise and barriers in communication between agribusiness researchers and farmers, and discusses the possibility of creating a specific strategy to correct or minimize such failures. The main research question is: which features of the communication process are decisive for communication between agribusiness researchers and farmers to occur with greater efficiency? It is expected that the research will result in processes that correct or minimize such problems, promoting more efficient dialogue and exchange of knowledge. The research will adopt a qualitative approach, using action research as a form of investigative action of a social and educational nature, aiming to promote understanding and interaction between researchers and members of the investigated situations. Data will be collected and analyzed through document analysis, questionnaires, interviews, and content analysis.

Keywords: agribusiness farmers, researchers, science communication, analysis

Procedia PDF Downloads 267
34593 Academic Success, Problem-Based Learning and the Middleman: The Community Voice

Authors: Isabel Medina, Mario Duran

Abstract:

Although problem-based learning (PBL) provides students with multiple opportunities for rigorous instructional experiences in which they are challenged to address problems in the community, there are still gaps in connecting community leaders to the PBL process. At a South Texas high school, community participation serves as an integral component of the PBL process. PBL has recently gained momentum due to the increase in global communities that value collaboration and critical thinking. As an instructional approach, PBL engages high school students in meaningful learning experiences and focuses on providing students with a connection to real-world situations that require effective peer collaboration. For PBL leaders, providing students with a meaningful process is as important as the final PBL outcome. To achieve this goal, STEM High School strategically created a space for community involvement to be woven into the PBL fabric. This study examines the impact community members had on PBL students attending a STEM high school in South Texas. At STEM High School, community members represent a support system that works through the PBL process to ensure students receive real-life mentoring from business and industry leaders situated in the community. A phenomenological study using a semi-structured approach was used to collect data about students' perception of community involvement within the PBL process at one South Texas high school. In our proposed presentation, we will discuss how community involvement in the PBL process impacted the academic experience of high school students at STEM High School. We address the concerns PBL critics raise about the lack of direct instruction by showing how STEM High School utilizes community members to help shape the academic experience of its students.

Keywords: phenomenological, STEM education, student engagement, community involvement

Procedia PDF Downloads 83
34592 A Comprehensive Survey and Improvement to Existing Privacy Preserving Data Mining Techniques

Authors: Tosin Ige

Abstract:

Ethics must be a condition of the world, like logic (Ludwig Wittgenstein, 1889-1951). As important as data mining is, it poses a significant threat to ethics, privacy, and legality, since it makes it difficult for an individual, or a consumer in the case of a company, to control the accessibility and usage of their data. This research focuses on current issues and the latest research and development in privacy-preserving data mining methods as of 2022. It discusses advances in these techniques while highlighting the shortcomings of an existing privacy-preserving data mining technique and providing a new technique as a solution. This paper also bridges the wide gap between data mining and the web application programming interface (web API), where research is urgently needed for an added layer of security in data mining, while introducing a seamless and more efficient way of mining data.
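The abstract does not spell out the new technique, but a classic and easily illustrated privacy-preserving data collection method is Warner's randomized response, sketched below. This is an illustration of the general idea of privacy-preserving mining, not the technique proposed in the paper:

```python
import random

def randomized_response(truth, p=0.75, rng=random):
    """Report the true boolean answer with probability p, else a fair coin flip.

    Any individual answer is deniable, yet the population rate can still be
    estimated from the aggregate.
    """
    if rng.random() < p:
        return truth
    return rng.random() < 0.5

def estimate_rate(responses, p=0.75):
    """Invert the randomization: E[yes] = p * rate + (1 - p) * 0.5."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p) * 0.5) / p

rng = random.Random(42)
true_rate = 0.30  # hypothetical sensitive attribute held by 30% of people
responses = [randomized_response(rng.random() < true_rate, rng=rng)
             for _ in range(100_000)]
print(f"estimated rate: {estimate_rate(responses):.3f}")  # close to 0.30
```

The aggregate estimate recovers the true rate closely, while no single response reveals the respondent's actual answer.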

Keywords: data, privacy, data mining, association rule, privacy preserving, mining technique

Procedia PDF Downloads 153
34591 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD datasets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make network access widely available, while the volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.
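As a concrete illustration of the MapReduce processing model that underlies Hadoop, the toy sketch below counts words across documents in plain Python. This is a didactic emulation of the map, shuffle, and reduce phases, not actual Hadoop code:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Mapper: emit a (word, 1) pair for every word in the document."""
    return [(word, 1) for word in document.lower().split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework does
    between the map and reduce phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

documents = ["big data in the public sector", "big data analytics"]
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, documents))))
print(counts["big"])  # -> 2
```

In a real Hadoop deployment the mappers and reducers run in parallel across a cluster; the toy version makes the data flow between the phases visible.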

Keywords: big data, big data analytics, Hadoop, cloud

Procedia PDF Downloads 295
34590 Outcomes of Pain Management for Patients in Srinagarind Hospital: Acute Pain Indicator

Authors: Chalermsri Sorasit, Siriporn Mongkhonthawornchai, Darawan Augsornwan, Sudthanom Kamollirt

Abstract:

Background: Although knowledge of pain and pain management is improving, pain management remains inadequate for many patients. The Nursing Division of Srinagarind Hospital is responsible for setting up the pain management system, including work instruction development and pain management indicators. We have developed an information technology program for monitoring pain quality indicators, which was implemented in all nursing departments in April 2013. Objective: To study the outcomes of acute pain management in terms of process and outcome indicators. Method: This is a retrospective descriptive study. The sample population comprised patients who had acute pain 24-48 hours after undergoing a procedure while admitted to Srinagarind Hospital in 2014. Data were collected from the information technology program; 2,709 patients with acute pain from 10 nursing departments were recruited into the study. The research tools were 1) a demographic questionnaire, 2) a pain management questionnaire for process indicators, and 3) a pain management questionnaire for outcome indicators. Data were analyzed and presented as percentages and means. Results: The process indicators show that nurses used a pain assessment tool and documented pain in 99.19% of cases; pain was reassessed after intervention in 96.09% of cases. Opioids were given as pain medication to 80.15% of the patients, and the most frequently used non-pharmacological intervention was positioning (76.72%). For the outcome indicators, nearly half of the patients (49.90%) had moderate to severe pain; the mean score for worst pain was 6.48 and for overall pain 4.08. Patient satisfaction with pain management was good (49.17%) or very good (46.62%). Conclusion: Nurses used pain assessment tools and pain documentation, which met the goal of the pain management process, and patient satisfaction with pain management was high. However, patients still experienced moderate to severe pain. Nurses should adhere more strictly to pain management guidelines, especially the acute pain guidelines when pain intensity is moderate to high, and should develop and practice a non-pharmacological pain management program to continually improve the quality of pain management. The information technology program should include more detail on non-pharmacological pain techniques.

Keywords: outcome, pain management, acute pain, Srinagarind Hospital

Procedia PDF Downloads 218
34589 System for Monitoring Marine Turtles Using Unstructured Supplementary Service Data

Authors: Luís Pina

Abstract:

The conservation of marine biodiversity keeps ecosystems in balance and ensures the sustainable use of resources. In this context, technological resources have been used to monitor marine species and allow biologists to obtain data in real time. Different mobile applications have been developed for data collection for monitoring purposes, but these systems are designed to run only on third-generation (3G) phones or smartphones with Internet access, and in rural parts of developing countries Internet services and smartphones are scarce. Thus, the objective of this work is to develop a system to monitor marine turtles using Unstructured Supplementary Service Data (USSD), which users can access through basic mobile phones. The system aims to improve the data collection mechanism and enhance the effectiveness of current sea turtle monitoring systems by supporting any type of mobile device without Internet access. The system will be able to report information related to the biological activities of marine turtles and will serve as a platform through which marine conservation entities can receive reports of illegal sales of sea turtles. It can also be used as an educational tool for communities, providing knowledge and including communities in the process of monitoring marine turtles. This work may therefore contribute information to decision-making and to the implementation of contingency plans for marine conservation programs.
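A USSD service is essentially a stateful menu dialogue: the gateway forwards each keypress from the basic phone and the application replies with the next prompt. The sketch below shows how such a sighting-report flow might look; the menu texts, session handling, and the "CON"/"END" reply convention are hypothetical illustrations, not the actual system:

```python
# Minimal USSD-style menu state machine (hypothetical flow, for illustration).
SESSIONS = {}  # session_id -> list of inputs collected so far

MENU = [
    "Marine Turtle Monitor\n1. Report sighting\n2. Report illegal sale",
    "Enter beach/zone code:",
    "Number of turtles seen:",
]

def handle_ussd(session_id, user_input):
    """Return the next prompt: a 'CON' reply keeps the session open,
    an 'END' reply closes it and stores the collected report."""
    steps = SESSIONS.setdefault(session_id, [])
    if user_input:
        steps.append(user_input)
    if len(steps) < len(MENU):
        return "CON " + MENU[len(steps)]
    report = dict(zip(("type", "zone", "count"), steps))
    del SESSIONS[session_id]
    return f"END Report received: {report}"
```

A session then proceeds prompt by prompt: an empty first input yields the main menu, "1" asks for the zone code, the code asks for the turtle count, and the final input closes the session with the assembled report.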

Keywords: GSM, marine biology, marine turtles, unstructured supplementary service data (USSD)

Procedia PDF Downloads 195
34588 Semantic Data Schema Recognition

Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia

Abstract:

The work covered in this paper aims to assist users in their data quality approach. The goal is to better extract, mix, interpret, and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning each column to a category and possibly a sub-category; secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows the automatic detection of a large number of syntactic and semantic anomalies.
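As a simplified illustration of the categorization step, each column can be assigned a semantic category by testing its values against pattern recognizers. The recognizers, categories, and threshold below are invented examples; the paper's method draws on much richer data and metadata:

```python
import re

# Hypothetical recognizers mapping a semantic category to a value pattern.
RECOGNIZERS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "date":  re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone": re.compile(r"^\+?\d[\d\s-]{6,}$"),
}

def categorize_column(values, threshold=0.8):
    """Assign the first category whose pattern matches at least `threshold`
    of the column's non-empty values; otherwise report 'unknown'."""
    non_null = [v for v in values if v]
    for category, pattern in RECOGNIZERS.items():
        hits = sum(1 for v in non_null if pattern.match(v))
        if non_null and hits / len(non_null) >= threshold:
            return category
    return "unknown"

column = ["ana@example.org", "joe@example.org", "li@example.org",
          "mo@example.org", "bad-entry"]
print(categorize_column(column))  # 4 of 5 values match -> "email"
```

The threshold tolerates a share of dirty values, which is exactly the kind of syntactic anomaly the recognized category then helps to flag and correct.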

Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns

Procedia PDF Downloads 410
34587 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithm: k-means for clustering numeric datasets and k-modes for categorical datasets. A recurring problem in data mining applications is clustering the categorical data that is so prevalent in real datasets. One way to cluster categorical values is to transform the categorical attributes into numeric measures and apply the k-means algorithm directly instead of the k-modes. In this paper, we propose and evaluate an approach that transforms the categorical values into numeric ones using the relative frequency of each modality in its attribute. The proposed approach is compared with a previously published method based on transforming categorical datasets into binary values, and the scalability and accuracy of the two methods are evaluated experimentally. The results obtained show that our proposed method outperforms the binary method in all cases.
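The proposed transformation can be sketched as follows (a minimal reconstruction from the abstract, with an invented toy dataset): each categorical value is replaced by the relative frequency of its modality within its attribute, and ordinary k-means then runs on the resulting numeric matrix:

```python
import math
from collections import Counter

def frequency_encode(rows):
    """Replace each categorical value by the relative frequency of its
    modality within its attribute (column)."""
    n_rows, n_cols = len(rows), len(rows[0])
    freqs = [Counter(row[j] for row in rows) for j in range(n_cols)]
    return [[freqs[j][row[j]] / n_rows for j in range(n_cols)] for row in rows]

def kmeans(points, k, iters=50):
    """Plain k-means on numeric vectors, seeded deterministically with the
    first k distinct points; returns a cluster label for each point."""
    centers = []
    for p in points:
        if p not in centers:
            centers.append(list(p))
        if len(centers) == k:
            break
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
                  for p in points]
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(dim) / len(members) for dim in zip(*members)]
    return labels

# Toy categorical dataset (invented) with two latent groups.
data = [["red", "small"], ["blue", "large"], ["red", "small"],
        ["red", "small"], ["red", "small"], ["blue", "large"]]
labels = kmeans(frequency_encode(data), k=2)
print(labels)  # -> [0, 1, 0, 0, 0, 1]
```

Here "red" maps to 4/6 and "blue" to 2/6, so rows sharing rare modalities land close together in the numeric space and k-means separates the two groups.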

Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means

Procedia PDF Downloads 248
34586 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases the number of crimes committed in the digital environment, and digital forensics, which deals with such crimes, has therefore become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all data in the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner; moreover, because the results depend on that experience, relevant evidence may be overlooked and outcomes may vary between cases. In this study, a hash-based matching and digital evidence evaluation method is proposed, with the aim of automatically classifying the evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
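The core of such a hash-based approach can be sketched as follows (an illustration of the general technique, not the authors' tool): the evidence is split into fixed-size blocks, each block is hashed, and the digests are matched against a hash list of known relevant content:

```python
import hashlib

BLOCK_SIZE = 4096  # an illustrative block size

def block_hashes(data, block_size=BLOCK_SIZE):
    """Yield (offset, SHA-256 digest) for each fixed-size block of the data."""
    for offset in range(0, len(data), block_size):
        yield offset, hashlib.sha256(data[offset:offset + block_size]).hexdigest()

def match_blocks(evidence, hash_list):
    """Return the offsets of evidence blocks whose digest is in the hash list."""
    return [offset for offset, digest in block_hashes(evidence)
            if digest in hash_list]

# Build a hash list from a known file, then scan an evidence image for it.
known = b"A" * BLOCK_SIZE
hash_list = {digest for _, digest in block_hashes(known)}
evidence = b"B" * BLOCK_SIZE + known + b"C" * BLOCK_SIZE
print(match_blocks(evidence, hash_list))  # -> [4096]
```

Because set membership tests are constant-time, scanning even large images reduces to hashing each block once, which is what makes automatic classification faster than manual keyword-driven review.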

Keywords: block matching, digital evidence, hash list, evaluation of digital evidence

Procedia PDF Downloads 244
34585 Access Control System for Big Data Application

Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud

Abstract:

Access control systems (ACs) are among the most important components of security infrastructures. Inaccuracies in regulatory frameworks make tailored policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the dissemination of Big Data domains, since it is crucial to secure the data provided to data consumers (DC). We present a general access control dissemination strategy for the Big Data domain, describing the benefits of using dedicated access control for BD units and taking into consideration the performance needs of both the BD and the AC system. We then present a generic Big Data access control system designed to improve the dissemination of Big Data.
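One minimal way to make the idea concrete is an attribute-based access check for data consumers, sketched below. The domains, attributes, and policy rules are invented for illustration; the paper's model is more general:

```python
# Hypothetical ABAC-style policy: each rule states which consumer attributes
# are required to read a given data domain.
POLICIES = {
    "health_records": {"role": "analyst", "clearance": "high"},
    "public_stats":   {"role": "analyst"},
}

def can_access(consumer, domain):
    """Grant access only if the consumer holds every attribute the domain's
    policy demands; unknown domains are denied by default."""
    required = POLICIES.get(domain)
    if required is None:
        return False  # default-deny for unknown domains
    return all(consumer.get(attr) == value for attr, value in required.items())

alice = {"role": "analyst", "clearance": "high"}
bob = {"role": "analyst", "clearance": "low"}
print(can_access(alice, "health_records"), can_access(bob, "health_records"))
# -> True False
```

The default-deny rule for unlisted domains reflects the secure-by-default stance such a dissemination strategy would need when new BD units are added.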

Keywords: access control, security, Big Data, domain

Procedia PDF Downloads 121