Search results for: WEKA data mining tool
26512 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features
Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova
Abstract:
Emotion recognition is a challenging problem that remains open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and the validity and expressiveness of different emotions are discussed. A comparison is made between classifiers built from facial data only, voice data only, and the combination of both. The need for a better combination of the information from facial expressions and voice data is argued.
Keywords: emotion recognition, facial recognition, signal processing, machine learning
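As an illustration of the comparison described above, the following minimal Python sketch trains Support Vector Machine classifiers on voice-only, face-only, and combined feature sets and compares their cross-validated accuracies. The feature arrays, label set, and feature dimensions are synthetic placeholders, not the authors' data or pipeline.

```python
# Minimal sketch: compare SVM classifiers built from voice features, facial
# features, and their combination. All data here are random placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
voice_features = rng.normal(size=(n, 20))    # e.g., pitch/energy statistics
facial_features = rng.normal(size=(n, 30))   # e.g., landmark-based descriptors
labels = rng.integers(0, 6, size=n)          # six basic emotion classes

for name, X in [("voice only", voice_features),
                ("face only", facial_features),
                ("voice + face", np.hstack([voice_features, facial_features]))]:
    clf = SVC(kernel="rbf", C=1.0)
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```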
Procedia PDF Downloads 316
26511 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels
Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur
Abstract:
With emerging threats in the digital world, we need to work continuously on security in all its aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data by various entities lead to cybercrime and more attacks on the privacy and security of persons. Cryptography has always been employed to prevent unauthorized access to important data through many processes. Symmetric-key and asymmetric-key cryptography have been used to keep data secret both at rest and in transmission. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with particular emphasis on elliptic curve cryptography and its underlying algebraic mathematics.
Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography
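Since the abstract emphasises elliptic curve cryptography, the sketch below shows one representative use of it: an ECDH key agreement whose shared secret is hashed into a symmetric key that could protect cloud-stored data. It uses the pyca/cryptography library; the curve choice, key length, and cloud-storage framing are illustrative assumptions, not the authors' scheme.

```python
# Minimal ECC sketch: ECDH key agreement followed by HKDF key derivation.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

client_private = ec.generate_private_key(ec.SECP256R1())
server_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the other's public key.
shared_client = client_private.exchange(ec.ECDH(), server_private.public_key())
shared_server = server_private.exchange(ec.ECDH(), client_private.public_key())
assert shared_client == shared_server

# Derive a 256-bit symmetric key, e.g. for encrypting data at rest in the cloud.
sym_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"cloud-data-key").derive(shared_client)
print(sym_key.hex())
```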
Procedia PDF Downloads 124
26510 Simo-syl: A Computer-Based Tool to Identify Language Fragilities in Italian Pre-Schoolers
Authors: Marinella Majorano, Rachele Ferrari, Tamara Bastianello
Abstract:
Recent technological advances make it possible to apply innovative, multimedia screen-based assessment tools to test children's language and early literacy skills, monitor their growth over the preschool years, and test their readiness for primary school. A computer-based assessment tool offers several advantages over paper-based tools. Firstly, computer-based tools that provide games, videos, and audio may be more motivating and engaging for children, especially for those with language difficulties. Secondly, computer-based assessments are generally less time-consuming than traditional paper-based assessments: this makes them less demanding for children and provides clinicians and researchers, but also teachers, with the opportunity to test children multiple times over the same school year and, thus, to monitor their language growth more systematically. Finally, while paper-based tools require offline coding, computer-based tools sometimes allow scores to be calculated automatically, thus producing less subjective evaluations of the assessed skills and providing immediate feedback. Nonetheless, using computer-based assessment tools to test meta-phonological and language skills in children is not yet common practice in Italy. The present contribution aims to estimate the internal consistency of a computer-based assessment (i.e., the Simo-syl assessment). Sixty-three Italian pre-schoolers aged between 4;10 and 5;9 years were tested at the beginning of the last year of preschool through paper-based standardised tools on their lexical (Peabody Picture Vocabulary Test), morpho-syntactical (Grammar Repetition Test for Children), meta-phonological (Meta-Phonological Skills Evaluation Test), and phono-articulatory skills (non-word repetition). The same children were tested through the Simo-syl assessment on their phonological and meta-phonological skills (e.g., recognising syllables and vowels and reading syllables and words). The internal consistency of the computer-based tool was acceptable (Cronbach's alpha = .799). Children's scores obtained in the paper-based assessment and scores obtained in each task of the computer-based assessment were correlated. Significant positive correlations emerged between all the tasks of the computer-based assessment and the scores obtained in the CMF (r = .287 - .311, p < .05) and in the correct sentences in the RCGB (r = .360 - .481, p < .01); the non-word repetition standardised test correlated significantly with the reading tasks only (r = .329 - .350, p < .05). Further tasks should be included in the current version of Simo-syl to achieve a comprehensive and multi-dimensional approach when assessing children. However, such a tool gives teachers a good chance to identify language-related problems early, even in the school environment.
Keywords: assessment, computer-based, early identification, language-related skills
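For readers unfamiliar with the internal-consistency statistic reported above, the following minimal Python sketch computes Cronbach's alpha from an item-score matrix (rows = children, columns = tasks). The scores are synthetic, not the Simo-syl data; only the formula itself is taken from standard practice.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
scores = rng.integers(0, 10, size=(63, 6)).astype(float)  # 63 children, 6 tasks
print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```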
Procedia PDF Downloads 183
26509 The Use of Haar Wavelet Mother Signal Tool for Performance Analysis Response of Distillation Column (Application to Moroccan Case Study)
Authors: Mahacine Amrani
Abstract:
This paper reviews some Moroccan industrial applications of wavelets, especially the dynamic identification of a process model using the Haar mother wavelet response. Two recent Moroccan case studies are described using dynamic data originating from a distillation column and an industrial polyethylene process plant. The purpose of the wavelet scheme is to build on-line dynamic models. In both case studies, a comparison is carried out between the Haar mother wavelet response model and a linear difference equation model. Finally, on the basis of the comparison of the process performances and the best responses, it draws conclusions that may be useful for creating an estimated on-line internal model control and for its application to model-predictive controllers (MPC). All calculations were implemented using AutoSignal software.
Keywords: process performance, model, wavelets, Haar, Moroccan
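The sketch below illustrates the kind of Haar decomposition the paper builds on, using PyWavelets on a synthetic step response rather than the distillation-column or polyethylene-plant data; the decomposition level and the signal itself are illustrative assumptions.

```python
# Multi-level discrete wavelet transform with the Haar mother wavelet.
import numpy as np
import pywt

t = np.linspace(0, 10, 256)
response = 1 - np.exp(-t / 2) + 0.02 * np.random.default_rng(2).normal(size=t.size)

coeffs = pywt.wavedec(response, "haar", level=4)
approx, details = coeffs[0], coeffs[1:]
print("approximation length:", len(approx))
print("detail lengths:", [len(d) for d in details])

# A coarse model of the response can be rebuilt from the approximation alone.
smooth = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "haar")
```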
Procedia PDF Downloads 317
26508 Utilization of Cloud-Based Learning Platform for the Enhancement of IT Onboarding System
Authors: Christian Luarca
Abstract:
The study aims to determine the efficiency of e-training delivered through a cloud platform as part of the onboarding process for IT support engineers. Traditional lecture-based training requires human resources to guide and assist new hires during onboarding, which takes time and effort. Using an electronic medium as a training platform provides basic two-way communication that can be repeated as often as needed. The study focuses on determining the most efficient manner of learning the basic knowledge of IT support in the shortest time possible. This was determined by delivering the same set of knowledge transfer categories through two different approaches, one being e-training and the other the traditional method. Performance assessment will be done using the Service Tracker Assessment (STA) Tool and Service Manager. Data gathered from this ongoing study will promote the utilization of e-training in the IT onboarding process.
Keywords: cloud platform, e-Training, efficiency, onboarding
Procedia PDF Downloads 150
26507 Enhanced Automated Teller Machine Using Short Message Service Authentication Verification
Authors: Rasheed Gbenga Jimoh, Akinbowale Nathaniel Babatunde
Abstract:
The Automated Teller Machine (ATM) has become an important tool for commercial banks, and bank customers have come to depend on and trust ATMs to conveniently meet their banking needs. Although the overwhelming advantages of the ATM cannot be over-emphasized, its alarming fraud rate has become a bottleneck in its full adoption in Nigeria. This study examined the menace of ATM fraud in society and the cost of running ATM services by banks in the country. The researcher developed a prototype of enhanced Automated Teller Machine authentication using Short Message Service (SMS) verification. The developed prototype was tested by ten (10) respondents who are users of ATM cards in the country, and the data collected were analyzed using the Statistical Package for Social Science (SPSS). Based on the results of the analysis, it is envisaged that the developed prototype will go a long way in reducing the alarming rate of ATM fraud in Nigeria.
Keywords: ATM, ATM fraud, e-banking, prototyping
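The abstract does not detail the prototype's internals, so the following Python sketch only illustrates a generic SMS one-time-password flow of the kind such a system might use: issue a short-lived code, deliver it via an SMS gateway, and verify it before releasing the transaction. The gateway call, code length, and expiry time are assumptions, and send_sms is a stub rather than a real service.

```python
# Generic OTP issue/verify flow; send_sms is a placeholder for a real gateway.
import secrets
import time

def send_sms(phone_number: str, message: str) -> None:
    print(f"[SMS to {phone_number}] {message}")

def issue_otp(phone_number: str, ttl_seconds: int = 120) -> dict:
    code = f"{secrets.randbelow(10**6):06d}"          # 6-digit one-time code
    send_sms(phone_number, f"Your ATM code is {code}")
    return {"code": code, "expires_at": time.time() + ttl_seconds}

def verify_otp(entered: str, issued: dict) -> bool:
    not_expired = time.time() < issued["expires_at"]
    return not_expired and secrets.compare_digest(entered, issued["code"])

token = issue_otp("+2340000000000")
print("verified:", verify_otp("123456", token))   # True only if the code matches
```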
Procedia PDF Downloads 322
26506 An Improved Approach for Hybrid Rocket Injection System Design
Authors: M. Invigorito, G. Elia, M. Panelli
Abstract:
Hybrid propulsion combines beneficial properties of both solid and liquid rockets, such as multiple restarts and throttleability as well as simplicity and reduced costs. A nitrous oxide (N2O)/paraffin-based hybrid rocket engine demonstrator is currently under development at the Italian Aerospace Research Center (CIRA) within the national research program HYPROB, funded by the Italian Ministry of Research. Nitrous oxide belongs to the class of self-pressurizing propellants that exhibit a high vapor pressure at standard ambient temperature. This peculiar feature makes these fluids very attractive for space rocket applications because it avoids the use of complex pressurization systems, leading to great benefits in terms of weight savings and reliability. To avoid feed-system-coupled instabilities, the phase change is required to occur through the injectors. In this regard, the oxidizer is stored in the liquid state while target chamber pressures are designed to lie below the vapor pressure. The consequent cavitation and flash vaporization constitute a remarkably complex phenomenology that poses great modelling challenges. Thus, it is clear that the design of the injection system is fundamental for the full exploitation of hybrid rocket engine throttleability. The Analytical Hierarchy Process has been used to select the injection architecture as the best compromise among different design criteria such as functionality, technological innovation and cost. The impossibility of using simplified engineering relations for dimensioning the injectors led to the need for a numerical approach based on OpenFOAM®. The numerical tool has been validated with selected experimental data from the literature. Quantitative as well as qualitative comparisons are performed in terms of mass flow rate and pressure drop across the injector for several operating conditions. The results show satisfactory agreement with the experimental data. Modelling assumptions, together with their impact on numerical predictions, are discussed in the paper. Once the reliability of the numerical tool was assessed, the injection plate was designed and sized to guarantee the required amount of oxidizer in the combustion chamber and therefore to assure high combustion efficiency. To this purpose, the plate has been designed with multiple injectors whose number and diameter have been selected in order to reach the requested mass flow rate for the two operating conditions of maximum and minimum thrust. The overall design has finally been verified through three-dimensional computations in cavitating, non-reacting conditions, and it has been verified that the proposed design solution is able to guarantee the requested values of mass flow rates.
Keywords: hybrid rocket, injection system design, OpenFOAM®, cavitation
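For orientation, the sketch below evaluates the simplified single-phase orifice relation (mass flow = Cd·A·√(2ρΔP)) with which such injectors are often first sized, and which the abstract indicates is inadequate for flashing N2O, hence the CFD approach. The discharge coefficient, liquid density, pressure drop, and diameters are illustrative assumptions, not CIRA design values.

```python
# Single-phase incompressible orifice equation, used here only for orientation.
import math

def orifice_mass_flow(cd: float, diameter_m: float, rho_kg_m3: float, dp_pa: float) -> float:
    area = math.pi * diameter_m**2 / 4.0
    return cd * area * math.sqrt(2.0 * rho_kg_m3 * dp_pa)

cd = 0.65               # assumed discharge coefficient
rho_liquid_n2o = 780.0  # kg/m^3, roughly liquid N2O near 20 degC (assumed)
dp = 15e5               # assumed 15 bar pressure drop across the injector
for d_mm in (0.8, 1.0, 1.2):
    m_dot = orifice_mass_flow(cd, d_mm * 1e-3, rho_liquid_n2o, dp)
    print(f"d = {d_mm} mm -> about {m_dot * 1000:.1f} g/s per injector element")
```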
Procedia PDF Downloads 216
26505 Data Recording for Remote Monitoring of Autonomous Vehicles
Authors: Rong-Terng Juang
Abstract:
Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars may not arrive in the near future. To speed up the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for the remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, data from controller area networks (CAN) are mapped onto a two-dimensional time-data space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar
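The compression chain described above (per-identity grouping, differential sampling, entropy coding) can be sketched as follows; zlib stands in for the Huffman, arithmetic and codebook coders named in the abstract, and the CAN signal is synthetic.

```python
# Differential sampling of a CAN signal followed by entropy coding (zlib here).
import zlib
import numpy as np

rng = np.random.default_rng(3)
frames = {0x1A0: np.cumsum(rng.integers(-2, 3, size=5000)).astype(np.int16)}

raw_bytes = b"".join(sig.tobytes() for sig in frames.values())

delta_bytes = b""
for sig in frames.values():
    deltas = np.diff(sig, prepend=sig[:1]).astype(np.int16)  # differential sampling
    delta_bytes += deltas.tobytes()

print("coded size, raw samples :", len(zlib.compress(raw_bytes)), "bytes")
print("coded size, differenced :", len(zlib.compress(delta_bytes)), "bytes")
```

Because successive CAN samples change slowly, the differenced stream has much lower entropy and codes into far fewer bytes, which is the effect the recorder exploits.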
Procedia PDF Downloads 163
26504 Digital Twin for a Floating Solar Energy System with Experimental Data Mining and AI Modelling
Authors: Danlei Yang, Luofeng Huang
Abstract:
The integration of digital twin technology with renewable energy systems offers an innovative approach to predicting and optimising performance throughout the entire lifecycle. A digital twin is a continuously updated virtual replica of a real-world entity, synchronised with data from its physical counterpart and environment. Many digital twin companies today claim to have mature digital twin products, but their focus is primarily on equipment visualisation. However, the core of a digital twin should be its model, which can mirror, shadow, and thread with the real-world entity, a capability that is still underdeveloped. For a floating solar energy system, a digital twin model can be defined in three aspects: (a) the physical floating solar energy system along with environmental factors such as solar irradiance and wave dynamics, (b) a digital model powered by artificial intelligence (AI) algorithms, and (c) the integration of real system data with the AI-driven model and a user interface. The experimental setup for the floating solar energy system is designed to replicate real-ocean conditions of floating solar installations within a controlled laboratory environment. The system consists of a water tank that simulates an aquatic surface, where a floating catamaran structure supports a solar panel. The solar simulator is set up in three positions: one directly above and two inclined at a 45° angle in front and behind the solar panel. This arrangement allows the simulation of different sun angles, such as sunrise, midday, and sunset. The solar simulator is positioned 400 mm away from the solar panel to maintain consistent solar irradiance on its surface. Stability for the floating structure is achieved through ropes attached to anchors at the bottom of the tank, which simulate the mooring systems used in real-world floating solar applications. The floating solar energy system's sensor setup includes various devices to monitor environmental and operational parameters. An irradiance sensor measures solar irradiance on the photovoltaic (PV) panel. Temperature sensors monitor ambient air and water temperatures, as well as the PV panel temperature. Wave gauges measure wave height, while load cells capture mooring force. Inclinometers and ultrasonic sensors record heave and pitch amplitudes of the floating system’s motions. An electric load measures the voltage and current output from the solar panel. All sensors collect data simultaneously. Artificial neural network (ANN) algorithms are central to developing the digital model, which processes historical and real-time data, identifies patterns, and predicts the system’s performance in real time. The data collected from various sensors are partly used to train the digital model, with the remaining data reserved for validation and testing. The digital twin model combines the experimental setup with the ANN model, enabling monitoring, analysis, and prediction of the floating solar energy system's operation. The digital model mirrors the functionality of the physical setup, running in sync with the experiment to provide real-time insights and predictions. It provides useful industrial benefits, such as informing maintenance plans as well as design and control strategies for optimal energy efficiency. In the long term, this digital twin will help improve overall solar energy yield whilst minimising operational costs and risks.
Keywords: digital twin, floating solar energy system, experiment setup, artificial intelligence
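A minimal stand-in for the ANN part of the digital model is sketched below: a small neural network mapping sensor readings (irradiance, panel temperature, wave height, pitch) to electrical power. The data, feature set, network size, and the simple thermal-derating relation used to generate the synthetic power values are assumptions, not the experimental measurements.

```python
# Small ANN regressor trained on synthetic sensor data as a digital-model stand-in.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 1000
X = np.column_stack([
    rng.uniform(0, 1000, n),   # irradiance [W/m^2]
    rng.uniform(15, 40, n),    # panel temperature [degC]
    rng.uniform(0, 0.1, n),    # wave height [m]
    rng.uniform(-5, 5, n),     # pitch amplitude [deg]
])
power = 0.18 * X[:, 0] * (1 - 0.004 * (X[:, 1] - 25)) + rng.normal(0, 5, n)

X_train, X_test, y_train, y_test = train_test_split(X, power, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```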
Procedia PDF Downloads 8
26503 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices
Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays
Abstract:
Introduction and Aim: Chronic workplace stress and its health-related consequences like mental and cardiovascular diseases have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be based on smartphone registrations, further processed according to an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks like managers and researchers will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about global perceived state of chronic stress exposure, the EMA approach is expected to bring new insights about daily fluctuating work stress experiences, especially micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
Keywords: ecological momentary assessment, real-time, stress, work
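The repeated-measures analysis planned above can be illustrated with a minimal mixed-effects model: momentary stress regressed on a micro-level work event with a random intercept per participant. The data frame, predictor, and effect sizes below are simulated assumptions, not the EMA field data.

```python
# Multilevel model for repeated EMA observations, using statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_participants, n_obs = 100, 30
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n_participants), n_obs),
    "meeting": rng.integers(0, 2, n_participants * n_obs),  # micro-level event
})
person_effect = rng.normal(0, 1, n_participants)[df["participant"]]
df["stress"] = 3 + 0.6 * df["meeting"] + person_effect + rng.normal(0, 1, len(df))

model = smf.mixedlm("stress ~ meeting", df, groups=df["participant"]).fit()
print(model.summary())
```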
Procedia PDF Downloads 161
26502 Operationalizing the Concept of Community Resilience through Community Capitals Framework-Based Index
Authors: Warda Ajaz
Abstract:
This study uses the ‘Community Capitals Framework’ (CCF) to develop a community resilience index that can serve as a useful tool for measuring the resilience of communities in diverse contexts and backgrounds. CCF is an important analytical tool for assessing holistic community change. This framework identifies seven major types of community capitals: natural, cultural, human, social, political, financial and built, and claims that the communities that have been successful in supporting healthy, sustainable community and economic development have paid attention to all these capitals. The framework, therefore, proposes to study community development through the identification of assets in these major capitals (stock), investment in these capitals (flow), and the interaction between these capitals. Capital-based approaches have been extensively used to assess community resilience, especially in the context of natural disasters and extreme events. Therefore, this study identifies key indicators for estimating each of the seven capitals through an extensive literature review and then develops an index to calculate a community resilience score. The CCF-based community resilience index presents an innovative way of operationalizing the concept of community resilience and will contribute toward decision-relevant research on adapting to and mitigating community vulnerabilities to climate change-induced as well as other adverse events.
Keywords: adverse events, community capitals, community resilience, climate change, economic development, sustainability
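One way such an index can be assembled is sketched below: min-max normalise indicators, average them within each of the seven capitals, and combine the capital scores with equal weights. The indicator names, values, and equal weighting are illustrative assumptions, not the paper's final indicator set.

```python
# Composite CCF-style resilience index from per-capital indicators.
import pandas as pd

indicators = pd.DataFrame({
    "wetland_cover": [0.2, 0.5, 0.8],          # natural capital
    "heritage_sites": [3, 1, 7],               # cultural capital
    "tertiary_education": [0.31, 0.22, 0.40],  # human capital
    "civic_groups_per_1k": [1.2, 0.4, 2.1],    # social capital
    "voter_turnout": [0.55, 0.61, 0.72],       # political capital
    "median_income": [42, 35, 51],             # financial capital
    "hospital_beds_per_1k": [2.1, 1.4, 3.0],   # built capital
}, index=["community_A", "community_B", "community_C"])

capital_of = dict(zip(indicators.columns,
                      ["natural", "cultural", "human", "social",
                       "political", "financial", "built"]))

normalised = (indicators - indicators.min()) / (indicators.max() - indicators.min())
capital_scores = normalised.T.groupby(capital_of).mean().T  # one column per capital
resilience_index = capital_scores.mean(axis=1)              # equal capital weights
print(resilience_index.round(2))
```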
Procedia PDF Downloads 267
26501 Understanding Team Member Autonomy and Team Collaboration: A Qualitative Study
Authors: Ayşen Bakioğlu, Gökçen Seyra Çakır
Abstract:
This study aims to explore how research assistants who work in project teams experience team member autonomy and how they reconcile team member autonomy with team collaboration. The study utilizes snowball sampling. Twenty research assistants who work in the faculties of education at Marmara University and Yıldız Technical University were interviewed. The data were analyzed through content analysis; MAXQDA Plus 11, a qualitative data analysis software package, was used as the data analysis tool. According to the findings of this study, emerging themes include team norm formation, team coordination management, the role of individual tasks in team collaboration, and leadership distribution. The interviewees experience the team norm formation process in terms of processes that pertain to task fulfillment and processes that pertain to the regulation of team dynamics. The team norm formation process instills a sense of responsibility amongst individual team members. Apart from that, the interviewees’ responses indicate that the realization of the obligation to work in a team contributes to the team norm formation process. The participants indicate that individual expectations are taken into consideration during the coordination of the team. The supervisor of the project team also has a crucial role in maintaining team collaboration. Coordination problems arise when an individual team member does not relate his/her academic field to the research topic of the project team. The findings indicate that leadership distribution in the project teams involves two processes: leadership distribution based on processes that focus on individual team members and leadership distribution based on processes that focus on team interaction. Apart from that, individual tasks serve as a facilitator of collaboration amongst team members. Interviewees also indicate that individual tasks facilitate the expression of individuality.
Keywords: project teams in higher education, research assistant teams, team collaboration, team member autonomy
Procedia PDF Downloads 362
26500 Evaluating the ‘Assembled Educator’ of a Specialized Postgraduate Engineering Course Using Activity Theory and Genre Ecologies
Authors: Simon Winberg
Abstract:
The landscape of professional postgraduate education is changing: the focus of these programmes is moving from preparing candidates for a life in academia towards training in the expert knowledge and skills needed to support industry. This is especially pronounced in engineering disciplines, where increasingly complex products draw on a depth of knowledge from multiple fields. This connects strongly with the broader notion of Industry 4.0, where technology and society are being brought together to achieve more powerful and desirable products, but products whose inner workings are also more complex than before. The changes in what we do, and how we do it, have a profound impact on what industry would like universities to provide. One such change is the increased demand for taught doctoral and Master's programmes. These programmes aim to provide skills and training for professionals, to expand their knowledge of state-of-the-art tools and technologies. This paper investigates one such course, namely a Software Defined Radio (SDR) Master's degree course. The teaching support for this course had to be drawn from an existing pool of academics, none of whom were specialists in this field. The paper focuses on the kind of educator, a ‘hybrid academic’, assembled from available academic staff and bolstered by research. The conceptual framework for this paper combines Activity Theory and Genre Ecology. Activity Theory is used to reason about learning and interactions during the course, and Genre Ecology is used to model the building and sharing of technical knowledge related to using tools and artifacts. Data were obtained from meetings with students and lecturers, logs, project reports, and course evaluations. The findings show how the course, which was initially academically oriented, metamorphosed into a tool-dominant peer-learning structure, largely supported by the sharing of technical tool-based knowledge. While the academic staff could address gaps in the participants’ fundamental knowledge of radio systems, the participants brought with them extensive specialized knowledge and tool experience, which they shared with the class. This created a complicated dynamic in the class, which centered largely on engagements with technology artifacts, such as simulators, from which knowledge was built. The course was characterized by a richness of ‘epistemic objects’, which is to say objects that had knowledge-generating qualities. A significant portion of the course curriculum had to be adapted, and the learning methods changed to accommodate the dynamic interactions that occurred during classes. This paper explains the SDR Master's course in terms of conflicts and innovations in its activity system, as well as the continually hybridizing genre ecology, to show how the structuring and resource-dependence of the course transformed from its initial ‘traditional’ academic structure to a more entangled arrangement over time. It is hoped that insights from this paper will benefit other educators involved in the design and teaching of similar types of specialized professional postgraduate taught programmes.
Keywords: professional postgraduate education, taught masters, engineering education, software defined radio
Procedia PDF Downloads 92
26499 E-Government Development in Nigeria, 'Bank Verification No': An Anti-Corruption Tool
Authors: Ernest C. Nwadinobi, Amanda Peart, Carl Adams
Abstract:
Leading countries like the USA, the UK and some European countries have moved their focus away from just developing e-government platforms towards electronic services that aim to provide access to information for their citizens or customers, and they have gone on to make significant back-office changes to accommodate the electronic services being provided to those customers or citizens. E-government has moved from just providing electronic information to citizens and customers alike to serving their needs. In developing countries like Nigeria, the enablement of e-government is being used as an anti-corruption tool. The introduction of the Bank Verification Number (BVN) scheme by the Central Bank of Nigeria has helped the government not just to save money but also to protect customers’ transactions and enhance confidence in the banking sector. This has helped curtail the high rate of cyber and financial crime that has been part of the system. The use of the BVN as an anti-corruption tool in Nigeria came at a time when there was a need for openness, accountability, and discipline, after years of treasury looting and recklessness in handling finances. As there has not been a defined method for measuring the strength or success of e-government development, in this case the BVN, in Nigeria, progress will remain at the same level. The implementation strategy of the BVN in Nigeria has mostly been a quick-fix, quick-win solution. In fact, there is little or no indication of evidence of a framework for e-government. As in other leading countries, there is a need for proper implementation of a strategy and framework, especially towards a customer-orientated process, which will accommodate every administrative body of the government institution, including private business, rather than focusing on an inflexible organisational structure. The development of e-government must have a strategy and framework for it to work, and this strategy must encompass every public administration and not be limited to any individual body or organization. A defined framework or monitoring method must be put in place to help evaluate and benchmark government development in e-government. This framework must follow the same concepts or principles. Through a critical analysis of the existing methods, this paper identifies areas that must be included in the existing approach in order to channel e-government development towards its defined strategic objectives.
Keywords: Bank Verification No (BVN), quick-fix, anti-corruption, quick-win
Procedia PDF Downloads 160
26498 Addressing Differentiation Using Mobile-Assisted Language Learning
Authors: Ajda Osifo, Fatma Elshafie
Abstract:
Mobile-assisted language learning favors social-constructivist and connectivist theories of learning and adaptive approaches to teaching. It offers many opportunities for differentiated instruction in meaningful ways, as it enables learners to become more collaborative, engaged and independent through additional dimensions such as web-based media, virtual learning environments, online publishing to an imagined audience and digitally mediated communication. MALL applications can be a tool for the teacher to personalize and adjust instruction according to the learners’ needs and to give continuous feedback to improve learning and performance in the process, which supports differentiated instruction practices. This paper explores the utilization of mobile-assisted language learning applications as a supporting tool for effective differentiation in the language classroom. It reports the overall experience of implementing MALL to shape and apply differentiated instruction and expand learning options. This session is structured in three main parts: first, a review of the literature and effective practice of academically responsive instruction will be discussed. Second, samples of differentiated tasks, activities, projects and learner work will be demonstrated with relevant learning outcomes and learners’ survey results. Finally, project findings and conclusions will be given.
Keywords: academically responsive instruction, differentiation, mobile learning, mobile-assisted language learning
Procedia PDF Downloads 417
26497 Mitigating Acid Mine Drainage Pollution: A Case Study In the Witwatersrand Area of South Africa
Authors: Elkington Sibusiso Mnguni
Abstract:
In South Africa, mining has been a key economic sector since the discovery of gold in 1886 in the Witwatersrand region, where the city of Johannesburg is located. However, some mines have since been decommissioned, and the continuous pumping of acid mine drainage (AMD) has also stopped, causing the AMD to rise towards the ground surface. This has posed a serious environmental risk to the groundwater resources and river systems in the region. This paper documents the development and extent of the environmental damage as well as the measures implemented by the government to alleviate such damage. The study will add to the body of knowledge on the subject of AMD treatment to prevent environmental degradation. The method used to gather and collate relevant data and information was a desktop study. The key findings include the social and environmental impacts of the AMD, which include the pollution of water sources used for domestic purposes, leading to skin and other health problems, and the loss of biodiversity in some areas. It was also found that the technical intervention of constructing a plant to pump and treat the AMD using high-density sludge technology was the most effective short-term solution available while a long-term solution was being explored. Some successes and challenges experienced during the implementation of the project are also highlighted. The study will be a useful record of the current status of AMD treatment interventions in the region.
Keywords: acid mine drainage, groundwater resources, pollution, river systems, technical intervention, high density sludge
Procedia PDF Downloads 186
26496 Expectation for Professionalism Effects Reality Shock: A Qualitative And Quantitative Study of Reality Shock among New Human Service Professionals
Authors: Hiromi Takafuji
Abstract:
It is a well-known fact that health care and welfare are the foundation of human activities, and human service professionals such as nurses and child care workers support these activities. The COVID-19 pandemic has made the severity of the working environment in these fields even more widely known. It is high time to discuss the work of human service workers for the sustainable development of the human environment. Early turnover has been recognized as a long-standing issue in these fields. In Japan, the attrition rate within three years of graduation for these occupations has remained high at about 40% for more than 20 years. One of the reasons for this is reality shock (RS), which refers to the stress caused by the gap between pre-employment expectations and the post-employment reality experienced by new workers. The purpose of this study was to academically elucidate the mechanism of RS among human service professionals and to contribute to countermeasures against it. Firstly, to explore the structure of the relationship between professionalism and workers' RS, an exploratory interview survey was conducted and analyzed by text mining and content analysis. The results showed that the expectation of professionalism influences RS as a pre-employment job expectation. Next, the expectations of professionalism were quantified and categorized, and the responses of a total of 282 human service professionals (nurses, child care workers, and caregivers) were finalized for data analysis. The data were analyzed using exploratory factor analysis, confirmatory factor analysis, multiple regression analysis, and structural equation modeling techniques. The results revealed that self-control orientation and authority orientation by qualification had a direct positive significant impact on RS. On the other hand, interpersonal helping orientation and altruistic orientation were found to have a direct negative significant impact and an indirect positive significant impact on RS. In this way, we were able to clarify the structure of work expectations that affect the RS of welfare professionals, which had not been clarified in previous studies. We also explain the limitations, practical implications, and directions for future research.
Keywords: human service professional, new hire turnover, SEM, reality shock
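The exploratory step of the analysis chain above can be sketched in Python as a factor analysis of expectation items followed by a regression of reality shock on the factor scores; the full study also uses confirmatory factor analysis and SEM, which are not reproduced here. The item responses, number of factors, and effect sizes are simulated assumptions, not the survey of 282 professionals.

```python
# Exploratory factor analysis of expectation items, then regression on RS.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 282
latent = rng.normal(size=(n, 4))                  # four expectation orientations
items = latent @ rng.normal(size=(4, 12)) + rng.normal(scale=0.5, size=(n, 12))

fa = FactorAnalysis(n_components=4, random_state=0)
factor_scores = fa.fit_transform(items)

reality_shock = 0.4 * factor_scores[:, 0] - 0.3 * factor_scores[:, 2] + rng.normal(size=n)
reg = LinearRegression().fit(factor_scores, reality_shock)
print("coefficients of RS on the four factors:", reg.coef_.round(2))
```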
Procedia PDF Downloads 99
26495 Space Time Adaptive Algorithm in Bi-Static Passive Radar Systems for Clutter Mitigation
Authors: D. Venu, N. V. Koteswara Rao
Abstract:
Space-time adaptive processing (STAP) is an effective tool for detecting a moving target in spaceborne or airborne radar systems. Airborne passive radar systems utilize broadcast, navigation and communication signals to perform various surveillance tasks and have attracted significant interest in the recent past; the need of the hour is therefore to have cost-effective systems compared to conventional active radar systems. Moreover, the requirement of only a small number of secondary samples for effective clutter suppression in bi-static passive radar, together with abundant illuminator resources, favours passive surveillance radar systems. This paper presents a framework for incorporating knowledge sources directly in the space-time beamformer of airborne adaptive radars. The STAP algorithm for clutter mitigation in passive bi-static radar better quantifies the reduction in sample size, amalgamating the earlier data bank with existing radar data sets. We also propose a novel method to estimate the clutter covariance matrix and perform STAP for efficient clutter suppression based on a small sample size. Furthermore, the effectiveness of the proposed algorithm is verified using MATLAB simulations in order to validate the STAP algorithm for passive bi-static radar. In conclusion, this study highlights the importance, for various applications, of augmenting traditional active radars with cost-effective measures.
Keywords: bistatic radar, clutter, covariance matrix passive radar, STAP
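The core STAP computation referred to above can be sketched as follows: estimate the clutter covariance from a limited set of secondary samples, apply diagonal loading to stabilise it, and form the adaptive weight w = R⁻¹s / (sᴴR⁻¹s). Although the paper's simulations use MATLAB, the sketch below is in Python with illustrative dimensions and random data, not the paper's scenario.

```python
# Sample covariance estimation with diagonal loading and MVDR-style STAP weights.
import numpy as np

rng = np.random.default_rng(7)
n_dof = 32          # space-time degrees of freedom (elements x pulses)
n_secondary = 48    # small number of secondary range cells

secondary = (rng.normal(size=(n_dof, n_secondary))
             + 1j * rng.normal(size=(n_dof, n_secondary)))
R_hat = secondary @ secondary.conj().T / n_secondary
R_hat += 1e-2 * (np.trace(R_hat).real / n_dof) * np.eye(n_dof)  # diagonal loading

steering = np.exp(1j * 2 * np.pi * 0.15 * np.arange(n_dof))     # target steering vector
sinr_metric = (steering.conj() @ np.linalg.solve(R_hat, steering)).real
w = np.linalg.solve(R_hat, steering) / sinr_metric              # w = R^-1 s / (s^H R^-1 s)
print("s^H R^-1 s (proportional to output SINR):", float(sinr_metric))
```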
Procedia PDF Downloads 295
26494 A Comparative Human Rights Analysis of Deprivation of Citizenship as a Counterterrorism Instrument: An Evaluation of Belgium
Authors: Louise Reyntjens
Abstract:
In response to Islamic-inspired terrorism and the growing trend of foreign fighters, European governments are increasingly relying on the deprivation of citizenship as a security tool. This development fits within a broader securitization of immigration, where the terrorist threat is perceived as emanating from abroad. As a result, immigration law became more and more ‘securitized’. The European migration crisis has reinforced this trend. This research evaluates the deprivation of citizenship from a human rights perspective. For this, the author selected four European countries for a comparative study: Belgium, France, the United Kingdom and Sweden. All these countries face similar social and security issues, vitalizing (the debate on) deprivation of citizenship as a counterterrorism tool. Yet, they adopt very different approaches to this: the United Kingdom positions itself on the repressive side of the spectrum. Sweden, on the other hand, also ‘securitized’ its immigration policy after the recent terrorist attack in Stockholm but remains on the tolerant side of the spectrum. Belgium and France are situated in between. This contribution evaluates the deprivation of citizenship in Belgium. Belgian law has provided the possibility to strip someone of their Belgian citizenship since 1919. However, the provision long remained a dead letter. The 2015 Charlie Hebdo attacks in Paris sparked a series of legislative changes, elevating the deprivation measure to a key security tool in Belgian law. Yet, the measure raises profound human rights issues. Firstly, it infringes the right to private and family life. As provided by Article 8(2) of the European Convention on Human Rights (ECHR), this right can be limited if necessary for national security and public safety. Serious questions can, however, be raised about the necessity, for national security, of depriving an individual of their citizenship. Behavior giving rise to this measure will generally be governed by criminal law. From a security perspective, criminal detention will thus already provide for removing the individual from society. Moreover, simply stripping individuals of their citizenship and deporting them constitutes a failure of criminal law’s responsibility to prosecute criminal behavior. Deprivation of citizenship is also discriminatory, because it differentiates, without a legitimate reason, between those liable to deprivation and those who are not. It thereby installs a secondary class of citizens, violating the European Court of Human Rights’ principle that no distinction can be tolerated between children on the basis of the status of their parents. If followed by expulsion, deprivation also seriously jeopardizes the right to life and the prohibition of torture. This contribution explores the human rights consequences of citizenship deprivation as a security tool in Belgium. It also offers a critical view of its efficacy in protecting national security.
Keywords: Belgium, counterterrorism strategies, deprivation of citizenship, human rights, immigration law
Procedia PDF Downloads 125
26493 Determinants of Utilization of Information and Communication Technology by Lecturers at Kenya Medical Training College, Nairobi
Authors: Agnes Anyango Andollo, Jane Achieng Achola
Abstract:
The use of Information and Communication Technologies (ICTs) has become one of the driving forces in the facilitation of learning in most colleges. The ability to effectively harness the technology varies from college to college. The study objective was to determine the lecturers' personal attributes, institutional attributes and policies that influence the utilization of ICT by the lecturers. A cross-sectional survey design was employed in order to empirically investigate the extent to which lecturers' personal attributes, institutional attributes and policies influence the utilization of ICT to facilitate learning. The target population of the study was 295 lecturers who facilitate learning at KMTC-Nairobi. A structured self-administered questionnaire was given to the lecturers. Quantitative data were scrutinized for completeness, accuracy and uniformity and then coded. Data were analyzed in frequencies and percentages using the Statistical Package for Social Sciences (SPSS) version 19, a reliable tool for quantitative data analysis. A total of 155 completed questionnaires were obtained from the respondents and subjected to analysis. The study found that 93 (60%) of the respondents were male while 62 (40%) were female. Individuals' educational level, age, gender and educational experience had the greatest impact on the use of ICT. Lecturers' own beliefs, values, ideas and thinking had a moderate impact on the use of ICT. Institutional support through the provision of resources for ICT-related training, such as internet, computers, laptops and projectors, had a moderate impact on the use of ICT (p = 0.049 at the 5% significance level). The study concluded that institutional attributes and ICT policy were key to the utilization of ICT by lecturers at KMTC Nairobi, and that a mandatory policy on the use of ICT by lecturers to facilitate learning was also key. It recommended that policies be put in place to provide technical support to lecturers when they encounter problems while using ICT, and that a mechanism be put in place to make the use of ICT in teaching and learning mandatory.
Keywords: policy, computers education, medical training institutions, ICTs
Procedia PDF Downloads 358
26492 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016
Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi
Abstract:
This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of the personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of the personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, resulting in sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.
Keywords: big health data, data subject rights, GDPR, pandemic
Procedia PDF Downloads 129
26491 An Approach to Apply Kernel Density Estimation Tool for Crash Prone Location Identification
Authors: Kazi Md. Shifun Newaz, S. Miaji, Shahnewaz Hazanat-E-Rabbi
Abstract:
In this study, the kernel density estimation tool has been used to identify the most crash-prone locations on a national highway of Bangladesh. As in other developing countries, road traffic crashes (RTC) in Bangladesh have become a great social alarm, and the situation is deteriorating day by day. Today's black-spot identification process is not based on modern technical tools and in most cases produces wrong output. In this situation, characteristic analysis and black-spot identification by spatial analysis would be an effective and low-cost approach to ensuring road safety. The methodology of this study incorporates a framework based on a spatial-temporal study to identify the locations where most RTCs occur. In this study, a very important economic corridor, the Dhaka-Sylhet highway, has been chosen for applying the method. This research proposes that the KDE method for the identification of Hazardous Road Locations (HRL) could be used for all other national highways in Bangladesh and also in other developing countries. Some recommendations are suggested for policy makers to reduce RTCs on the Dhaka-Sylhet highway, especially at black spots.
Keywords: hazardous road location (HRL), crash, GIS, kernel density
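The KDE step can be sketched as follows: fit a two-dimensional Gaussian kernel density estimate to crash coordinates and rank grid cells by density to flag candidate hazardous road locations. The coordinates, bandwidth handling, and grid are illustrative; they are not the Dhaka-Sylhet crash records.

```python
# 2-D kernel density estimation over simulated crash locations.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(8)
hotspots = np.array([[91.0, 24.0], [91.4, 24.3]])    # two synthetic clusters
crashes = np.vstack([h + rng.normal(scale=0.02, size=(60, 2)) for h in hotspots])

kde = gaussian_kde(crashes.T)                        # rows = dimensions
lon = np.linspace(90.9, 91.5, 120)
lat = np.linspace(23.9, 24.4, 120)
grid = np.vstack([g.ravel() for g in np.meshgrid(lon, lat)])
density = kde(grid).reshape(120, 120)

iy, ix = np.unravel_index(density.argmax(), density.shape)
print(f"highest-density cell near lon={lon[ix]:.3f}, lat={lat[iy]:.3f}")
```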
Procedia PDF Downloads 314
26490 Spatial Information and Urbanizing Futures
Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini
Abstract:
Today, municipalities are searching for new tools to increase public participation at different levels of urban planning. This approach to urban planning involves the community in the planning process using participatory approaches instead of long-established traditional top-down planning methods. These tools can be used to obtain the particular problems of urban furniture from the residents' point of view. One of the tools designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up their feelings and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS using Web 2.0 to collect volunteered geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution and clusters of reported problems. This system is implemented in a case study area in Tehran, Iran, and the challenges of making it applicable and its potential for real urban planning have been evaluated. It helps decision makers to better understand, plan and allocate scarce resources for providing the most requested urban furniture.
Keywords: PPGIS, spatial information, urbanizing futures, urban planning
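One spatial data mining step that a PPGIS back end of this kind might run is sketched below: clustering citizen-reported urban-furniture problems with DBSCAN to surface complaint hotspots. The coordinates, categories, and DBSCAN parameters are invented examples, not the Tehran case-study data.

```python
# Density-based clustering of citizen reports to find complaint hotspots.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(9)
reports = np.vstack([
    rng.normal(loc=[51.42, 35.70], scale=0.002, size=(40, 2)),  # e.g. broken benches
    rng.normal(loc=[51.39, 35.74], scale=0.002, size=(25, 2)),  # e.g. missing bins
    rng.uniform([51.35, 35.65], [51.45, 35.78], size=(15, 2)),  # scattered noise
])

labels = DBSCAN(eps=0.004, min_samples=10).fit_predict(reports)
for cluster_id in sorted(set(labels) - {-1}):
    members = reports[labels == cluster_id]
    centre = members.mean(axis=0).round(3)
    print(f"hotspot {cluster_id}: {len(members)} reports around {centre}")
```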
Procedia PDF Downloads 726
26489 Modernization of Translation Studies Curriculum at Higher Education Level in Armenia
Authors: A. Vahanyan
Abstract:
The paper touches upon the problem of revising and modernizing the current curriculum in translation studies at Armenian Higher Education Institutions (HEIs). In the contemporary world, where the quality and speed of the services provided are highly valued, certain higher education centers in Armenia nevertheless do not demonstrate enough flexibility in terms of the revision and amendment of the courses taught. This issue is present in various curricula at the university level, and in the Translation Studies curriculum in particular. Technological innovations that are of great help to translators were long ago smoothly implemented in the global translation industry. According to the European Master's in Translation (EMT) framework, translation service provision comprises linguistic, intercultural, information mining, thematic, and technological competencies. Therefore, to form the competencies mentioned above, the curriculum should be seriously restructured to meet modern education and job market requirements, and relevant courses should be proposed. New courses, in particular, should focus on the formation of technological competences. These suggestions have been made based on the author's research of the problem across various HEIs in Armenia. The updated curricula should include courses aimed at familiarization with various computer-assisted translation (CAT) tools (MemoQ, Trados, OmegaT, Wordfast, etc.) in the translation process and with the creation of glossaries and termbases compatible with different platforms, which will ensure consistency in the translation of similar texts and speed up the translation process itself. Another aspect that may be strengthened via curriculum modification is the introduction of interdisciplinary and Project-Based Learning courses, which will develop information mining and thematic competences, which are of great importance as well. Of course, the amendment of the existing curriculum with the mentioned courses will require corresponding faculty development via training, workshops, and seminars. Finally, the provision of extensive internships with translation agencies is strongly recommended, as they will ensure the synthesis of the theoretical background and practical skills highly required in this specific area. Summing up, the restructuring and modernization of the existing curricula in Translation Studies should focus on three major aspects, i.e., the introduction of new courses that meet global quality standards of education, professional development for faculty, and the integration of extensive internships supervised by experts in the field.
Keywords: competencies, curriculum, modernization, technical literacy, translation studies
Procedia PDF Downloads 131
26488 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems
Authors: Yong-Kyu Jung
Abstract:
The fast growth in information technology has led to demands to access and process data. Cyber-physical systems (CPSs) depend heavily on the timing of hardware/software operations and on communication over the network (i.e., real-time/parallel operations in CPSs, e.g., autonomous vehicles). Data processing is an important means of overcoming the issues confronting data management and of reducing the gap between technological growth on the one hand and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart-connected devices, adaptively rescales digital contents (by 62.8% on average) as well as data processing/access time and energy and encryption/decryption overheads in AI/ML applications (e.g., facial ID/recognition).
Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity
Procedia PDF Downloads 78
26487 Role of Numerical Simulation as a Tool to Enhance Climate Change Adaptation and Resilient Societies: A Case Study from the Philippines
Authors: Pankaj Kumar
Abstract:
Rapid global changes have resulted in unfavorable hydrological, ecological, and environmental changes and have cumulatively affected natural resources. As a result, local communities become vulnerable to water stress, poor hygiene, the spread of diseases, food insecurity, etc. However, the central point for this vulnerability revolves around water resources and the way people interrelate with the hydrological system. Also, most of the efforts to minimize the adverse effects of global changes are centered on the mitigation side. Hence, countries with poor adaptive capacities and poor governance suffer the most in the case of disasters. However, several transdisciplinary numerical tools are well designed and are capable of answering “what-if questions” through scenario analysis using a system approach. This study has predicted the future water environment of the Marikina River in the National Capital Region, Metro Manila, Philippines, using the Water Evaluation and Planning (WEAP) system, an integrated water resource management tool. The results obtained can inform possible adaptation measures along with their associated uncertainties. The study also highlights various challenges for policy planners in designing adaptation countermeasures as well as in tracking progress towards achieving SDG 6.
Keywords: water quality, Philippines, climate change adaptation, hydrological simulation, wastewater management, WEAP
Procedia PDF Downloads 105
26486 Application of Social Media for Promoting Library and Information Services: A Case Study of Library Science Professionals of India
Authors: Payel Saha
Abstract:
Social media plays an important role in the dissemination of information in society. In the 21st century, most people have a smartphone and use different social media tools like Facebook, Twitter, Instagram, WhatsApp, Skype, etc. in day-to-day life. Social media is a rapidly growing web-based tool for everyone to share thoughts, ideas and knowledge globally using the internet. The study highlights the current use of social media tools for promoting library and information services by library and information professionals of India who work in libraries. The study was conducted during November 2017. A structured questionnaire was prepared using Google Docs and shared via different mailing lists, sent to individual email IDs and shared through other social media tools. Only 90 responses were received from different states of India, and these were analyzed via MS Excel. The data came from 17 states and 3 union territories of India; however, most of the respondents came from the states of Odisha (23), Himachal Pradesh (14) and Assam (10). The results revealed that, of the 90 respondents, 37 were female and 53 were male, and the majority (71) came from academic libraries, followed by special libraries (15), public libraries (3) and corporate libraries (1). The study indicates that, out of 90 respondents, a majority of 53 said that their library has a social media account, while 39 reported that their library does not have a social media account. The study also shows that Facebook, YouTube, Google+, LinkedIn, Twitter and Instagram are used by the LIS professionals of India, and Facebook (86) was the most popular tool among them. Furthermore, respondents reported that they use social media tools for sharing photos of library events and programs (72), followed by tips for using different services (64), posting new arrivals (56), database tutorials (35), sending brief updates to patrons (32) and announcing library holidays (22). Respondents also reported that they share information about scholarships, training programs, marketing of library events, etc. The study furthermore identifies lack of time as the major problem in using social media (53 respondents), followed by low internet speed (35) and too many social media tools to learn (17), while 3 respondents reported no problem in using social media tools. The results also revealed that the majority of respondents use social media tools on a daily basis (71), followed by a weekly basis (16), a monthly basis (1) and other (2). In summary, this study is expected to be useful in further promoting social media for the dissemination of library and information services to the general public.
Keywords: application of social media, India, promoting library services, library professionals
Procedia PDF Downloads 162
26485 Analytical Approach to Reinsurance in Algeria as an Emerging Market
Authors: Nesrine Bouzaher, Okba Necira
Abstract:
The financial sector of the Algerian economy is among the sectors that have undergone great changes over the last two decades; the goal is to enable economic mechanisms for real growth. Insurance is an indispensable tool for stabilizing these mechanisms. Therefore, the national economy needs to develop the insurance market in order to support investment, both external and internal; reinsurance is one of the areas that has proven its performance in several markets, mainly emerging ones. The expansion of reinsurance in the domestic market is the preoccupation of this work, which focuses on factors that could enhance the demand for reinsurance in the Algerian market. This work is based on analytical research into the economic contribution of reinsurance and its relationship with the insurance market; it then provides an overview of the product in the national emerging market, and finally investigates the factors that could enhance demand in the national reinsurance market so as to determine the potential of Algeria in this area.
Keywords: Algerian reinsurance data, demand trend of Algerian reinsurance, reinsurance, reinsurance market
Procedia PDF Downloads 377
26484 Assessing the Resilience of the Insurance Industry under Solvency II
Authors: Vincenzo Russo, Rosella Giacometti
Abstract:
The paper aims to assess the insurance industry's resilience under Solvency II against adverse scenarios. Starting from the economic balance sheet available under Solvency II for insurance and reinsurance undertakings, we assume that assets and liabilities follow a bivariate geometric Brownian motion (GBM). Then, using the results available under Margrabe's formula, we establish an analytical solution to calibrate the volatility of the asset-liability ratio. In this way, we can estimate the probability of default and the probability of breaching the undertaking's Solvency Capital Requirement (SCR). Furthermore, since estimating the volatility of the Solvency Ratio has become crucial for insurers in light of the financial crises of recent decades, we introduce a novel measure that we call the Resiliency Ratio. The Resiliency Ratio can be used, in addition to the Solvency Ratio, to evaluate the insurance industry's resilience in the case of adverse scenarios. Finally, we introduce a simplified stress test tool to evaluate the economic balance sheet under stressed conditions. The model we propose features analytical tractability and a fast calibration procedure in which only the data disclosed under Solvency II public reporting are needed for calibration. Using the data published regularly by the European Insurance and Occupational Pensions Authority (EIOPA) in an aggregated form by country, an empirical analysis has been performed to calibrate the model and provide the related results at the country level.
Keywords: Solvency II, solvency ratio, volatility of the asset-liability ratio, probability of default, probability to breach the SCR, resilience ratio, stress test
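A minimal sketch of the probability calculations described above: if assets and liabilities each follow a GBM, the log of the asset-liability ratio is normally distributed, so the probability that the ratio falls below a threshold over a horizon has a closed form, with the ratio volatility given by the Margrabe-style expression. All inputs, including the SCR-implied ratio threshold, are illustrative assumptions rather than EIOPA figures.

```python
# Probability that the asset-liability ratio falls below a threshold under GBM.
import math
from statistics import NormalDist

def prob_ratio_below(ratio0, threshold, mu_a, mu_l, sig_a, sig_l, rho, horizon=1.0):
    # Volatility of the asset-liability ratio (Margrabe-style expression).
    sigma = math.sqrt(sig_a**2 + sig_l**2 - 2 * rho * sig_a * sig_l)
    # log(A_T/L_T) - log(A_0/L_0) is normal with this mean and variance sigma^2 * T.
    drift = (mu_a - mu_l - 0.5 * (sig_a**2 - sig_l**2)) * horizon
    z = (math.log(threshold / ratio0) - drift) / (sigma * math.sqrt(horizon))
    return NormalDist().cdf(z)

ratio0 = 1.20                                   # assets over liabilities today
p_default = prob_ratio_below(ratio0, 1.00, 0.02, 0.01, 0.06, 0.04, 0.5)
p_scr_breach = prob_ratio_below(ratio0, 1.12, 0.02, 0.01, 0.06, 0.04, 0.5)  # assumed SCR-implied threshold
print(f"P(default) = {p_default:.3%},  P(SCR breach) = {p_scr_breach:.3%}")
```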
Procedia PDF Downloads 81
26483 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study
Authors: Atif Zafar, Fan Haijun
Abstract:
A sequence of different Reservoir Engineering methods and tools for reservoir characterization and field development is presented in this paper. Real data from the Jin Gas Field in the L-Basin of Pakistan are used. The basic concept behind this work is to highlight the importance of well test analysis in a broader sense (i.e., reservoir characterization and field development), rather than just determining the permeability and skin parameters. Normally, in reservoir characterization we rely on well test analysis to some extent, but for field development planning, well test analysis has become a forgotten tool, specifically for locating new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development, when the location of the new development well was marked only on the basis of G&G (geologic and geophysical) data. The seismic interpretation could not detect one of the boundaries (fault, sub-seismic fault, or heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the model from the well test analysis played a very crucial role in proposing the location of the second well of the newly discovered field. The results from different methods of well test analysis of the Jin Gas Field are also integrated with and supported by other tools of Reservoir Engineering, i.e., the Material Balance Method and the Volumetric Method. In this way, a comprehensive workflow and algorithm are obtained to integrate the well test analyses with geological and geophysical analyses for reservoir characterization and field development. On the basis of this workflow and algorithm, it was established that the proposed location of the new development well was not justified and that it should be placed elsewhere than in the southern direction.
Keywords: field development plan, reservoir characterization, reservoir engineering, well test analysis
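As an illustration of the Material Balance check mentioned above, the sketch below fits the classical p/z straight line for a volumetric gas reservoir and extrapolates it to estimate the gas initially in place. The pressures, z-factors, and cumulative production figures are illustrative, not the Jin Gas Field data.

```python
# p/z material balance: a linear fit extrapolated to p/z = 0 gives the OGIP.
import numpy as np

cum_production_bscf = np.array([0.0, 5.0, 12.0, 20.0, 27.0])
pressure_psia = np.array([3500.0, 3350.0, 3140.0, 2910.0, 2700.0])
z_factor = np.array([0.90, 0.89, 0.88, 0.87, 0.86])

p_over_z = pressure_psia / z_factor
slope, intercept = np.polyfit(cum_production_bscf, p_over_z, 1)
ogip_bscf = -intercept / slope          # cumulative production at p/z = 0
print(f"estimated OGIP of about {ogip_bscf:.0f} Bscf")
```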
Procedia PDF Downloads 364