Search results for: data analyses
25721 The Importance of Knowledge Innovation for External Audit on Anti-Corruption
Authors: Adel M. Qatawneh
Abstract:
This paper aimed to determine the importance of knowledge innovation for external audit in anti-corruption efforts across the Jordanian banks listed on the Amman Stock Exchange (ASE). The importance of the study arises from the need to recognize knowledge innovation for external audit and anti-corruption amid developments in the world of business. The variables expected to be affected by external audit innovation are: reliability of financial data, relevance of financial data, consistency of financial data, full disclosure of financial data, and protection of investors' rights. To achieve the objectives of the study, a questionnaire was designed and distributed to the Jordanian banks listed on the Amman Stock Exchange. The data analysis found that banks in Jordan attach positive importance to knowledge innovation for external audit in anti-corruption and agree on its benefits. The statistical analysis showed that knowledge innovation for external audit had a positive impact on anti-corruption, and that external audit has a statistically significant relationship with anti-corruption, reliability of financial data, consistency of financial data, full disclosure of financial data, and protection of investors' rights.
Keywords: knowledge innovation, external audit, anti-corruption, Amman Stock Exchange
Procedia PDF Downloads 465
25720 Automated End-to-End Pipeline Processing Solution for Autonomous Driving
Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi
Abstract:
Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research into making robust, reliable, and intelligent programs that can perceive and understand their environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly affects the performance of the system. Researchers have to design a preprocessing pipeline for each dataset, with different sensor orientations and alignments, before the dataset can be fed to the model. This paper proposes a solution that unifies the data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also means easy adoption of new datasets or in-house generated datasets. The solution also automates the complete deep learning pipeline from preprocessing to post-processing for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.
Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing
Procedia PDF Downloads 123
25719 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues
Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid
Abstract:
New approaches to analyzing and visualizing data streams in real time are important for prompt decision making. Financial market trading and surveillance, large-scale emergency response, and crowd control are example scenarios that require real-time analytics and data visualization. This situation has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of big data and social media, new techniques and tools are required to process streaming data. Today, a range of tools implementing some of these functionalities is available. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered general information, main techniques, challenges, and open issues. The techniques for streaming text visualization are identified in chronological order based on the Text Visualization Browser. This paper aims to review the evolution of streaming text visualization techniques and tools, as well as to discuss the problems and challenges of each identified tool.
Keywords: information visualization, visual analytics, text mining, visual text analytics tools, big data visualization
Procedia PDF Downloads 399
25718 Churn Prediction for Telecommunication Industry Using Artificial Neural Networks
Authors: Ulas Vural, M. Ergun Okay, E. Mesut Yildiz
Abstract:
Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is suitable for analysis by machine learning methods. In this study, expenditure data of customers are analyzed using an artificial neural network (ANN). The ANN model is applied to data of customers with different billing durations. The proposed model successfully predicts churn probabilities at 83% accuracy with only three months of expenditure data, and the prediction accuracy increases up to 89% when nine months of data are used. The experiments also show that the accuracy of the ANN model increases with an extended feature set that includes information on changes in bill amounts.
Keywords: customer relationship management, churn prediction, telecom industry, deep learning, artificial neural networks
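The abstract does not include the network itself, but the idea of predicting churn from a customer's expenditure trend can be sketched with a minimal single-neuron logistic model trained by gradient descent. This is a simplified stand-in for the paper's full ANN; the toy data, feature layout, and function names are invented for illustration only.

```python
import math


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def train_churn_model(samples, labels, epochs=2000, lr=0.1):
    """Train a single logistic neuron with stochastic gradient descent.

    samples: feature vectors of normalized monthly expenditures
    labels:  1 = churned, 0 = retained
    """
    n_features = len(samples[0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
            err = pred - y  # gradient of log-loss w.r.t. the pre-activation
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias


def predict(weights, bias, x):
    """Return the estimated churn probability for one customer."""
    return sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)


# Toy data: three months of normalized expenditures per customer.
# Sharply declining spending is treated here as the churn signal.
train_x = [[1.0, 0.9, 0.95], [0.9, 1.0, 0.9], [1.0, 0.5, 0.1], [0.9, 0.4, 0.05]]
train_y = [0, 0, 1, 1]
w, b = train_churn_model(train_x, train_y)
p_churn = predict(w, b, [1.0, 0.4, 0.1])   # spending collapse: high risk
p_stay = predict(w, b, [0.95, 1.0, 0.9])   # stable spending: low risk
```

In the paper's setting, extending each feature vector from three to nine months (and adding bill-change deltas) is what lifts accuracy from 83% to 89%.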
Procedia PDF Downloads 146
25717 The Face Sync-Smart Attendance
Authors: Bekkem Chakradhar Reddy, Y. Soni Priya, Mathivanan G., L. K. Joshila Grace, N. Srinivasan, Asha P.
Abstract:
Currently, there are many problems related to marking attendance in schools, offices, and other places. Organizations tasked with collecting daily attendance data have numerous concerns. There are different ways to mark attendance. The most commonly used method is collecting data manually by calling each student, which is a lengthy and error-prone process. Now, many new technologies help mark attendance automatically, reducing work and recording the data. We have proposed to implement attendance marking using the latest technologies. We have implemented a system based on face identification and face analysis. The project is developed by gathering faces and analyzing the data, using deep learning algorithms to recognize faces effectively. The data is recorded and forwarded to the host through email. The project was implemented in Python; the Python libraries used are CV2 (OpenCV), face_recognition, and smtplib.
Keywords: python, deep learning, face recognition, CV2, smtplib, Dlib
Procedia PDF Downloads 58
25716 Modeling and Simulation of Secondary Breakup and Its Influence on Fuel Spray in High Torque Low Speed Diesel Engine
Authors: Mohsin Raza, Rizwan Latif, Syed Adnan Qasim, Imran Shafi
Abstract:
High torque low-speed diesel engines have a wide range of industrial and commercial applications. In the literature, much work has been done on high-speed diesel engines, while research on high torque low-speed engines is rare. Fuel injection plays a key role in engine efficiency and the reduction of exhaust emissions, and fuel breakup plays a critical role in air-fuel mixing and spray combustion. The current study numerically examines an important phenomenon in spray combustion: the deformation and breakup of liquid drops in a compression ignition internal combustion engine. The secondary breakup and its influence on the spray and on the characteristics of the compressed in-cylinder gas have been calculated using simulation software under high torque low-speed diesel-like conditions. The secondary spray breakup is modeled with KH-RT instabilities. The continuous field is described by a turbulence model, and the dynamics of the dispersed droplets are modeled by a Lagrangian tracking scheme. The results obtained with the KH-RT model, implemented in CFD (Computational Fluid Dynamics), are compared against other default methods in OpenFOAM and against published experimental data. These numerical simulations, done in OpenFOAM and MATLAB, are analyzed for the complete 720-degree four-stroke engine cycle at a low engine speed so that favorable agreement is achieved. The results are further analyzed for better evaporation in the near-nozzle region. The proposed analyses will help achieve better engine efficiency, lower emissions, and improved fuel economy.
Keywords: diesel fuel, KH-RT, Lagrangian, OpenFOAM, secondary breakup
Procedia PDF Downloads 265
25715 Geographical Data Visualization Using Video Games Technologies
Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava
Abstract:
In this paper, we present advances in the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the Mexican National Institute of Statistics and Geography (INEGI). We select a place of interest from the Landsat imagery and apply some processing to the image (rotation, atmospheric correction, and enhancement). The resulting image serves as a grayscale color map to fuse with the LIDAR data, which was selected using the same coordinates as the Landsat imagery. The LIDAR data is translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They would download the software and images corresponding to a geological place of interest to a smartphone and could virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material
Procedia PDF Downloads 246
25714 Mechanisms Underlying Comprehension of Visualized Personal Health Information: An Eye Tracking Study
Authors: Da Tao, Mingfu Qin, Wenkai Li, Tieyan Wang
Abstract:
While the use of electronic personal health portals has gained increasing popularity in the healthcare industry, users usually experience difficulty in comprehending and correctly responding to personal health information, partly due to inappropriate or poor presentation of the information. The way personal health information is visualized may affect how users perceive and assess it. This study was conducted to examine the effects of information visualization format and visualization mode on the comprehension and perception of personal health information among its users, using eye tracking techniques. A two-factor within-subjects experimental design was employed, in which participants completed a series of personal health information comprehension tasks under two visualization modes (i.e., whether the visualization is static or dynamic) and three visualization formats (i.e., bar graph, instrument-like graph, and text-only format). Data on a set of measures, including comprehension performance, perceptions, and eye movement indicators, were collected during task completion. Repeated-measures analyses of variance (RM-ANOVAs) were used for data analysis. The results showed that while visualization format had no effect on comprehension performance, it significantly affected users' perceptions (such as perceived ease of use and satisfaction). The two graphic visualizations yielded significantly more favorable scores on subjective evaluations than the text format. While visualization mode showed no effect on users' perception measures, it significantly affected users' comprehension performance, in that dynamic visualization significantly reduced users' information search time. Both visualization format and visualization mode had significant main effects on eye movement behaviors, and their interaction effects were also significant.
While the bar graph and text formats had similar times to first fixation across dynamic and static visualizations, the instrument-like graph format had a longer time to first fixation for dynamic visualization than for static visualization. The two graphic visualization formats yielded shorter total fixation durations compared with the text-only format, indicating their ability to improve information comprehension efficiency. The results suggest that dynamic visualization can improve efficiency in comprehending important health information, and that graphic visualization formats were favored more by users. The findings shed light on the mechanisms underlying comprehension of visualized personal health information and provide important implications for the optimal design and visualization of personal health information.
Keywords: eye tracking, information comprehension, personal health information, visualization
Procedia PDF Downloads 109
25713 Adsorption of Heavy Metals Using Chemically-Modified Tea Leaves
Authors: Phillip Ahn, Bryan Kim
Abstract:
Copper is perhaps the most prevalent heavy metal used in the manufacturing industries, from food additives to metal-mechanic factories. Common methodologies to remove copper are expensive and produce undesired by-products. A good decontaminating candidate should be environmentally friendly, inexpensive, and capable of eliminating low concentrations of the metal. This work proposes chemically modified spent tea leaves of chamomile, peppermint, and green tea, in their thiolated, sulfonated, and carboxylated forms, as candidates for the removal of copper from solutions. Batch experiments were conducted to maximize the adsorption of copper (II) ions. Effects such as acidity, salinity, adsorbent dose, metal concentration, and presence of surfactant were explored. Experimental data show that maximum adsorption is reached at neutral pH. The results indicate that Cu(II) can be removed up to 53%, 22%, and 19% with the thiolated, carboxylated, and sulfonated adsorbents, respectively. Maximum adsorption of copper on TPM (53%) is achieved with 150 mg and decreases in the presence of salts and surfactants. Conversely, the sulfonated and carboxylated adsorbents show better adsorption in the presence of surfactants. Time-dependent experiments show that adsorption is reached in less than 25 min for TCM and 5 min for SCM. Instrumental analyses, including identification of active functional groups, thermal resistance, and scanning electron microscopy, indicate that both adsorbents are promising materials for the selective recovery and treatment of metal ions from wastewaters. Finally, columns were prepared with these adsorbents to explore their application in scaled-up processes, with very positive results. A long-term goal involves recycling the exhausted adsorbents and/or using them in the preparation of biofuels due to changes in the materials' structures.
Keywords: heavy metal removal, adsorption, wastewaters, water remediation
Procedia PDF Downloads 290
25712 Nonparametric Sieve Estimation with Dependent Data: Application to Deep Neural Networks
Authors: Chad Brown
Abstract:
This paper establishes general conditions for the convergence rates of nonparametric sieve estimators with dependent data. We present two key results: one for nonstationary data and another for stationary mixing data. Previous theoretical results often lack practical applicability to deep neural networks (DNNs). Using these conditions, we derive convergence rates for DNN sieve estimators in nonparametric regression settings with both nonstationary and stationary mixing data. The DNN architectures considered adhere to current industry standards, featuring fully connected feedforward networks with rectified linear unit activation functions, unbounded weights, and a width and depth that grow with sample size.
Keywords: sieve extremum estimates, nonparametric estimation, deep learning, neural networks, rectified linear unit, nonstationary processes
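As a rough illustration of the architecture class the paper analyzes, fully connected feedforward ReLU networks whose width and depth grow with the sample size, here is a minimal pure-Python sketch. The growth rates, random weight initialization, and function names are illustrative assumptions, not the paper's specification.

```python
import random


def relu(v):
    """Rectified linear unit applied elementwise to a vector."""
    return [max(0.0, x) for x in v]


def dense(x, weights, bias):
    """Fully connected layer: weights is a list of rows, one per output unit."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]


def make_network(n_inputs, width, depth, rng):
    """Build a random fully connected ReLU network with unbounded weights."""
    layers = []
    fan_in = n_inputs
    for _ in range(depth):
        layers.append((
            [[rng.gauss(0, 1) for _ in range(fan_in)] for _ in range(width)],
            [0.0] * width,
        ))
        fan_in = width
    # Linear (no activation) scalar output layer for regression.
    layers.append(([[rng.gauss(0, 1) for _ in range(fan_in)]], [0.0]))
    return layers


def forward(layers, x):
    for weights, bias in layers[:-1]:
        x = relu(dense(x, weights, bias))
    return dense(x, *layers[-1])[0]


# In sieve estimation the network class expands with sample size n;
# the n**0.25 growth below is purely illustrative.
n = 10000
width = max(1, int(2 * n ** 0.25))
depth = max(1, int(n ** 0.25) // 2)
net = make_network(n_inputs=3, width=width, depth=depth, rng=random.Random(42))
out = forward(net, [0.1, -0.2, 0.3])
```

The key point mirrored here is that the sieve is indexed by n: larger samples permit wider and deeper networks, which is what drives the convergence-rate analysis.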
Procedia PDF Downloads 41
25711 An Evaluation of Neuropsychiatric Manifestations in Systemic Lupus Erythematosus Patients in Saudi Arabia and Their Associated Factors
Authors: Yousef M. Alammari, Mahmoud A. Gaddoury, Reem A. Almohaini, Sara A. Alharbi, Lena S. Alsaleem, Lujain H. Allowaihiq, Maha H. Alrashid, Abdullah H. Alghamdi, Abdullah A. Alaryni
Abstract:
Objective: The goal of this study was to establish the prevalence of neuropsychiatric symptoms in systemic lupus erythematosus (NPSLE) patients in Saudi Arabia and the variables linked to them. Methods: During June 2021, this cross-sectional study was carried out among SLE patients in Saudi Arabia. The Saudi Rheumatism Association used social media platforms to distribute a self-administered online questionnaire to SLE patients. All data analyses were performed using the Statistical Package for the Social Sciences (SPSS) version 26. Results: Two hundred and five SLE patients participated in the study (91.3% female vs. 8.7% male). In addition, 13.5% of patients had a family history of SLE, and 26% had had SLE for one to three years. Alteration or loss of sensation (53.4%), fear (52.4%), and headache (48.1%) were the most prevalent neuropsychiatric symptoms in NPSLE patients. The prevalence of patients with NPSLE was 40%. In a multivariate regression model, fear, altered sensation, cerebrovascular illness, sleep disruption, and diminished interest in routine activities were identified as independent risk variables for NPSLE. Conclusion: Nearly half of SLE patients demonstrated neuropsychiatric manifestations, with significant symptoms including fear, alteration of sensation, cerebrovascular disease, sleep disturbance, and reduced interest in normal activities. To elucidate the pathophysiology of NPSLE, it is necessary to understand the relationship between neuropsychiatric morbidity and other relevant rheumatic disorders in the SLE population.
Keywords: neuropsychiatric, systemic lupus erythematosus, NPSLE, prevalence, SLE patients
Procedia PDF Downloads 75
25710 Development of Risk Management System for Urban Railroad Underground Structures and Surrounding Ground
Authors: Y. K. Park, B. K. Kim, J. W. Lee, S. J. Lee
Abstract:
To assess the risk of underground structures and the surrounding ground, we collect basic data through engineering methods of measurement, exploration, and surveys, and derive the risk through proper analysis and assessment of urban railroad underground structures and the surrounding ground, including station inflow. Basic data are obtained from fiber-optic sensors, MEMS sensors, water quantity/quality sensors, a tunnel scanner, ground penetrating radar, and a light weight deflectometer, and are evaluated to determine whether they exceed proper values. Based on these data, we analyze the risk level of urban railroad underground structures and the surrounding ground, and we develop a risk management system to manage these data efficiently and to provide a convenient interface for data input and output.
Keywords: urban railroad, underground structures, ground subsidence, station inflow, risk
Procedia PDF Downloads 336
25709 Evaluation of the Physico-Chemical and Microbial Properties of the Compost Leachate (CL) to Assess Its Role in the Bioremediation of Polyaromatic Hydrocarbons (PAHs)
Authors: Omaima A. Sharaf, Tarek A. Moussa, Said M. Badr El-Din, H. Moawad
Abstract:
Background: Polycyclic aromatic hydrocarbons (PAHs) pose great environmental and human health concerns due to their widespread occurrence, persistence, and carcinogenic properties. PAH releases from anthropogenic activities have led to higher concentrations of these contaminants in the wider environment than would be expected from natural processes alone. This may result in a wide range of environmental problems accumulating in agricultural ecosystems, threatening a negative impact on sustainable agricultural development. Thus, this study aimed to evaluate the physico-chemical and microbial properties of compost leachate (CL) to assess its role as a nutrient and microbial source (biostimulation/bioaugmentation) for developing a cost-effective bioremediation technology for PAH-contaminated sites. Material and Methods: PAH-degrading bacteria were isolated from CL collected from a composting site located in central Scotland, UK. Isolation was carried out by enrichment using phenanthrene (PHR), pyrene (PYR), and benzo(a)pyrene (BaP) as the sole sources of carbon and energy. The isolates were characterized using a variety of phenotypic and molecular properties. Six different isolates were identified based on differences in morphological and biochemical tests. The efficiency of these isolates in PAH utilization was assessed. Further analysis was performed to define the taxonomical status and phylogenetic relations between the most potent PAH-utilizing bacterial strains and other standard strains, using a molecular approach based on partial 16S rDNA gene sequence analysis.
Results: The 16S rDNA sequence analysis confirmed the results of biochemical identification, as both biochemical and molecular identification assigned the isolates to Bacillus licheniformis, Pseudomonas aeruginosa, Alcaligenes faecalis, Serratia marcescens, Enterobacter cloacae, and Providencia, which were identified as the prominent PAH utilizers isolated from CL. Conclusion: This study indicates that the CL samples contain a diverse population of PAH-degrading bacteria and that the use of CL may have potential for bioremediation of PAH-contaminated sites.
Keywords: polycyclic aromatic hydrocarbons, physico-chemical analyses, compost leachate, microbial and biochemical analyses, phylogenetic relations, 16S rDNA sequence analysis
Procedia PDF Downloads 263
25708 The Influence of Service Quality on Customer Satisfaction and Customer Loyalty at a Telecommunication Company in Malaysia
Authors: Noor Azlina Mohamed Yunus, Baharom Abd Rahman, Abdul Kadir Othman, Narehan Hassan, Rohana Mat Som, Ibhrahim Zakaria
Abstract:
Customer satisfaction and customer loyalty are the most important outcomes of marketing, in which both elements serve various stages of consumer buying behavior. Excellent service quality has become a major corporate goal as more companies gradually strive for quality in their products and services. Therefore, the main purpose of this study is to investigate the influence of service quality on customer satisfaction and customer loyalty at one telecommunication company in Malaysia, Telekom Malaysia. The scope of this research is to evaluate satisfaction with the products and services at TMpoint Bukit Raja, Malaysia. The data were gathered through the distribution of questionnaires to a total of 306 respondents who visited and used the products or services. Using correlation and multiple regression analyses, the results revealed a positive and significant relationship between service quality and customer satisfaction. The most influential factor on customer satisfaction was empathy, followed by reliability, assurance, and tangibles. However, there was no significant influence of responsiveness on customer satisfaction. The results also showed a positive and significant relationship between service quality and customer loyalty. The most influential factor on customer loyalty was assurance, followed by reliability and tangibles. TMpoint Bukit Raja is recommended to devise excellent strategies to satisfy customers' needs and to adopt an action-oriented approach by focusing on what customers want. It is also recommended that similar studies be carried out in other industries using different methodologies, such as a longitudinal method, a larger sample size, or a qualitative approach.
Keywords: customer satisfaction, customer loyalty, service quality, telecommunication company
Procedia PDF Downloads 453
25707 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
An intelligent transportation system is essential to build smarter cities. Machine learning based transportation prediction is a highly promising approach because it makes invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that the existing headway model cannot respond to dynamic transportation conditions; thus, bus delay problems often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay using machine learning on the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built on real-time bus data gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is built with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase the prediction accuracy of bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
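The "interval pattern model" idea, grouping observed bus delays by traffic environment factors so that a new trip can be matched to a historical pattern, can be sketched as follows. This is a simplified stand-in for the RapidMiner workflow described above; the record fields and bucketing keys are hypothetical.

```python
from collections import defaultdict
from statistics import mean


def build_delay_patterns(records):
    """Group observed bus delays by (hour, weather) to form interval patterns.

    records: list of dicts with keys 'hour', 'weather', 'delay_min'.
    Returns a dict mapping each (hour, weather) bucket to its mean delay.
    """
    buckets = defaultdict(list)
    for r in records:
        buckets[(r["hour"], r["weather"])].append(r["delay_min"])
    return {key: mean(vals) for key, vals in buckets.items()}


def predict_delay(patterns, hour, weather, default=0.0):
    """Look up the expected delay for a condition; fall back if unseen."""
    return patterns.get((hour, weather), default)


# Toy observations: rush-hour rain produces the largest delays.
records = [
    {"hour": 8, "weather": "rain", "delay_min": 7.0},
    {"hour": 8, "weather": "rain", "delay_min": 9.0},
    {"hour": 8, "weather": "clear", "delay_min": 2.0},
    {"hour": 14, "weather": "clear", "delay_min": 1.0},
]
patterns = build_delay_patterns(records)
rush_rain = predict_delay(patterns, 8, "rain")
```

A flexible headway model would then shift scheduled departures by the predicted delay for the matching bucket rather than using a single fixed interval.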
Procedia PDF Downloads 260
25706 Integrated Model for Enhancing Data Security Performance in Cloud Computing
Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali
Abstract:
Cloud computing is an important and promising field of the recent decade. Cloud computing allows sharing resources, services, and information among people all over the world. Although the advantages of using clouds are great, there are many risks in a cloud. Data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users, and save users' time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For the user authentication process, a username and password are used; the password is protected with SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files to and from the cloud in secure form.
Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish
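Blowfish encryption requires a third-party library (e.g., PyCryptodome), but the SHA-2 pieces of the proposed model, one-way password hashing for authentication and digests for data integrity, can be sketched with Python's standard library alone. The salting scheme and function names below are illustrative assumptions, not the paper's exact design.

```python
import hashlib
import hmac
import os


def hash_password(password, salt=None):
    """One-way password hashing with SHA-256 and a random salt.

    Returns (salt, hex digest); only these are stored, never the password.
    """
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    return salt, digest


def verify_password(password, salt, stored_digest):
    """Re-hash the attempt and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)


def integrity_tag(data):
    """SHA-256 digest used to detect tampering with an uploaded file."""
    return hashlib.sha256(data).hexdigest()


# Authentication: store (salt, digest), then check a login attempt.
salt, stored = hash_password("s3cret")
ok = verify_password("s3cret", salt, stored)
bad = verify_password("wrong", salt, stored)

# Integrity: compute a tag at upload, recompute and compare at download.
payload = b"report.pdf contents"
tag = integrity_tag(payload)
intact = integrity_tag(payload) == tag
tampered = integrity_tag(payload + b"x") == tag
```

In the full model, the Blowfish-encrypted ciphertext would be what gets tagged and uploaded, so a mismatch at download signals either corruption or tampering.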
Procedia PDF Downloads 477
25705 Setting Ground for Improvement of Knowledge Management System in the Educational Organization
Authors: Mladen Djuric, Ivan Janicijevic, Sasa Lazarevic
Abstract:
One of the organizational issues is how to develop and shape decision making and knowledge management systems that continually avoid the traps of both 'paralysis by analysis' and 'extinction by instinct', concepts that act as tolerance-limit anti-patterns defining what we can call the decision making and knowledge management patterns control zone. This paper discusses the potential for developing a core base for recognizing, capturing, and analyzing anti-patterns in the educational organization, thus creating a space for improving decision making and knowledge management processes in education.
Keywords: anti-patterns, decision making, education, knowledge management
Procedia PDF Downloads 632
25704 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making such as selecting members for a game and setting game strategy based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA basketball league, where the world's highest-level players gather, teams analyze data using various statistical techniques to win games. However, it is difficult to analyze game data for each play, such as ball tracking or the motion of players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task considered difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for player replacement are whether or not the lineup should be changed and whether or not a Small Ball lineup should be adopted. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, we can accumulate scoring data for each play, which indicates a player's contribution to the game, and this scoring data can be treated as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a prediction model of the score. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
Procedia PDF Downloads 133
25703 Challenges in Multi-Cloud Storage Systems for Mobile Devices
Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta
Abstract:
The demand for cloud storage is increasing because users want continuous access to their data. Cloud storage has revolutionized the way users access their data. Many cloud storage service providers are available, such as Dropbox and Google Drive, offering limited free storage; for extra storage, users have to pay, which acts as a burden on them. To avoid the issue of limited free storage, the concept of multi-cloud storage was introduced. In this paper, we discuss the limitations of existing multi-cloud storage systems for mobile devices.
Keywords: cloud storage, data privacy, data security, multi cloud storage, mobile devices
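One common multi-cloud storage scheme is striping a file's chunks across several providers' free tiers, so no single provider holds the whole file or exhausts its quota. The following minimal sketch illustrates the idea; the provider names, chunk size, and functions are purely illustrative, and a real system would also handle authentication, metadata persistence, and provider failures.

```python
def stripe_upload(data, providers, chunk_size=4):
    """Split a file's bytes into chunks and distribute them round-robin
    across several cloud providers."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    placement = []  # ordered (provider, chunk) pairs needed for reassembly
    for i, chunk in enumerate(chunks):
        placement.append((providers[i % len(providers)], chunk))
    return placement


def stripe_download(placement):
    """Reassemble the file by concatenating chunks in placement order."""
    return b"".join(chunk for _, chunk in placement)


providers = ["dropbox", "gdrive", "onedrive"]
data = b"a small file stored across clouds"
placement = stripe_upload(data, providers)
restored = stripe_download(placement)
used = {name for name, _ in placement}
```

The chunk ordering metadata is itself a limitation of such systems on mobile devices: it must be stored and synchronized somewhere, and losing it (or any one provider) makes the file unrecoverable without added redundancy.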
Procedia PDF Downloads 699
25702 Dyadic Effect of Emotional Focused Psycho Educational Intervention on Spousal Emotional Abuse and Marital Satisfaction among Elderly Couples
Authors: Maryam Hazrati, Tengku Aizan Hamid, Rahimah Ibrahim, Siti Aishah Hassan, Farkhondeh Sharif, Zahra Bagheri
Abstract:
Background: Emotional abuse is the most common type of spousal abuse. In a long-term marriage lasting several decades, a couple faces greater vulnerability due to illness, disability, and dependence. Emotional abuse can have a devastating impact on victims, leading to low self-esteem, depression, anxiety, and post-traumatic stress disorder. Research Aim: The aim of this study was to investigate the effects of an emotional-focused psychoeducational intervention (EFPEI) on emotional abuse and marital satisfaction among older couples, and to examine the dyadic effects of each partner’s emotional abuse behaviors (EAB) on his/her marital satisfaction (MS) in Shiraz, Iran. Methodology: The study was a randomized controlled trial (RCT). A total of 57 eligible couples were randomly assigned to either the experimental group or the control group. The experimental group received the EFPEI, which consisted of 12 sessions, each lasting 90 minutes. The control group did not receive any intervention. Data were collected using a demographic questionnaire, the Multidimensional Measure of Emotional Abuse (MMEAQ), and the Marital Satisfaction Questionnaire for Older People (MSQFOP). The data were analyzed using a variety of statistical methods, including repeated measures ANOVA, path analysis, and correlational analyses. Findings: The results of the study showed that the EFPEI was effective in reducing emotional abuse and increasing marital satisfaction among older couples. Specifically, at the end of the intervention the mean scores for emotional abuse were significantly lower, and those for marital satisfaction significantly higher, in the experimental group than in the control group. These effects were maintained at a 3-month follow-up. Moreover, the dyadic analysis revealed that husbands’ EAB had no significant effect on their own marital satisfaction but a significant negative partner effect, while wives’ EAB had significant negative actor and partner effects.
Conclusion: The findings of this study support the use of the EFPEI as an effective intervention for decreasing emotional abuse and improving marital satisfaction among older adults. The EFPEI is a short-term, evidence-based intervention that can be delivered by trained professionals. It focuses on helping couples improve their communication skills, resolve conflict, and build a stronger emotional connection.
Keywords: spouse abuse, emotion, aged, satisfaction, dyadic effect
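The actor and partner effects reported above can be illustrated with a minimal actor–partner interdependence (APIM-style) sketch on synthetic data. The effect sizes below are hypothetical, chosen only to reproduce the reported sign pattern; they are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 57  # number of couples, matching the study's sample size

# Synthetic standardized emotional abuse behaviour (EAB) scores
# for husbands and wives (illustrative, not the study's data).
eab_h = rng.normal(size=n)
eab_w = rng.normal(size=n)

# Simulate satisfaction with the pattern reported above: husbands' EAB has
# a partner effect only; wives' EAB has both actor and partner effects.
sat_h = -0.4 * eab_w + rng.normal(scale=0.5, size=n)                 # husband MS
sat_w = -0.5 * eab_w - 0.3 * eab_h + rng.normal(scale=0.5, size=n)   # wife MS

def apim(sat, eab_actor, eab_partner):
    """OLS estimates of actor and partner effects for one dyad member."""
    X = np.column_stack([np.ones_like(sat), eab_actor, eab_partner])
    coef, *_ = np.linalg.lstsq(X, sat, rcond=None)
    return {"intercept": coef[0], "actor": coef[1], "partner": coef[2]}

husband = apim(sat_h, eab_h, eab_w)   # actor effect ~ 0, partner effect negative
wife = apim(sat_w, eab_w, eab_h)      # actor and partner effects both negative
print(husband, wife)
```

A full dyadic analysis would account for the non-independence of partners (e.g. via structural equation modeling, as path analysis does in the study); the OLS sketch only shows how actor and partner effects are separated.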
Procedia PDF Downloads 84
25701 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches
Authors: Wuttigrai Ngamsirijit
Abstract:
Talent management in today’s modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to make effective talent management decisions. These include process-based data and records; insufficient human capital-related measures and metrics; a lack of capability in strategic data modeling; and time-consuming manual aggregation before decisions can be made. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics for strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, gaps in managing talent and the organization, and ways to develop optimized talent strategies.
Keywords: decision making, human capital analytics, talent management, talent value chain
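As a minimal sketch of the kind of predictive model such a framework recommends, the following example fits a logistic attrition-risk model to synthetic human-capital metrics by plain gradient descent. The features, effect sizes, and data are all hypothetical; this is an illustration of the modeling step, not the paper's framework.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Hypothetical standardized human-capital metrics per employee:
# engagement score, performance rating, months since last promotion.
X = rng.normal(size=(n, 3))
true_w = np.array([-1.2, -0.8, 0.9])   # low engagement / stalled promotion -> risk
p = 1 / (1 + np.exp(-(X @ true_w)))
left = (rng.random(n) < p).astype(float)  # 1 = employee left

# Logistic regression by gradient descent (no external ML library).
w, b = np.zeros(3), 0.0
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (pred - left) / n)
    b -= 0.5 * np.mean(pred - left)

def attrition_risk(features):
    """Predicted probability that an employee with these metrics leaves."""
    return float(1 / (1 + np.exp(-(features @ w + b))))

print(np.round(w, 2))
```

A prescriptive step would then rank interventions by their predicted effect on this risk score.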
Procedia PDF Downloads 187
25700 Environmental Restoration Science in New York Harbor - Community Based Restoration Science Hubs, or “STEM Hubs”
Authors: Lauren B. Birney
Abstract:
The project utilizes the Billion Oyster Project (BOP-CCERS) place-based “restoration through education” model to promote computational thinking in NYC high school teachers and their students. Key learning standards such as Next Generation Science Standards and the NYC CS4All Equity and Excellence initiative are used to develop a computer science curriculum that connects students to their Harbor through hands-on activities based on BOP field science and educational programming. Project curriculum development is grounded in BOP-CCERS restoration science activities and data collection, which are enacted by students and educators at two Restoration Science STEM Hubs or conveyed through virtual materials. New York City Public School teachers with relevant experience are recruited as consultants to provide curriculum assessment and design feedback. The completed curriculum units are then conveyed to NYC high school teachers through professional learning events held at the Pace University campus and led by BOP educators. In addition, Pace University educators execute the Summer STEM Institute, an intensive two-week computational thinking camp centered on applying data analysis tools and methods to BOP-CCERS data. Both qualitative and quantitative analyses were performed throughout the five-year study. STEM+C – Community Based Restoration STEM Hubs. STEM Hubs are active scientific restoration sites capable of hosting school and community groups of all grade levels and professional scientists and researchers conducting long-term restoration ecology research. The STEM Hubs program has grown to include 14 STEM Hubs across all five boroughs of New York City and focuses on bringing in-field monitoring experience as well as coastal classroom experience to students. Restoration Science STEM Hubs activities resulted in: the recruitment of 11 public schools, 6 community groups, 12 teachers, and over 120 students receiving exposure to BOP activities. 
Field science protocols were designed exclusively around the use of the Oyster Restoration Station (ORS), a small-scale in situ experimental platform suspended from a dock or pier. The ORS is intended to be used and “owned” by an individual school, teacher, class, or group of students, whereas the STEM Hub is explicitly designed as a collaborative space for large-scale community-driven restoration work and in-situ experiments. The ORS is also an essential tool in gathering Harbor data from disparate locations and instilling ownership of the research process amongst students. As such, it will continue to be used in that way. New and previously participating students will continue to deploy and monitor their own ORS, uploading data to the digital platform and conducting analysis of their own harbor-wide datasets. Programming the STEM Hub will necessitate establishing working relationships between schools and local research institutions. NYHF will provide introductions and the facilitation of initial workshops in school classrooms. However, once a particular STEM Hub has been established as a space for collaboration, each partner group, school, university, or CBO will schedule its own events at the site using the digital platform’s scheduling and registration tool. Monitoring of research collaborations will be accomplished through the platform’s research publication tool and has thus far provided valuable information on the projects’ trajectory, strategic plan, and pathway.
Keywords: environmental science, citizen science, STEM, technology
Procedia PDF Downloads 96
25699 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (fcm) is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem in which the minimization of a given cost function is studied. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function which takes values in the interval [0, 1]. In fcm clustering, the membership degrees are constrained by the condition that the sum of a data object’s memberships over all clusters must be equal to one. This constraint can cause several problems, especially when the data objects lie in a noisy space. The regularization approach plays a part in the fuzzy c-means clustering technique: it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where our optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
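One common formulation of entropy-regularized fuzzy clustering adds a term of the form λ·Σᵢⱼ uᵢⱼ log uᵢⱼ to the cost, which, together with the row-sum constraint, yields a closed-form softmax membership update. The sketch below is a minimal NumPy illustration of that update on synthetic blobs, not the authors' implementation; the initialization and parameter values are illustrative.

```python
import numpy as np

def entropy_fcm(X, k, lam=1.0, iters=100):
    """Fuzzy clustering minimizing sum_ij u_ij * d_ij^2 + lam * u_ij * log u_ij
    subject to each row of the membership matrix U summing to one."""
    # Farthest-point initialization keeps the demo deterministic.
    centers = [X[0]]
    for _ in range(1, k):
        d2min = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[int(d2min.argmax())])
    centers = np.array(centers)
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        U = np.exp(-d2 / lam)                 # closed-form update: a softmax
        U /= U.sum(axis=1, keepdims=True)     # over -d^2 / lam, row-normalized
        centers = (U.T @ X) / U.sum(axis=0)[:, None]  # membership-weighted means
    return U, centers

# Two well-separated Gaussian blobs as a synthetic check.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
U, centers = entropy_fcm(X, k=2)
labels = U.argmax(axis=1)
print(centers.round(1))
```

The regularization weight λ controls the fuzziness: small λ makes memberships nearly crisp, large λ spreads each point's membership across clusters.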
Procedia PDF Downloads 259
25698 Effect of Rotation on Love Wave Propagation in Piezoelectric Medium with Corrugation
Authors: Soniya Chaudhary
Abstract:
The present study analyses the propagation of Love waves in a rotating piezoelectric layer lying over an elastic substrate with corrugated boundaries. The appropriate solutions in the considered medium satisfy the required boundary conditions to obtain the dispersion relation of the Love wave for both the charge-free and the electrically shorted case. The effects of rotation on the non-dimensional speed of the Love wave are shown graphically. In addition to the classical case, some existing results are deduced as particular cases of the present study. The present study may be useful in rotation sensors and SAW devices.
Keywords: corrugation, dispersion relation, love wave, piezoelectric
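In the classical limit mentioned above (no rotation, no piezoelectric coupling, flat boundaries), the Love-wave dispersion relation can be solved numerically. The sketch below finds the fundamental-mode phase speed by bisection on the first branch; the material parameters are purely illustrative.

```python
import math

# Classical Love-wave dispersion: a layer (shear speed b1, rigidity mu1,
# thickness h) over a half-space (b2, mu2), with phase speed b1 < c < b2
# satisfying
#   tan(k*h*sqrt(c^2/b1^2 - 1)) = (mu2/mu1) * sqrt(1 - c^2/b2^2) / sqrt(c^2/b1^2 - 1)
b1, mu1 = 2000.0, 3.0e10   # layer (illustrative values)
b2, mu2 = 4000.0, 7.0e10   # substrate
kh = 1.0                   # dimensionless wavenumber * thickness

def f(c):
    q = math.sqrt(c * c / (b1 * b1) - 1.0)
    s = math.sqrt(1.0 - c * c / (b2 * b2))
    return math.tan(kh * q) - (mu2 / mu1) * s / q

# Bisection on the first branch, where kh*q runs from 0 to pi/2:
# f < 0 at the lower end and f > 0 near the branch limit.
lo = b1 * 1.000001
hi = b1 * math.sqrt(1.0 + (math.pi / (2 * kh)) ** 2) * 0.999999
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid
print(round(0.5 * (lo + hi), 1))  # fundamental-mode phase speed in m/s
```

Higher modes live on subsequent branches of the tangent and can be bracketed the same way.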
Procedia PDF Downloads 225
25697 Comparison of Instantaneous Short Circuit versus Step DC Voltage to Determine PMG Inductances
Authors: Walter Evaldo Kuchenbecker, Julio Carlos Teixeira
Abstract:
Since efficiency became a challenge, with the drive to reduce the energy consumption of all electrical machine applications, the permanent magnet machine has emerged as a better option because of its performance, robustness, and simple control. Even though electrical machines were developed through analyses of magnetic effects, permanent magnet machines are still not well mastered. As permanent magnet machines become popular in most applications, the pressure to standardize this type of electrical machine increases. However, due to this limited mastery, there is still no standard for manufacturing, testing, and applying them. In order to determine the inductances of the machine, a new method is proposed.
Keywords: permanent magnet generators (pmg), synchronous machine parameters, test procedures, inductances
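For context, one standard way to extract a winding inductance from a step DC voltage test is to fit the exponential current rise i(t) = (V/R)(1 − e^(−tR/L)) and recover L from the time constant τ = L/R. The sketch below does this on synthetic data with illustrative values; it is not the new method proposed in the paper.

```python
import numpy as np

# Step DC voltage test: apply V to a stationary winding with resistance R
# and record the current rise i(t) = (V/R) * (1 - exp(-t*R/L)).
V, R, L_true = 12.0, 2.0, 0.05           # volts, ohms, henries (illustrative)
t = np.linspace(0, 0.2, 400)
i = (V / R) * (1 - np.exp(-t * R / L_true))
i_noisy = i + np.random.default_rng(0).normal(scale=0.01, size=t.size)

# Linearize: log(1 - i*R/V) = -t/tau, then get tau from a least-squares slope.
i_inf = V / R
mask = i_noisy < 0.95 * i_inf            # keep points where the log is stable
y = np.log(1 - i_noisy[mask] / i_inf)
slope = np.polyfit(t[mask], y, 1)[0]
L_est = -R / slope                       # tau = -1/slope, so L = R * tau
print(round(L_est, 3))
```

The instantaneous short-circuit test compared in the paper instead infers the (sub)transient inductances from the envelope of the short-circuit current; the fitting idea is analogous.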
Procedia PDF Downloads 303
25696 Sampled-Data Model Predictive Tracking Control for Mobile Robot
Authors: Wookyong Kwon, Sangmoon Lee
Abstract:
In this paper, a sampled-data model predictive tracking control method is presented for mobile robots modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality (LMI) approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV
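The receding-horizon idea behind model predictive tracking can be sketched without the LMI and Lyapunov machinery of the paper: at every sampling instant, minimize a finite-horizon tracking cost over the input sequence and apply only the first input. The toy example below tracks a position setpoint with a discretized double-integrator model; all values are illustrative and none of this reproduces the paper's LPV design.

```python
import numpy as np

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # sampled-data model, state = [pos, vel]
B = np.array([[0.5 * dt * dt], [dt]])
N = 20                                  # prediction horizon (illustrative)
ref = 1.0                               # position setpoint
R_pen = 0.1                             # input penalty

def mpc_step(x):
    """Minimize sum_k (pos_k - ref)^2 + R*u_k^2 over the horizon; apply u_0."""
    # Stack predictions: position at step k+1 is F[k] @ x + G[k] @ U.
    F = np.zeros((N, 2))
    G = np.zeros((N, N))
    Ak = np.eye(2)
    for k in range(N):
        Ak = A @ Ak                                   # A^(k+1)
        F[k] = Ak[0]                                  # position row of A^(k+1)
        for j in range(k + 1):
            G[k, j] = (np.linalg.matrix_power(A, k - j) @ B)[0, 0]
    # Penalized least squares: min ||F x + G U - ref||^2 + R ||U||^2.
    H = G.T @ G + R_pen * np.eye(N)
    U = np.linalg.solve(H, G.T @ (ref - F @ x))
    return float(U[0])

x = np.array([0.0, 0.0])                # start at rest, away from the setpoint
for _ in range(150):                    # 15 s of closed loop
    u = mpc_step(x)
    x = A @ x + B[:, 0] * u
print(x.round(3))
```

The paper's contribution is the design condition guaranteeing stability of such a receding-horizon law under sampling and parameter variation, which this unconstrained sketch does not address.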
Procedia PDF Downloads 309
25695 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data
Authors: Nasser A. Al-Azri
Abstract:
The effectiveness of passive cooling techniques is assessed using bioclimatic charts, which require the typical meteorological year (TMY) of a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, an essential component in developing common TMYs intended for general use. Since solar radiation is not required in the development of the bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes the development of the TMY more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
Keywords: bioclimatic charts, passive cooling, TMY, weather data
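TMY construction typically selects, for each calendar month, the historical month whose empirical distribution best matches the long-term distribution, as measured by the Finkelstein–Schafer statistic. The sketch below illustrates this selection for a single parameter on synthetic data; the paper's exact parameter set and weighting scheme are not reproduced.

```python
import numpy as np

def fs_statistic(candidate, long_term):
    """Finkelstein-Schafer statistic: mean absolute difference between the
    candidate month's empirical CDF and the long-term CDF, evaluated at the
    candidate's own data points."""
    lt = np.sort(long_term)
    cand = np.sort(candidate)
    cdf_lt = np.searchsorted(lt, cand, side="right") / lt.size
    cdf_cand = np.arange(1, cand.size + 1) / cand.size
    return np.mean(np.abs(cdf_cand - cdf_lt))

# Synthetic example: 10 years of daily-mean dry-bulb temperature for January.
rng = np.random.default_rng(0)
years = [rng.normal(loc=15 + drift, scale=3, size=31)
         for drift in rng.normal(scale=1.0, size=10)]
long_term = np.concatenate(years)

# The typical January is the year whose January best matches the long-term
# CDF; a full TMY repeats this per month over several weighted parameters.
fs = [fs_statistic(y, long_term) for y in years]
typical_year = int(np.argmin(fs))
print(typical_year, round(min(fs), 3))
```

For the bioclimatic-chart application described above, the relevant parameters would be those the chart actually uses (e.g. temperature and humidity), each scored this way and combined with weights.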
Procedia PDF Downloads 240
25694 Control of the Sustainability of Decorative Topping for Bakery in Order to Extend the Shelf-Life of the Product
Authors: Radovan Čobanović, Milica Rankov Šicar
Abstract:
In the modern bakery, various toppings are used to attract more customers. The analyzed decorative topping consists of flax seeds, corn grits, oatmeal, wheat flakes, sesame seeds, sunflower seeds, and soybean sprouts, and is used as a decoration for bread. Our goal was to extend the product's shelf life based on these analyses. According to the sustainability plan, a sample whose shelf life had already expired was stored for 5 months at 25°C and analyzed every month from the day of reception until spoilage occurred. Samples were subjected to sensory analysis (appearance, odor, taste, color, and consistency), microbiological analysis (Salmonella spp., Bacillus cereus, Enterobacteriaceae, and moulds), and chemical analysis (free fatty acids (as oleic), peroxide number, water content, and degree of acidity). All analyses were performed according to the following standards: sensory analysis ISO 6658, Salmonella spp. ISO 6579, Bacillus cereus ISO 7932, Enterobacteriaceae ISO 21528-2, moulds ISO 21527-1, free fatty acids (as oleic) ISO 660, peroxide number ISO 3960, and water content and degree of acidity according to the Serbian ordinance on the methods of chemical analysis. After five months of storage, the first changes in the sensory properties of the product appeared: worms and web-like formations linking seeds and cereals were visible in the sample, which smelled rancid and pungent. The results of the microbiological analysis showed that Salmonella spp. was not detected and Enterobacteriaceae remained < 10 cfu/g during all 5 months, but in the fifth month Bacillus cereus and moulds occurred at 700 cfu/g and 1500 cfu/g, respectively. Chemical analyses showed that the water content did not exceed the maximum of 14%. The content of free fatty acids ranged from 3.06 to 3.26%, and the degree of acidity from 3.69 to 4.9.
With an increasing degree of acidity, the degradation of the sample and the activity of microorganisms increased, leading to an acid reaction accompanied by an unpleasant odor and taste. Based on the obtained results, it can be concluded that the shelf life of this product can be extended by four months beyond the currently defined shelf life, since until then no changes occur that could influence customers' purchasing decisions.
Keywords: bakery products, extension of shelf life, sensory and chemical and microbiological analyses, sustainability
Procedia PDF Downloads 388
25693 The Development of Congeneric Elicited Writing Tasks to Capture Language Decline in Alzheimer Patients
Authors: Lise Paesen, Marielle Leijten
Abstract:
People diagnosed with probable Alzheimer's disease suffer from an impairment of their language capacities; a gradual impairment which affects both their spoken and written communication. Our study aims at characterising the language decline in DAT patients with the use of congeneric elicited writing tasks. Within these tasks, a descriptive text has to be written based upon images with which the participants are confronted. A randomised set of images allows us to present the participants with a different task on every encounter, thus avoiding a recognition effect in this iterative study. This method is a revision of previous studies, in which participants were presented with a larger picture depicting an entire scene. In order to create the randomised set of images, existing pictures were adapted following strict criteria (e.g. frequency, AoA, colour, ...). The resulting data set contained 50 images belonging to several categories (vehicles, animals, humans, and objects). A pre-test was constructed to validate the created picture set; since most images had been used before in spoken picture naming tasks, the same reaction times ought to be triggered in the typed picture naming task. Once validated, the effectiveness of the descriptive tasks was assessed. First, the participants (n=60 students, n=40 healthy elderly) performed a typing task, which provided information about each individual's typing speed. Secondly, two descriptive writing tasks were carried out, one simple and one complex. The simple task contains 4 images (1 animal, 2 objects, 1 vehicle), using only elements with a high frequency, a young AoA (<6 years), and fast reaction times. Slow reaction times, a later AoA (≥ 6 years), and a low frequency were the criteria for the complex task, which uses 6 images (2 animals, 1 human, 2 objects, and 1 vehicle). The data were collected with the keystroke logging programme Inputlog.
Keystroke logging tools log and time-stamp keystroke activity to reconstruct and describe text production processes. The data were analysed using a selection of writing process and product variables, such as general writing process measures, detailed pause analysis, linguistic analysis, and text length. As a covariate, the intrapersonal interkey transition times from the typing task were taken into account. The pre-test indicated that the new images led to similar or even faster reaction times compared to the original images. All the images were therefore used in the main study. The texts produced in the description tasks were significantly longer than in previous studies, providing sufficient text and process data for analyses. Preliminary analysis shows that the number of words produced differed significantly between the healthy elderly and the students, as did the mean length of production bursts, even though both groups needed the same time to produce their texts. However, the elderly took significantly more time to produce the complex task than the simple task. Nevertheless, the number of words per minute remained comparable between the simple and complex tasks. The pauses within and before words varied, even when taking personal typing abilities (obtained from the typing task) into account.
Keywords: Alzheimer's disease, experimental design, language decline, writing process
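The pause and production-burst measures used above can be sketched from a timestamped keystroke log. The example below uses a hypothetical log and a 2000 ms pause threshold, a common convention in writing research, which is not necessarily the threshold used in this study.

```python
# A minimal sketch of pause and production-burst measures computed from a
# timestamped keystroke log (timestamps in milliseconds).
PAUSE_MS = 2000  # hypothetical threshold, not necessarily the study's

def burst_analysis(log):
    """Split a [(char, timestamp_ms), ...] log into production bursts at
    pauses >= PAUSE_MS; return interkey, pause, and burst-length measures."""
    interkey = [t2 - t1 for (_, t1), (_, t2) in zip(log, log[1:])]
    pauses = [d for d in interkey if d >= PAUSE_MS]
    bursts, current = [], 1
    for d in interkey:
        if d >= PAUSE_MS:
            bursts.append(current)   # a pause closes the current burst
            current = 1
        else:
            current += 1
    bursts.append(current)
    return {
        "mean_interkey": sum(interkey) / len(interkey),
        "n_pauses": len(pauses),
        "mean_burst_len": sum(bursts) / len(bursts),
    }

# Hypothetical log: "the cat" typed with one long pause before "cat".
log = list(zip("the cat", [0, 150, 300, 450, 3000, 3160, 3320]))
print(burst_analysis(log))
```

Pause location (within vs. before words, as analysed above) would additionally require aligning each interkey interval with token boundaries in the produced text.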
Procedia PDF Downloads 274
25692 Turing Pattern in the Oregonator Revisited
Authors: Elragig Aiman, Dreiwi Hanan, Townley Stuart, Elmabrook Idriss
Abstract:
In this paper, we reconsider the analysis of the Oregonator model. We highlight an error in this analysis which leads to an incorrect depiction of the parameter region in which diffusion-driven instability is possible. We believe that the cause of the oversight is the complexity of stability analyses based on eigenvalues and the parameter dependence of the matrix minors appearing in stability calculations. We regenerate the parameter space where Turing patterns can be seen, and we use the common Lyapunov function (CLF) approach, which is numerically reliable, to further confirm the dependence of the results on the intensities of the diffusion coefficients.
Keywords: diffusion driven instability, common Lyapunov function (CLF), turing pattern, positive-definite matrix
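The diffusion-driven instability at issue here can be checked numerically: the steady state must be stable without diffusion, yet the linearization J − k²D must acquire an unstable eigenvalue for some wavenumber k. The sketch below uses an illustrative two-species Jacobian and diffusion matrix, not the corrected Oregonator parameters from the paper.

```python
import numpy as np

# Numerical Turing-instability check for a generic two-species
# reaction-diffusion system (illustrative numbers, not the Oregonator).
J = np.array([[1.0, -2.0],
              [3.0, -4.0]])      # linearization at the steady state
D = np.diag([1.0, 20.0])         # strongly unequal diffusion coefficients

# Without diffusion the steady state must be stable...
assert max(np.linalg.eigvals(J).real) < 0

# ...but for some wavenumber k, J - k^2 D has an unstable eigenvalue.
k2 = np.linspace(0.0, 2.0, 400)
growth = [max(np.linalg.eigvals(J - q * D).real) for q in k2]
unstable = max(growth) > 0
print(unstable, round(k2[int(np.argmax(growth))], 2))
```

Scanning this growth curve over the model parameters is what delineates the Turing region; the paper's CLF approach certifies the diffusion-free stability half of the condition without eigenvalue computations.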
Procedia PDF Downloads 358