Search results for: data security
23683 Impact of Job Burnout on Job Satisfaction and Job Performance of Front Line Employees in Bank: Moderating Role of Hope and Self-Efficacy
Authors: Huma Khan, Faiza Akhtar
Abstract:
The present study investigates the effects of burnout on job performance and job satisfaction, with the moderating roles of hope and self-efficacy. Findings from 310 frontline employees of Pakistani commercial banks (Lahore, Karachi, and Islamabad) revealed that burnout has significant negative effects on job performance and job satisfaction. A simple random sampling technique was used to collect the data, and inferential statistics were applied to analyze it. However, the results disclosed no moderating effect of hope on the relationship of burnout with job performance or job satisfaction. Moreover, the data significantly supported the moderating effect of self-efficacy. The study further sheds light on the development of psychological capital. The importance and implications of the current findings are discussed.
Keywords: burnout, hope, job performance, job satisfaction, psychological capital, self-efficacy
Procedia PDF Downloads 141
23682 Four Phase Methodology for Developing Secure Software
Authors: Carlos Gonzalez-Flores, Ernesto Liñan-García
Abstract:
This paper presents a simple and robust approach for developing secure software. The four-phase methodology consists of developing the non-secure software in phase one and, in the next three phases, one phase for each of the secure development types (i.e., self-protected software, secure code transformation, and the secure shield). Our methodology first requires determining and understanding the level of security needed for the software. The methodology proposes the use of several teams to accomplish this task: one software engineering development team, a compiler team, a specification and requirements testing team, and, for each of the secure software development types, three teams of secure software developers, three teams of code breakers, and three teams of intrusion analysts. These teams interact with each other and make decisions to produce secure software code protected against a required level of intruder.
Keywords: secure software, four-phase methodology, software engineering, code breakers, intrusion analysis
Procedia PDF Downloads 399
23681 Obstacle Classification Method Based on 2D LIDAR Database
Authors: Moohyun Lee, Soojung Hur, Yongwan Park
Abstract:
This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type, by establishing a database for classifying obstacles based on LIDAR. The existing LIDAR system has advantages in recognizing obstructions for an autonomous vehicle in terms of accuracy and short recognition time. However, it has been difficult to determine the type of obstacle, so accurate path planning based on obstacle type was not possible. To overcome this problem, a method of classifying obstacle type based on existing LIDAR using the width of the obstacle material was proposed; however, width measurement alone was not sufficient to improve accuracy. In this research, the width data are used for a first classification; a database of LIDAR intensity data for the four major obstacle materials found on roads is created; the LIDAR intensity data of actual obstacle materials are compared against it; and the obstacle type is determined by finding the entry with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data are of lower quality than 3D LIDAR, it is possible to classify obstacle materials using 2D LIDAR.
Keywords: obstacle, classification, database, LIDAR, segmentation, intensity
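The final matching step described above can be sketched as follows; this is an illustrative sketch, not the authors' implementation, and the material names and reference intensity profiles are hypothetical. The measured intensity profile is compared against per-material reference profiles and the closest match is returned.

```python
# Illustrative sketch: determine obstacle type by comparing a measured
# LIDAR intensity profile against per-material reference profiles and
# picking the closest match (profile values are hypothetical).
import math

def profile_distance(a, b):
    # Euclidean distance between two intensity profiles
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_material(measured, database):
    """Return the material whose reference intensity profile is closest."""
    return min(database, key=lambda m: profile_distance(measured, database[m]))

# Hypothetical reference intensity profiles for four road-obstacle materials
database = {
    "metal":    [0.90, 0.80, 0.85, 0.90],
    "concrete": [0.50, 0.55, 0.50, 0.45],
    "wood":     [0.30, 0.35, 0.30, 0.25],
    "plastic":  [0.60, 0.40, 0.70, 0.50],
}
measured = [0.52, 0.50, 0.48, 0.47]
print(classify_material(measured, database))  # → concrete
```

In practice the database would hold many intensity samples per material, and the similarity measure would be tuned to the sensor's intensity response.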
Procedia PDF Downloads 349
23680 Body Farming in India and Asia
Authors: Yogesh Kumar, Adarsh Kumar
Abstract:
A body farm is a research facility where research is done on forensic investigation and medico-legal disciplines such as forensic entomology, forensic pathology, forensic anthropology, forensic archaeology, and related areas of forensic veterinary science. All the research is done to collect data on the rate of decomposition (animal and human) and on forensically important insects to assist in crime detection. The data collected are used by forensic pathologists, forensic experts, and other experts for the investigation of crime cases and for further research. The research work covers the different conditions of a dead body (fresh, bloated, decayed, dry, and skeletonized) and data on local insects, which depend on the climatic conditions of the local areas of each country. Therefore, it is the need of the time to collect appropriate data under managed conditions with a proper set-up in every country. Hence, it is the duty of the scientific community of every country to establish or propose such facilities for justice and social management. Body farms are also used for the training of police, military, investigative dogs, and other agencies. At present, only four countries, viz. the U.S., Australia, Canada, and the Netherlands, have body farms and related facilities in an organised manner; there is no body farm in Asia. In India, we have been trying to establish a body farm in the A&N Islands, which are near Singapore, Malaysia, and some other Asian countries. In view of the above, it becomes imperative to discuss the matter with Asian countries so as to collect decomposition data in a proper manner by establishing a body farm. We can also share data, knowledge, and expertise, collaborate with one another to make such facilities better, and build good scientific relations to promote science and explore ways of investigation at the world level.
Keywords: body farm, rate of decomposition, forensically important flies, time since death
Procedia PDF Downloads 87
23679 Analysing the Perception of Climate Hazards on Biodiversity Conservation in Mining Landscapes within Southwestern Ghana
Authors: Salamatu Shaibu, Jan Hernning Sommer
Abstract:
Integrating biodiversity conservation practices in mining landscapes ensures the continual provision of various ecosystem services to the dependent communities while serving as ecological insurance for corporate mining when purchasing reclamation security bonds. Climate hazards such as long dry seasons, erratic rainfall patterns, and extreme weather events contribute to biodiversity loss in addition to the impact of mining itself. Both corporate mining and mine-fringe communities perceive the effect of climate on biodiversity in the context of the benefits they accrue, which motivates their conservation practices. In this study, pragmatic approaches including semi-structured interviews, visual field observation, and literature review were used to collect data from corporate mining employees and households of fringe communities in the southwestern mining hub. The perceived changes in local climatic conditions and their consequences for environmental management practices that promote biodiversity conservation were examined. Using a thematic content analysis tool, the results show that best practices such as concurrent land rehabilitation, reclamation ponds, artificial wetlands, land clearance, and topsoil management are directly affected by prolonged dry seasons and erratic rainfall patterns. Excessive dust and noise generation directly affect both floral and faunal diversity, coupled with frequent fire outbreaks in rehabilitated lands and nearby forest reserves. Proposed adaptive measures include engaging national conservation authorities to promote reforestation projects around forest reserves, urging the national government to desist from issuing permits for mining concessions in forest reserves, engaging local communities through educational campaigns to control forest encroachment and burning, promoting community-based resource management to foster community ownership, and providing stricter environmental legislation to compel corporate, artisanal, and small-scale mining companies to promote biodiversity conservation.
Keywords: biodiversity conservation, climate hazards, corporate mining, mining landscapes
Procedia PDF Downloads 219
23678 The Impact of Inflation Rate and Interest Rate on Islamic and Conventional Banking in Afghanistan
Authors: Tareq Nikzad
Abstract:
Since the first bank was established in 1933, Afghanistan's banking sector has seen a number of changes but has not been able to grow to its full potential because of the civil war. This study investigates the implementation of dual banking in Afghanistan in relation to the effects of inflation and interest rates. The research draws on World Bank Data (WBD) over a period of nineteen years. Inflation, the general rise in the prices of goods and services over time, presents considerable difficulties for the banking sector. The objectives of this research are to analyze the effects of inflation and interest rates on conventional and Islamic banks in Afghanistan, identify potential differences between these two banking models, and provide insights for policymakers and practitioners. A mixed-methods approach is used to analyze quantitative data and to qualitatively examine the unique difficulties that banks encounter in Afghanistan's economic environment. The findings contribute to the understanding of the relationship between the interest rate, the inflation rate, and the performance of both banking systems in Afghanistan. The paper concludes with recommendations for policymakers and banking institutions to enhance the stability and growth of the banking sector in Afghanistan. From an Islamic perspective, interest is described as "a prefixed rate for the use or borrowing of money". This prefixed rate, known in Islamic economics as "riba", has been described as something undesirable. Furthermore, by applying time series regression to the annual data from 2003 to 2021, this research examines the effect of the CPI inflation rate and the interest rate on banking in Afghanistan.
Keywords: inflation, Islamic banking, conventional banking, interest, Afghanistan, impact
Procedia PDF Downloads 72
23677 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria
Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur
Abstract:
The cultural park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetland areas that must be preserved and managed in both time and space. Managing such a large area is complex and needs large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), making the use of conventional and traditional methods quite difficult. Remote sensing, given its efficiency in environmental applications, has become an indispensable solution for this kind of study. Remotely sensed imaging data have proven very useful in the last decade in a range of interesting applications: they can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we attempt to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites, both of which carry multispectral imagers with 30 m resolution. We used images acquired over several areas of interest in the cultural park of the Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The results show that wetland areas can be accurately distinguished from other land use themes through careful exploitation of the spectral indices.
Keywords: multispectral data, EO-1, Landsat, wetlands, Ahaggar, Algeria
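A common example of the band-combination indices mentioned above is a normalized difference water index of the form (Green - NIR)/(Green + NIR); the sketch below, which is illustrative only (the abstract does not state which indices or thresholds were used), shows how such an index can be thresholded per pixel to flag likely wetland/water pixels.

```python
# Illustrative sketch: compute a normalized difference water index (NDWI)
# per pixel and threshold it to flag likely wetland/water pixels.
# Band values and the threshold are hypothetical reflectances.
def ndwi(green, nir):
    return (green - nir) / (green + nir) if (green + nir) else 0.0

def wetland_mask(green_band, nir_band, threshold=0.0):
    # Pixels with NDWI above the (illustrative) threshold are flagged as wet
    return [[ndwi(g, n) > threshold for g, n in zip(gr, nr)]
            for gr, nr in zip(green_band, nir_band)]

green = [[0.30, 0.10],
         [0.25, 0.08]]
nir   = [[0.10, 0.40],
         [0.12, 0.35]]
print(wetland_mask(green, nir))
# → [[True, False], [True, False]]
```

Water and saturated soils reflect strongly in the green band and absorb in the near-infrared, which is why this kind of index separates wetland pixels from dry land-use themes.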
Procedia PDF Downloads 377
23676 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics
Authors: Janne Engblom, Elias Oikarinen
Abstract:
A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed- and random-effects models, which form a wide range of linear models. A special class of panel data models is dynamic in nature. A complication with a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates, and several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond generalized method of moments (GMM) estimator, an extension of the Arellano-Bond estimator in which past values, and different transformations of past values, of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano-Bover/Blundell-Bond estimator augments Arellano-Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations, the original equation and the transformed one, and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano-Bover/Blundell-Bond estimation technique together with ordinary OLS. The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models.
The Arellano-Bover/Blundell-Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data on 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics differed considerably when different models and instruments were used. In particular, the use of different instrumental variables caused variation in the model estimates and in their statistical significance. This was especially clear when comparing OLS estimates with those of different dynamic panel data models. The estimates provided by the dynamic panel data models were more in line with the theory of housing price dynamics.
Keywords: dynamic model, fixed effects, panel data, price dynamics
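The endogeneity problem described above, and the idea of using lagged values as instruments, can be demonstrated with a small simulation. The sketch below is illustrative only: it uses the simple Anderson-Hsiao instrumental-variable estimator, a precursor of the Arellano-Bond/system GMM estimators discussed in the abstract, not the full system GMM, and the simulated data have nothing to do with the Finnish housing dataset.

```python
# Illustrative simulation: pooled OLS on a dynamic panel
# y_it = rho*y_{i,t-1} + alpha_i + eps_it is biased because the fixed
# effect alpha_i correlates with the lag; instrumenting the
# first-differenced equation with y_{i,t-2} (Anderson-Hsiao, a precursor
# of Arellano-Bond / system GMM) recovers rho.
import random

random.seed(42)
N, T, rho = 4000, 6, 0.5
panel = []
for i in range(N):
    alpha = random.gauss(0, 1)       # individual fixed effect
    y = alpha / (1 - rho)            # start each unit near its steady state
    series = []
    for t in range(T):
        y = rho * y + alpha + random.gauss(0, 1)
        series.append(y)
    panel.append(series)

# Pooled OLS of y_t on y_{t-1}: ignores fixed effects, biased upward
num = den = 0.0
for s in panel:
    for t in range(1, T):
        num += s[t] * s[t - 1]
        den += s[t - 1] ** 2
ols = num / den

# Anderson-Hsiao IV: regress dy_t on dy_{t-1}, instrumented by y_{t-2},
# which is uncorrelated with the differenced error d(eps_t)
num = den = 0.0
for s in panel:
    for t in range(2, T):
        z = s[t - 2]
        num += z * (s[t] - s[t - 1])
        den += z * (s[t - 1] - s[t - 2])
iv = num / den

print(round(ols, 2), round(iv, 2))  # OLS overstates rho; IV is close to 0.5
```

Arellano-Bond and system GMM extend this idea by using all available lags (and, for system GMM, lagged differences in the levels equation) as instruments, which is what improves efficiency.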
Procedia PDF Downloads 1508
23675 Blockchain-Based Assignment Management System
Authors: Amogh Katti, J. Sai Asritha, D. Nivedh, M. Kalyan Srinivas, B. Somnath Chakravarthi
Abstract:
Today's modern education system uses Learning Management System (LMS) portals for scoring and grading student performance and for maintaining student records, and teachers are instructed to accept assignments through online submissions of .pdf, .doc, .ppt, etc. files. There is a risk of data tampering in the traditional portals; we apply a blockchain model instead of this traditional model to avoid data tampering and also to provide a decentralized mechanism for overall fairness. Blockchain technology is a better, and also recommended, model because of the following features: a consensus mechanism, a decentralized system, cryptographic encryption, and smart contracts on the Ethereum blockchain. The proposed system ensures data integrity and tamper-proof assignment submission and grading, which will be helpful for both students and educators.
Keywords: education technology, learning management system, decentralized applications, blockchain
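The tamper-evidence property claimed above rests on hash chaining: each block stores the hash of the previous block, so altering any stored record breaks the chain. The minimal sketch below illustrates that mechanism only; it is not the proposed Ethereum-based system, and the record fields are hypothetical.

```python
# Illustrative sketch of the tamper-evidence idea behind blockchain-backed
# assignment records: each block stores the hash of the previous block,
# so altering any stored grade invalidates the chain.
import hashlib
import json

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(prev_hash, record):
    return {"prev_hash": prev_hash, "record": record}

def chain_is_valid(chain):
    # Each block must reference the hash of the block before it
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

chain = [make_block("0" * 64, {"student": "S1", "assignment": "A1", "grade": 85})]
chain.append(make_block(block_hash(chain[-1]), {"student": "S2", "assignment": "A1", "grade": 90}))

print(chain_is_valid(chain))        # → True
chain[0]["record"]["grade"] = 100   # tamper with a stored grade
print(chain_is_valid(chain))        # → False
```

On a real blockchain the consensus mechanism additionally prevents a single party from silently re-writing the whole chain, which a local hash chain alone cannot guarantee.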
Procedia PDF Downloads 84
23674 Trend Analysis of Africa’s Entrepreneurial Framework Conditions
Authors: Sheng-Hung Chen, Grace Mmametena Mahlangu, Hui-Cheng Wang
Abstract:
This study aims to explore the trends of the Entrepreneurial Framework Conditions (EFCs) in the five African regions. The Global Entrepreneurship Monitor (GEM) is the primary source of data. The data drawn were organized into a panel (2000-2021) and obtained from the National Expert Survey (NES) databases as harmonized by the GEM. The methodology used is descriptive, relying mainly on charts and tables, in line with the approach used by the GEM. The GEM draws its data from the NES, a survey administered to experts in each country; it collects entrepreneurship data specific to each country and provides information about entrepreneurial ecosystems and their impact on entrepreneurship. The secondary source is the literature review. This study focuses on the following GEM indicators, based on the GEM Report 2020/21: Financing for Entrepreneurs, Government Support and Policies, Taxes and Bureaucracy, Government Programs, Basic School Entrepreneurial Education and Training, Post-School Entrepreneurial Education and Training, R&D Transfer, Commercial and Professional Infrastructure, Internal Market Dynamics, Internal Market Openness, Physical and Service Infrastructure, and Cultural and Social Norms. The limitation of the study is the lack of updated data from some countries: countries have to fund their own regional studies, and African countries do not participate regularly due to a lack of resources.
Keywords: trend analysis, entrepreneurial framework conditions (EFCs), African region, government programs
Procedia PDF Downloads 71
23673 Access to Apprenticeships and the Impact of Individual and School Level Characteristics
Authors: Marianne Dæhlen
Abstract:
Periods of apprenticeship are characteristic of many vocational education and training (VET) systems. In many countries, becoming a skilled worker implies that the journey starts with an application for an apprenticeship at a company or another relevant training establishment. In Norway, where this study is conducted, VET students start their journey with two years of school-based training before applying for two years of apprenticeship. Previous research has shown that access to apprenticeships differs by family background (socio-economic, immigrant, etc.), gender, school grades, and region. The question we raise in this study is whether the status, reputation, or position of the vocational school contributes to VET students' access to apprenticeships. Data and methods: Register data containing information about schools' and VET students' characteristics will be analyzed in multilevel regression analyses. At the school level, the data will contain information on school size, the shares of immigrant and/or male/female students, and grade requirements for admission. At the VET-student level, the register contains information on, e.g., gender, school grades, educational program/trade, and whether an apprenticeship was obtained. The data set comprises about 3,000 students. Results: The register data are expected to be received in November 2024 and, consequently, no results are available at the time of this submission. The planned article is part of a larger research project funded by the Norwegian Research Council and will, according to the plan, start in December 2024.
Keywords: apprenticeships, VET students' characteristics, vocational schools, quantitative methods
Procedia PDF Downloads 9
23672 Data Acquisition System for Automotive Testing According to the European Directive 2004/104/EC
Authors: Herminio Martínez-García, Juan Gámiz, Yolanda Bolea, Antoni Grau
Abstract:
This article presents an interactive system for data acquisition in vehicle testing according to the test process defined in automotive directive 2004/104/EC. The project has been designed and developed by the authors for the Spanish company Applus-LGAI. The developed project will result in a new process involving the creation of the braking cycle test defined in the aforementioned automotive directive. It will also allow the analysis of new vehicle features that was previously not feasible, enabling greater interaction with the vehicle. Potential users of this system in the short term will be vehicle manufacturers; in the medium term, the system can be extended to the testing of other automotive components and to EMC tests.
Keywords: automotive process, data acquisition system, electromagnetic compatibility (EMC) testing, European Directive 2004/104/EC
Procedia PDF Downloads 339
23671 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient with small images and little data, it becomes difficult and error-prone when large databases of images must be processed, since the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. With these classifiers, after a period in which parts of borehole images corresponding to tension regions and breakout areas are manually labeled, the system will automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge data set configurations.
Keywords: image segmentation, oil well visualization, classifiers, data mining, visual computing
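The classify-from-manual-labels workflow described above can be sketched with a nearest-neighbour classifier; this is an illustrative sketch only, and the feature names (curvature, mean intensity) and values are hypothetical, not the paper's actual feature set.

```python
# Illustrative sketch: a k-nearest-neighbour classifier over hand-labelled
# feature vectors extracted from segmented borehole-image curves.
import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(features, labelled, k=3):
    """Majority vote among the k closest manually labelled examples."""
    nearest = sorted(labelled, key=lambda ex: distance(features, ex[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Hypothetical (curvature, mean intensity) pairs labelled by an analyst
labelled = [
    ((0.90, 0.20), "breakout"), ((0.80, 0.30), "breakout"), ((0.85, 0.25), "breakout"),
    ((0.20, 0.80), "tension"),  ((0.30, 0.70), "tension"),  ((0.25, 0.75), "tension"),
]
print(knn_predict((0.82, 0.28), labelled))  # → breakout
```

As more regions are manually labelled, the labelled set grows and the suggestions for new candidate regions become more accurate, which is the knowledge-database behaviour the abstract describes.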
Procedia PDF Downloads 303
23670 A Review of Spatial Analysis as a Geographic Information Management Tool
Authors: Chidiebere C. Agoha, Armstong C. Awuzie, Chukwuebuka N. Onwubuariri, Joy O. Njoku
Abstract:
Spatial analysis is a field of study that utilizes geographic or spatial information to understand and analyze patterns, relationships, and trends in data. It is characterized by the use of geographic or spatial information, which allows data to be analyzed in the context of their location and surroundings. It differs from non-spatial, or aspatial, techniques, which do not consider the geographic context and may not provide as complete an understanding of the data. Spatial analysis is applied in a variety of fields, including urban planning, environmental science, the geosciences, epidemiology, and marketing, to gain insights and make decisions about complex spatial problems. This review paper explores definitions of spatial analysis from various sources, including examples of its application and different analysis techniques such as buffer analysis, interpolation, kernel density analysis, and multi-distance spatial cluster analysis. It also contrasts spatial analysis with non-spatial analysis.
Keywords: aspatial technique, buffer analysis, epidemiology, interpolation
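Of the techniques listed above, buffer analysis is the simplest to sketch: select all features that fall within a fixed radius of a site of interest. The example below is illustrative (the site and point names are hypothetical) and uses the epidemiology framing mentioned in the abstract.

```python
# Illustrative sketch of a buffer analysis: select all points that fall
# within a fixed radius (the buffer) of a site of interest.
import math

def within_buffer(site, points, radius):
    sx, sy = site
    return [p for p in points if math.hypot(p[0] - sx, p[1] - sy) <= radius]

# Hypothetical clinic coordinates and an outbreak location (planar units)
clinics = [(1.0, 1.0), (2.0, 2.0), (5.0, 5.0), (1.5, 0.5)]
outbreak_site = (1.0, 1.0)
nearby = within_buffer(outbreak_site, clinics, radius=1.5)
print(len(nearby))  # → 3
```

Real GIS buffers operate on projected coordinates and on polygons as well as points, but the underlying selection-by-distance idea is the same.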
Procedia PDF Downloads 318
23669 Green Procedure for Energy and Emission Balancing of Alternative Scenario Improvements for Cogeneration System: A Case of Hardwood Lumber Manufacturing Process
Authors: Aldona Kluczek
Abstract:
Energy-efficient processes have become a pressing research field in manufacturing. The arguments for effective industrial energy efficiency interact with several factors: economic and environmental impact, and energy security. Improvements in energy efficiency are most often achieved by implementing more efficient technology or manufacturing processes. Current processes of electricity production represent the biggest consumers of energy and the greatest sources of emissions to the environment. The goal of this study is to improve the potential energy savings and reduce the greenhouse emissions related to improvement scenarios for the treatment of hardwood lumber produced by an industrial plant operating in the U.S., through the application of a green balancing procedure, in order to find the preferable efficient technology. The green procedure for energy is based on the analysis of energy efficiency data. Three alternative scenarios for the construction of the plant's cogeneration system (CHP) are considered: generation of fresh steam, purchase of a new boiler with an operating pressure of 300 pounds per square inch gauge (PSIG), and installation of a new boiler with a 600 PSIG operating pressure. In this paper, the application of bottom-up modelling of energy flow to devise a streamlined energy and emission flow analysis method for the technology of producing electricity is illustrated. It identifies the efficiency or technology of a given process to be reached through the effective use of energy, or energy management. The results show that the third scenario appears to be the preferable alternative, considering the environmental and economic concerns of treating hardwood lumber. The energy conservation options could save an estimated 6,215.78 MMBtu/yr, which represents 9.5% of the total annual energy usage.
The total annual potential cost savings from all recommendations is $143,523/yr, which represents 30.1% of the total annual energy costs. Estimates indicate that energy cost savings of up to 43% (US$143,337.85) are possible, representing 18.6% of the total annual energy costs.
Keywords: alternative scenario improvements, cogeneration system, energy and emission flow analysis, energy balancing, green procedure, hardwood lumber manufacturing process
Procedia PDF Downloads 208
23668 IoT Based Agriculture Monitoring Framework for Sustainable Rice Production
Authors: Armanul Hoque Shaon, Md Baizid Mahmud, Askander Nobi, Md. Raju Ahmed, Md. Jiabul Hoque
Abstract:
In the Internet of Things (IoT), devices are linked to the internet through a wireless network, allowing them to collect and transmit data without the need for a human operator. Agriculture relies heavily on wireless sensors, which are a vital component of the IoT. This kind of wireless sensor network monitors physical or environmental variables such as temperature, sound, vibration, pressure, or motion without relying on a central location or sink, and collaboratively passes its data across the network to be analyzed. As the primary source of plant nutrients, the soil is critical to the agricultural industry's continued growth, and we are developing an Internet of Things (IoT) solution around it. To organize the network, the sink node collects groundwater levels and sends them to the gateway, which centralizes the data. The sink node also gathers soil moisture data and transmits the mean to the gateway, which then forwards it to the website for dissemination. The web server is in charge of storing the soil moisture data and presenting it to the web application's users. Soil characteristics may thus be collected using the networked method that we developed to improve rice production. Paddy land is running out as the population of our nation grows, so the success of this project will depend on the appropriate use of the existing land base.
Keywords: IoT-based agriculture monitoring, intelligent irrigation, communicating network, rice production
Procedia PDF Downloads 154
23667 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm
Authors: Thanh Noi Phan, Martin Kappas, Jan Degener
Abstract:
The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications, due to its fine spatial resolution, multispectral coverage, and high temporal resolution. So far, only a handful of studies have used real S2 data for land cover classification, and in northern Vietnam especially, to the best of our knowledge, no studies have used S2 data for land cover mapping. The aim of this study is to provide a preliminary result of land cover classification using Sentinel-2 data with a rising state-of-the-art classifier, Random Forest. A case study area with heterogeneous land use/cover in the east of Hanoi, the capital of Vietnam, was chosen. All 10 spectral bands of 10 and 20 m pixel size of the S2 images were used, with the 10 m bands resampled to 20 m. Among several classification algorithms, the supervised Random Forest (RF) classifier was applied because it has been reported as one of the most accurate methods for satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the land cover classification results. A very high overall accuracy, above 90%, was achieved.
Keywords: classification algorithm, classification, land cover, random forest, Sentinel-2, Vietnam
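The importance of particular bands, such as the red-edge and SWIR findings above, typically shows up through band-ratio features fed to the classifier. The sketch below is illustrative only: it computes the classic NDVI from hypothetical red/NIR reflectances, not the paper's actual feature set or the Random Forest itself.

```python
# Illustrative sketch: a band-ratio feature (NDVI) computed per pixel
# before feeding a classifier such as Random Forest.
# Band values are hypothetical surface reflectances in [0, 1].
def ndvi(nir, red):
    return (nir - red) / (nir + red) if (nir + red) else 0.0

pixels = [
    {"red": 0.05, "nir": 0.45},  # dense vegetation: high NIR, low red
    {"red": 0.20, "nir": 0.22},  # bare soil: red and NIR similar
    {"red": 0.10, "nir": 0.05},  # water: NIR absorbed
]
for p in pixels:
    p["ndvi"] = round(ndvi(p["nir"], p["red"]), 2)

print([p["ndvi"] for p in pixels])  # → [0.8, 0.05, -0.33]
```

Analogous ratios built from Sentinel-2's red-edge (bands 5-7) and SWIR (bands 11-12) channels are what give those bands their discriminative power in land cover classification.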
Procedia PDF Downloads 388
23666 Constructing the Density of States from the Parallel Wang Landau Algorithm Overlapping Data
Authors: Arman S. Kussainov, Altynbek K. Beisekov
Abstract:
This work focuses on building an efficient, universal procedure to construct a single density of states from the multiple pieces of data provided by a parallel implementation of the Wang-Landau Monte Carlo based algorithm. The Ising and Potts models were used as examples of two-dimensional spin lattices for constructing their densities of states. The sampled energy space was distributed between the individual walkers with certain overlaps. This was done to incorporate the latest development of the algorithm, the density-of-states replica exchange technique. Several factors of immediate importance for the seamless stitching process have been considered. These include, but are not limited to, the speed and universality of the initial parallel algorithm implementation, as well as the data post-processing needed to produce the expected smooth density of states.
Keywords: density of states, Monte Carlo, parallel algorithm, Wang-Landau algorithm
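The core of the stitching step can be sketched as follows; this is a minimal illustration (not the paper's actual post-processing code), exploiting the fact that each walker determines log g(E) only up to an additive constant, so overlapping segments can be aligned by the mean offset over their shared energy window.

```python
# Illustrative sketch of stitching two overlapping log-density-of-states
# segments from parallel Wang-Landau walkers: shift the second segment by
# the mean offset over the shared energies, then merge.
def stitch(seg_a, seg_b):
    """seg_a, seg_b: dicts mapping energy level -> log g from two walkers."""
    overlap = sorted(set(seg_a) & set(seg_b))
    shift = sum(seg_a[e] - seg_b[e] for e in overlap) / len(overlap)
    merged = dict(seg_a)
    for e, lg in seg_b.items():
        merged.setdefault(e, lg + shift)   # keep seg_a's values in the overlap
    return merged

# Toy segments: seg_b has the same shape as seg_a, offset by +5
seg_a = {0: 1.0, 1: 2.0, 2: 3.0}
seg_b = {1: 7.0, 2: 8.0, 3: 9.0}
merged = stitch(seg_a, seg_b)
print(merged[3])  # → 4.0
```

A full procedure would repeat this pairwise over all walkers' windows and could weight the overlap by sampling quality, but the additive-constant alignment above is the essential idea.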
Procedia PDF Downloads 412
23665 Thick Data Analytics for Learning Cataract Severity: A Triplet Loss Siamese Neural Network Model
Authors: Jinan Fiaidhi, Sabah Mohammed
Abstract:
Diagnosing cataract severity is an important factor in deciding whether to undertake surgery. It is usually done by an ophthalmologist or by taking a variety of fundus photographs that must be examined by the ophthalmologist. This paper carries out an investigation using a Siamese neural net that can be trained with small anchor samples to score cataract severity. The model used in this paper is based on a triplet loss function that encodes the ophthalmologist's best experience in rating positive and negative anchors on a specific cataract scaling system. This approach, which captures the heuristics of the ophthalmologist, is generally called the thick data approach: a kind of machine learning that learns from only a few shots. Clinical relevance: The lens of the eye is mostly made up of water and proteins. A cataract occurs when these proteins in the eye lens start to clump together and block light, causing impaired vision. This research aims at employing thick data machine learning techniques to rate the severity of cataracts using a Siamese neural network.
Keywords: thick data analytics, Siamese neural network, triplet-loss model, few-shot learning
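The triplet loss mentioned above has a standard form: it pulls the anchor embedding toward the positive (same severity grade) and pushes it away from the negative by at least a margin. The sketch below computes that loss on toy embeddings; the embedding values are hypothetical, and the real model would compute them with the Siamese network.

```python
# Illustrative sketch of the triplet loss:
#   L = max(0, d(anchor, positive) - d(anchor, negative) + margin)
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

anchor   = [0.0, 0.0]
positive = [0.0, 1.0]   # embedding of an image with the same cataract grade
negative = [3.0, 4.0]   # embedding of an image with a different grade
print(triplet_loss(anchor, positive, negative))      # → 0.0 (already separated)
print(triplet_loss(anchor, positive, [0.0, 1.5]))    # positive loss: push needed
```

During training, the loss is zero once the negative is farther than the positive by the margin, which is why a few well-chosen expert-rated anchors can shape the embedding space: the few-shot behaviour the abstract describes.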
Procedia PDF Downloads 111
23664 Case Study Analysis for Driver's Company in the Transport Sector with the Help of Data Mining
Authors: Diana Katherine Gonzalez Galindo, David Rolando Suarez Mora
Abstract:
In this study, we used data mining as an alternative solution for evaluating customer comments in order to find a pattern that helps us determine behaviors that reduce the deactivation of partners of the LEVEL app. In one of the largest businesses created in recent times, the partners are being affected by an internal process that compensates the customer for a bad experience, even though the complaint could be false with respect to the driver; that is why we carried out an investigation to collect information in order to restructure this process. Many partners have been disassociated due to this internal process, and many of them dispute the comments given by the customer. The main methodology used in this case study is observation: we collected information in real time, which gave us the opportunity to see the most common issues and arrive at the most accurate solution. With this new process, aided by data mining, we can obtain a prediction based on the behavior of the customer and some basic collected data, such as age and gender; this could also help us improve other processes in the future. This investigation gives the partner more opportunities to keep his account active even if the customer writes a complaint through the app. The aim is to avoid driver attrition in the future by offering process improvements, while also seeking to establish a strategy that benefits both the app's managers and the associated drivers.
Keywords: agent, driver, deactivation, rider
Procedia PDF Downloads 280
23663 Image Compression Using Block Power Method for SVD Decomposition
Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed
Abstract:
In recent decades, the rapid growth in the development of and demand for multimedia products has contributed to a shortage of network bandwidth and device storage memory. Consequently, the theory of data compression has become more significant for reducing data redundancy in order to save on the transfer and storage of data. In this context, this paper addresses the problem of lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance than existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, giving better image compression in a short execution time.
Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless
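The core of any SVD power method is power iteration on AᵀA to extract the dominant singular triplet; keeping only the top few triplets is what yields the compressed image. A minimal pure-Python sketch of that core step, not the paper's optimized block implementation:

```python
import math

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def dominant_singular_triplet(A, iters=200):
    """Power iteration on A^T A converges to the right singular vector
    of the largest singular value; sigma and u follow from A v."""
    At = transpose(A)
    v = [1.0] * len(A[0])
    for _ in range(iters):
        w = matvec(At, matvec(A, v))
        v = [x / norm(w) for x in w]
    Av = matvec(A, v)
    sigma = norm(Av)
    u = [x / sigma for x in Av]
    return sigma, u, v

def rank1_compress(A):
    """Best rank-1 approximation sigma * u * v^T; storing (sigma, u, v)
    instead of the full image block is the compression."""
    sigma, u, v = dominant_singular_triplet(A)
    return [[sigma * ui * vj for vj in v] for ui in u]

# A rank-1 "image" block is reproduced exactly by its rank-1 approximation.
A = [[1.0, 2.0], [2.0, 4.0]]
B = rank1_compress(A)
```

Deflating (subtracting sigma·u·vᵀ and repeating) extends this to a rank-k approximation, trading fidelity for compression ratio.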
Procedia PDF Downloads 387
23662 Multichannel Analysis of the Surface Waves of Earth Materials in Some Parts of Lagos State, Nigeria
Authors: R. B. Adegbola, K. F. Oyedele, L. Adeoti
Abstract:
We present a method that utilizes Multichannel Analysis of Surface Waves (MASW), which was used to measure shear wave velocities with a view to establishing the probable causes of road failure, subsidence, and weakening of structures in some Local Government Areas of Lagos, Nigeria. MASW data were acquired using a 24-channel seismograph. The acquired data were processed and transformed into a two-dimensional (2-D) structure reflecting depth and surface wave velocity distribution within a depth of 0–15 m beneath the surface using the SURFSEIS software. The shear wave velocity data were compared with other geophysical/borehole data acquired along the same profile. The comparison and correlation illustrate the accuracy and consistency of the MASW-derived shear wave velocity profiles. Rigidity modulus and N-values were also generated. The study showed that the low/very low velocities reflect organic clay/peat materials, which are thus likely responsible for the failure, subsidence, and weakening of structures within the study areas.
Keywords: seismograph, road failure, rigidity modulus, N-value, subsidence
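The rigidity modulus mentioned above follows from the standard small-strain relation G = ρVs² linking MASW shear-wave velocity to soil stiffness. A minimal sketch with hypothetical layer values (the paper's site data are not reproduced):

```python
def rigidity_modulus(vs_m_per_s, density_kg_per_m3):
    """Small-strain shear (rigidity) modulus G = rho * Vs^2, in Pa."""
    return density_kg_per_m3 * vs_m_per_s ** 2

# Hypothetical soft organic clay/peat layer: Vs = 100 m/s, rho = 1500 kg/m3.
G = rigidity_modulus(100.0, 1500.0)   # 1.5e7 Pa = 15 MPa
```

A low Vs enters the relation squared, which is why the low-velocity zones identified by MASW translate into sharply reduced stiffness beneath the failed roads.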
Procedia PDF Downloads 363
23661 Nursing Students’ Opinions about Theoretical Lessons and Clinical Area: A Survey in a Nursing Department
Authors: Ergin Toros, Manar Aslan
Abstract:
This descriptive study was planned to learn the opinions of students in an undergraduate nursing program about their theoretical/practical lessons and departments. Education in undergraduate nursing programs is of great importance because it provides the knowledge and skills that prepare student nurses for the clinic. In order to provide quality nursing services in the future, the quality of nursing education should be measured and student nurses' opinions about their education should be collected. The research population consisted of students at a university with 1-4 years of theoretical and clinical education (N=550); the sample consisted of the 460 students who agreed to take part in the study, reaching 83.6% of the target population. Data were collected through a survey developed by the researchers, consisting of 48 questions on sociodemographic characteristics (9 questions), theoretical courses (9 questions), laboratory applications (7 questions), clinical education (14 questions), and services provided by the faculty (9 questions). It was determined that 83.3% of the nursing students found the nursing profession suitable for them, 53% had selected nursing for its easy job opportunities, and 48.9% stayed in a state dormitory. Regarding the theoretical courses, 84.6% of the students agreed with the statement 'The course schedule is prepared before the course and published on the university web page,' while 28.7% did not agree that 'Feedback is given to students about the assignments they prepare.' Only 41.5% of the students agreed that 'The time allocated to laboratory applications is sufficient.' Students also said that the physical conditions in the laboratory (41.5%) and the materials used (44.6%) are insufficient, and that 'The number of students in the group is not appropriate for laboratory applications' (45.2%).
71.3% of the students think that nurses in the clinics view students as a tool to reduce the workload, 40.7% reported that nurses in the clinical area did not help with the purposes of the course, and 39.6% said that nurses' communication with students is not good. 37.8% of the students stated that nurses did not provide orientation to students, and 37.2% think that nurses are not role models for them. 53.7% of the students stated that the incentives and support for the student exchange program were insufficient; 48% think that career planning services are not enough, 47.2% say the same of security services, and 45.4% of the time advisors spend with students. It was determined that nursing students are most disturbed by the approach of the nurses in the clinical area within the undergraduate education program. Clinical education, considered an integral part of nursing education, is important and affects student satisfaction.
Keywords: nursing education, student, clinical area, opinion
Procedia PDF Downloads 176
23660 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test-N-Values: Case Study of Algiers Area
Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras
Abstract:
Along with shear wave velocity, many soil parameters are associated with the standard penetration test (SPT) as a dynamic in situ experiment. SPT-N data and geophysical data do not often exist for the same area, so statistical analysis of the correlation between these parameters is an alternative method to estimate Vₛ conveniently, without additional investigations or data acquisition. Shear wave velocity is a basic engineering tool required to define the dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data such as standard penetration test (SPT) N-values or CPT (Cone Penetration Test) values to estimate shear wave velocity or dynamic soil parameters. The relation between Vₛ and SPT-N values for the Algiers area is predicted using the collected data and compared with previously suggested formulas for Vₛ determination by measuring the Root Mean Square Error (RMSE) of each model. The Algiers area is situated in a high seismicity zone (Zone III [RPA 2003: règlement parasismique algérien]), which makes this study important for the region. The principal aim of this paper is to compare the field measurements of the down-hole test with the empirical models, to show which of the proposed formulas is applicable for predicting shear wave velocity values.
Keywords: empirical models, RMSE, shear wave velocity, standard penetration test
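The RMSE-based model comparison can be sketched as follows. The down-hole "measurements" are hypothetical, and the power-law coefficients are illustrative stand-ins for published Vₛ–N correlations (most take the form Vₛ = a·Nᵇ), not the formulas evaluated in the paper:

```python
import math

def rmse(measured, predicted):
    """Root-mean-square error between down-hole Vs measurements and an
    empirical correlation's predictions; the lower, the better the model."""
    return math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted))
                     / len(measured))

# Hypothetical site data: SPT-N values and down-hole Vs at the same depths.
spt_n   = [10, 20, 30, 40]
vs_meas = [190.0, 260.0, 310.0, 350.0]          # m/s, illustrative only

# Two candidate power-law correlations Vs = a * N^b (coefficients assumed).
model_a = [97.0 * n ** 0.314 for n in spt_n]
model_b = [61.0 * n ** 0.5   for n in spt_n]

# The correlation with the smaller RMSE fits this site best.
best = min(("model_a", rmse(vs_meas, model_a)),
           ("model_b", rmse(vs_meas, model_b)), key=lambda t: t[1])
```

Repeating this over every previously suggested formula, against the Algiers down-hole profile, is exactly the ranking procedure the abstract describes.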
Procedia PDF Downloads 338
23659 Synthesis of Highly Valuable Fuel Fractions from Waste Date Seeds Oil
Authors: Farrukh Jamil, Ala'A H. Al-Muhtaseb, Lamya Al-Haj, Mohab A. Al-Hinai
Abstract:
Environmental problems and the security of energy supply have motivated attention to the development of alternatives to fossil-based fuels. Biomass has been recognized as a promising resource because it is abundantly available and, in principle, carbon dioxide neutral. The present study focuses on utilizing waste date seeds oil to synthesize high-value fuel formulations such as green diesel and jet fuel. The hydrodeoxygenation of date seeds oil proved highly efficient at the following operating conditions: temperature 300°C and pressure 10 bar, with continuous stirring at 500 rpm. Product characterization revealed the efficiency of hydrodeoxygenation through the formation of a large fraction of linear hydrocarbons (paraffins). Based on the types of components in the product oil, it was calculated that the maximum fraction lies within the range of green diesel, 72.78%, followed by jet fuel, 28.25%, using a Pt/C catalyst. It can be concluded that waste date seeds oil has the potential to be used for obtaining high-value products.
Keywords: date seeds, hydrodeoxygenation, paraffin, deoxygenation
Procedia PDF Downloads 264
23658 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites
Authors: Che-Wei Lee, Bay-Erl Lai
Abstract:
A new steganographic method using numeric data on public websites, with self-authentication capability, is proposed. The technique transforms a secret message into partial shares by Shamir's (k, n)-threshold secret sharing scheme with n = k + 1. The k + 1 generated partial shares are then embedded into selected numeric items on a website as if they were part of the website's numeric content. Afterward, a receiver links to the website, extracts every combination of k shares among the k + 1 from the stego numeric content, and computes k + 1 copies of the secret; value consistency among the computed k + 1 copies is taken as evidence that the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
Keywords: steganography, data hiding, secret authentication, secret sharing
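The share-and-verify scheme can be sketched as follows, assuming a toy prime field and k = 2; the website-embedding step is omitted, so the share values simply stand in for the stego numeric items:

```python
import random
from itertools import combinations

P = 2087  # a prime larger than any secret value (illustrative choice)

def make_shares(secret, k, n):
    """Shamir (k, n)-threshold sharing over GF(P): a random polynomial of
    degree k-1 with the secret as constant term, evaluated at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from k shares."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

def extract_and_authenticate(shares, k):
    """With n = k+1 embedded shares, every k-subset must reconstruct the
    same value; consistency of all k+1 copies self-authenticates the message."""
    copies = {reconstruct(list(sub)) for sub in combinations(shares, k)}
    return (copies.pop(), True) if len(copies) == 1 else (None, False)

k = 2
shares = make_shares(1234, k, k + 1)               # 3 shares hidden as numeric data
secret, ok = extract_and_authenticate(shares, k)   # (1234, True)

# Tampering with any embedded share breaks the consistency of the copies.
tampered = shares[:-1] + [(shares[-1][0], (shares[-1][1] + 1) % P)]
_, ok2 = extract_and_authenticate(tampered, k)     # False: not authentic
```

With k = 2 the three pairwise reconstructions agree only if all three shares lie on the original degree-1 polynomial, which is why a single altered numeric item is always detected.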
Procedia PDF Downloads 243
23657 A Novel Approach to Design of EDDR Architecture for High Speed Motion Estimation Testing Applications
Authors: T. Gangadhararao, K. Krishna Kishore
Abstract:
Motion estimation (ME) plays a critical role in a video coder, so testing such a module is of primary concern. Focusing on the testing of ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, to embed into ME for video coding testing applications. An error in the processing elements (PEs), the key components of an ME, can be detected and recovered effectively by using the proposed EDDR design. The proposed EDDR design for ME testing can detect errors and recover data with acceptable area overhead and timing penalty.
Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code
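The RQ coding idea, carrying a quotient/residue pair alongside each PE result so a fault can be detected and the datum rebuilt, can be sketched in software. The modulus and values are illustrative; the paper's hardware architecture is not reproduced here:

```python
M = 64  # modulus of the residue-and-quotient (RQ) code (illustrative)

def rq_encode(x):
    """Encode a PE result as (quotient, residue) check symbols."""
    return x // M, x % M

def rq_check(x, q, r):
    """A fault is detected when the carried check symbols no longer
    agree with the (possibly corrupted) PE result."""
    return x == q * M + r and x % M == r

def rq_recover(q, r):
    """Data recovery: rebuild the correct value from the check symbols."""
    return q * M + r

x = 1000              # hypothetical motion-estimation PE output
q, r = rq_encode(x)   # (15, 40), since 1000 = 15*64 + 40
corrupted = x ^ 0x4   # single bit flip in the PE output
# rq_check(x, q, r) is True; rq_check(corrupted, q, r) is False,
# and rq_recover(q, r) rebuilds the original 1000.
```

In hardware the quotient/residue pair travels through a parallel checking path, so detection and recovery cost only the modest area and timing overhead the abstract reports.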
Procedia PDF Downloads 431
23656 Identifying the Factors affecting on the Success of Energy Usage Saving in Municipality of Tehran
Authors: Rojin Bana Derakhshan, Abbas Toloie
Abstract:
For the purpose of optimizing and developing energy efficiency in buildings, the key elements of success in optimizing energy consumption must be recognized before any action is taken. Principal component analysis, one of the most valuable results of linear algebra, offers a simple and non-parametric method for this. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this essay, using data mining, the key elements influencing energy saving in buildings are determined. The approach is based on statistical data mining techniques using a feature selection method and fuzzy logic, converting the data from massive to compressed form and using it to enrich the selected features. In addition, the influence and share of each energy-consuming element in energy dissipation, in percent, are identified separately using the results of the energy audit, after measurement of all energy-consuming parameters and identification of the variables. Accordingly, energy saving solutions are divided into three categories: low-, medium-, and high-expense solutions.
Keywords: energy saving, key elements of success, optimization of energy consumption, data mining
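As a rough stand-in for the feature selection step, candidate energy variables can be ranked by the strength of their correlation with total consumption. The audit readings and variable names below are purely hypothetical, and this simple ranking is not the paper's fuzzy-logic method:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_features(features, target):
    """Order candidate variables by |correlation| with total consumption,
    a minimal stand-in for the feature selection step."""
    return sorted(features, key=lambda name: -abs(pearson(features[name], target)))

# Hypothetical monthly audit readings (illustrative only).
total = [100.0, 120.0, 140.0, 160.0]            # total energy consumption
features = {
    "hvac_hours": [10.0, 12.0, 14.0, 16.0],     # tracks total closely
    "occupancy":  [50.0, 49.0, 52.0, 50.0],     # barely related
}
ranking = rank_features(features, total)        # "hvac_hours" ranked first
```

The top-ranked variables are the "key elements of success": the ones whose percentage share of dissipation most deserves a low-, medium-, or high-expense intervention.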
Procedia PDF Downloads 468
23655 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach
Authors: Kwaku Damoah
Abstract:
This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug-active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis
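Of the three methods, the percentage change analysis is the simplest to sketch; the report counts below are hypothetical, not FAERS data:

```python
def pct_change_by_year(counts):
    """Year-over-year percentage change in adverse-event report counts,
    one of the three trend analyses (rank, frequency, percentage change)."""
    years = sorted(counts)
    return {y: 100.0 * (counts[y] - counts[prev]) / counts[prev]
            for prev, y in zip(years, years[1:])}

# Hypothetical report counts for one drug-active ingredient.
counts = {2020: 200, 2021: 250, 2022: 225}
changes = pct_change_by_year(counts)   # {2021: 25.0, 2022: -10.0}
```

Positive and negative entries in the result are exactly the "increasing and decreasing patterns" the abstract tracks across 2000-2023.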
Procedia PDF Downloads 48
23654 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures
Authors: Salima Kouici, Abdelkader Khelladi
Abstract:
In this work, we begin with a presentation of the Tθ family of common similarity measures for multidimensional binary data. Subsequently, some properties of these measures are established. Finally, the impact of using different inter-element measures on the results of agglomerative hierarchical clustering methods is studied.
Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering
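Assuming the Tθ family is the usual a/(a + θ(b+c)) parameterization of binary similarities (θ = 1 gives Jaccard, θ = 1/2 Dice, θ = 2 Sokal-Sneath; the paper's notation is not given here), a minimal single-linkage agglomeration over binary vectors looks like:

```python
def t_theta(u, v, theta=1.0):
    """a/(a + theta*(b+c)) on binary vectors, where a counts shared 1s
    and b+c counts mismatches; identical all-zero overlaps score 1.0."""
    a = sum(1 for x, y in zip(u, v) if x == 1 and y == 1)
    bc = sum(1 for x, y in zip(u, v) if x != y)
    return a / (a + theta * bc) if a + theta * bc else 1.0

def agglomerate(items, k, theta=1.0):
    """Single-linkage agglomerative clustering: repeatedly merge the pair
    of clusters containing the most similar elements until k remain."""
    clusters = [[i] for i in range(len(items))]
    while len(clusters) > k:
        c1, c2 = max(((a, b) for i, a in enumerate(clusters)
                      for b in clusters[i + 1:]),
                     key=lambda p: max(t_theta(items[i], items[j], theta)
                                       for i in p[0] for j in p[1]))
        clusters.remove(c1)
        clusters.remove(c2)
        clusters.append(c1 + c2)
    return clusters

# Four binary profiles forming two natural groups: {0,1} share the first
# attributes, {2,3} share the last.
data = [(1, 1, 0, 0), (1, 1, 1, 0), (0, 0, 1, 1), (0, 0, 0, 1)]
clusters = agglomerate(data, 2)   # -> clusters {0,1} and {2,3}
```

Varying theta reweights mismatches relative to shared attributes, which is precisely how the choice of inter-element measure can change the merge order and the final dendrogram.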
Procedia PDF Downloads 481