Search results for: clustered data
23733 Hybridized Approach for Distance Estimation Using K-Means Clustering
Authors: Ritu Vashistha, Jitender Kumar
Abstract:
Clustering with the K-means algorithm is a very common way to understand and analyze output data. Grouping similar objects together is the basis of clustering. There are K objects to be partitioned into C clusters, where C is always assumed to be smaller than K and each cluster has its own centroid; the major problem is identifying whether a cluster is correct for the given data. Formulating the clusters is not a one-off task for every tuple, row record, or entity; it is done by an iterative process in which each record, tuple, or entity is checked and examined for similarity and dissimilarity. This iterative process is therefore very lengthy and unable to give optimal output, either for the clusters or for the time taken to find them. To overcome this drawback, we propose a formula to find the clusters at run time, so this approach can give us optimal results. The proposed approach uses the Euclidean distance formula, as well as the Mahalanobis distance, to find the minimum distance between slots (technically, what we call clusters), and we have also applied the same approach to the Ant Colony Optimization (ACO) algorithm, which results in the production of two- and multi-dimensional matrices.
Keywords: ant colony optimization, data clustering, centroids, data mining, k-means
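For reference, a minimal sketch of the iterative baseline the abstract starts from: K-means with Euclidean distance, alternately assigning objects to the nearest centroid and recomputing centroids. Data and parameter values are illustrative, not from the paper.

```python
import numpy as np

def kmeans(X, n_clusters, n_iter=100, seed=0):
    """Plain K-means with Euclidean distance (illustrative baseline)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Euclidean distance of every object to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)           # assign to nearest centroid
        new_centroids = np.array([
            X[labels == k].mean(axis=0) if np.any(labels == k) else centroids[k]
            for k in range(n_clusters)])
        if np.allclose(new_centroids, centroids):
            break                           # converged
        centroids = new_centroids
    return labels, centroids

# Two illustrative well-separated groups of points
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
labels, centroids = kmeans(X, n_clusters=2)
print(centroids)
```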
Procedia PDF Downloads 128
23732 Digital Twin for University Campus: Workflow, Applications and Benefits
Authors: Frederico Fialho Teixeira, Islam Mashaly, Maryam Shafiei, Jurij Karlovsek
Abstract:
The ubiquity of data gathering and smart technologies, advancements in virtual technologies, and the development of the Internet of Things (IoT) have created urgent demands for the development of frameworks and efficient workflows for data collection, visualisation, and analysis. A digital twin, at scales ranging from the city down to the building, allows for bringing together data from different sources to generate fundamental and illuminating insights for the management of current facilities and the lifecycle of amenities, as well as for improving the performance of current and future designs. Over the past two decades, there has been growing interest in the topic of digital twins and their applications at city and building scales. Most such studies look at the urban environment through a homogeneous or generalist lens and lack specificity in the particular characteristics or identities which define an urban university campus. Bridging this knowledge gap, this paper offers a framework for developing a digital twin for a university campus that, with some modifications, could provide insights for any large-scale digital twin setting, such as towns and cities. It showcases how currently unused data could be purposefully combined, interpolated, and visualised to produce analysis-ready data (such as flood or energy simulations, or functional and occupancy maps), highlighting the potential applications of such a framework for campus planning and policymaking. The research integrates campus-level data layers into one spatial information repository and casts light on critical data clusters for the digital twin at the campus level. The paper also seeks to raise insightful and directive questions on how a campus digital twin can be extrapolated to a city-scale digital twin. The outcomes of the paper thus inform future projects for the development of large-scale digital twins, as well as urban and architectural researchers, on potential applications of digital twins in future design, management, and sustainable planning: to predict problems, calculate risks, decrease management costs, and improve performance.
Keywords: digital twin, smart campus, framework, data collection, point cloud
Procedia PDF Downloads 68
23731 Impact of Job Burnout on Job Satisfaction and Job Performance of Front Line Employees in Bank: Moderating Role of Hope and Self-Efficacy
Authors: Huma Khan, Faiza Akhtar
Abstract:
The present study investigates the effects of burnout on job performance and job satisfaction, with the moderating roles of hope and self-efficacy. A simple random sampling technique was used to collect the data, and inferential statistics were applied to analyze them. Findings from 310 frontline employees of Pakistani commercial banks (Lahore, Karachi, and Islamabad) disclosed that burnout has significant negative effects on job performance and job satisfaction. However, the results disclosed no moderating effect of hope on the relationship of burnout with job performance or job satisfaction. Moreover, the data significantly supported the moderating effect of self-efficacy. The study further sheds light on the development of psychological capital. The importance and implications of the current findings are discussed.
Keywords: burnout, hope, job performance, job satisfaction, psychological capital, self-efficacy
Procedia PDF Downloads 142
23730 Obstacle Classification Method Based on 2D LIDAR Database
Authors: Moohyun Lee, Soojung Hur, Yongwan Park
Abstract:
In this paper, a method is proposed that uses only a LIDAR system to classify an obstacle and determine its type by establishing a database for classifying obstacles based on LIDAR. The existing LIDAR system has an advantage, in recognizing an obstruction for an autonomous vehicle, in terms of accuracy and shorter recognition time. However, it was difficult to determine the type of obstacle, and therefore accurate path planning based on the obstacle type was not possible. In order to overcome this problem, a method of classifying obstacle type based on existing LIDAR using the width of the obstacle material was proposed. However, width measurement alone was not sufficient to improve accuracy. In this research, the width data were used for a first classification; a database of LIDAR intensity data for the four major obstacle materials found on the road was created; the LIDAR intensity data of actual obstacle materials are compared against it; and the obstacle type is determined by finding the entry with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data are of lower quality than 3D LIDAR data, it was possible to classify obstacle materials using 2D LIDAR.
Keywords: obstacle, classification, database, LIDAR, segmentation, intensity
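The two-stage decision described above can be sketched as follows, with a hypothetical materials database and invented thresholds: a coarse filter on measured width, then the material whose stored intensity profile lies closest to the measured one.

```python
import numpy as np

# Hypothetical database: mean LIDAR intensity profiles for four road
# obstacle materials (all values are invented placeholders).
material_db = {
    "metal":    np.array([0.82, 0.80, 0.78]),
    "concrete": np.array([0.55, 0.53, 0.50]),
    "plastic":  np.array([0.35, 0.33, 0.30]),
    "fabric":   np.array([0.15, 0.14, 0.12]),
}

def classify_obstacle(width_m, intensity_profile):
    # Stage 1: coarse filter by measured width (threshold illustrative)
    if width_m > 1.5:
        candidates = ["metal", "concrete"]      # e.g. vehicles, barriers
    else:
        candidates = list(material_db)
    # Stage 2: material whose stored profile is closest to the measurement
    # (lowest Euclidean distance = highest similarity value)
    return min(candidates,
               key=lambda m: np.linalg.norm(material_db[m] - intensity_profile))

print(classify_obstacle(0.6, np.array([0.36, 0.31, 0.29])))  # -> plastic
```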
Procedia PDF Downloads 349
23729 Body Farming in India and Asia
Authors: Yogesh Kumar, Adarsh Kumar
Abstract:
A body farm is a research facility where research is done on forensic investigation and medico-legal disciplines like forensic entomology, forensic pathology, forensic anthropology, forensic archaeology, and related areas of forensic veterinary science. All the research is done to collect data on the rate of decomposition (animal and human) and on forensically important insects to assist in crime detection. The data collected are used by forensic pathologists, forensic experts, and other experts for the investigation of crime cases and further research. The research work covers the different conditions of a dead body, such as fresh, bloated, decayed, dry, and skeletal, and data on local insects, which depend on the climatic conditions of the local areas of that country. Therefore, it is the need of the time to collect appropriate data under managed conditions with a proper set-up in every country. Hence, it is the duty of the scientific community of every country to establish or propose such facilities for justice and social management. Body farms are also used for the training of police, military, investigative dogs, and other agencies. At present, only four countries, viz. the U.S., Australia, Canada, and the Netherlands, have body farms and related facilities in an organised manner. There is currently no body farm in Asia. In India, we have been trying to establish a body farm in the A&N Islands, which are near Singapore, Malaysia, and some other Asian countries. In view of the above, it becomes imperative to discuss the matter with Asian countries so as to collect decomposition data in a proper manner by establishing a body farm. We can also share data, knowledge, and expertise, collaborate with one another to make such facilities better, and maintain good scientific relations to promote science and explore ways of investigation at the world level.
Keywords: body farm, rate of decomposition, forensically important flies, time since death
Procedia PDF Downloads 87
23728 The Impact of Inflation Rate and Interest Rate on Islamic and Conventional Banking in Afghanistan
Authors: Tareq Nikzad
Abstract:
Since the first bank was established in 1933, Afghanistan's banking sector has seen a number of variations but hasn't been able to grow to its full potential because of the civil war. The implementation of dual banking in Afghanistan is investigated in this study in relation to the effects of inflation and interest rates. This research took data from the World Bank Data (WBD) over a period of nineteen years. For the banking sector, inflation, which is the general rise in the prices of goods and services over time, presents considerable difficulties. The objectives of this research are to analyze the effects of inflation and interest rates on conventional and Islamic banks in Afghanistan, identify potential differences between these two banking models, and provide insights for policymakers and practitioners. A mixed-methods approach is used to analyze the quantitative data and to qualitatively examine the unique difficulties that banks encounter in Afghanistan's economic atmosphere. The findings contribute to the understanding of the relationship between the interest rate, the inflation rate, and the performance of both banking systems in Afghanistan. The paper concludes with recommendations for policymakers and banking institutions to enhance the stability and growth of the banking sector in Afghanistan. From an Islamic perspective, interest is described as "a prefixed rate for the use or borrowing of money". This "prefixed rate", known in Islamic economics as "riba", has been described as "something undesirable". Furthermore, by applying a time series regression technique to the annual data from 2003 to 2021, this research examines the effect of the CPI inflation rate and the interest rate on banking in Afghanistan.
Keywords: inflation, Islamic banking, conventional banking, interest, Afghanistan, impact
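The stated technique, a time-series regression of a bank performance measure on the CPI inflation rate and the interest rate over the 2003-2021 annual sample, can be sketched as follows; all variables and numbers are synthetic placeholders, not WBD data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic annual series standing in for the 2003-2021 sample
rng = np.random.default_rng(0)
years = np.arange(2003, 2022)
df = pd.DataFrame({
    "inflation": rng.normal(6, 3, len(years)),    # CPI inflation, %
    "interest":  rng.normal(12, 2, len(years)),   # lending rate, %
}, index=years)
# Invented performance measure (e.g. return on assets) with noise
df["bank_roa"] = (2 - 0.05 * df["inflation"] - 0.03 * df["interest"]
                  + rng.normal(0, 0.2, len(years)))

X = sm.add_constant(df[["inflation", "interest"]])
print(sm.OLS(df["bank_roa"], X).fit().summary())
```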
Procedia PDF Downloads 72
23727 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria
Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur
Abstract:
The cultural park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetland areas to be preserved and managed both in time and space. The management of such a large area, by its complexity, needs large amounts of data, which, for the most part, are spatially localized (DEM, satellite images, socio-economic information, etc.), and conventional and traditional methods are quite difficult to apply. Remote sensing, by its efficiency in environmental applications, has become an indispensable solution for this kind of study. Remotely sensed imaging data have been very useful in the last decade in very interesting applications. They can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we try to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites. Both carry high-resolution multispectral imagers with a 30 m resolution, imaging an interesting surface area. We have used images acquired over several areas of interest in the National Park of Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The obtained results show good accuracy in distinguishing wetland areas from other land use themes through fine exploitation of the spectral indices.
Keywords: multispectral data, EO-1, Landsat, wetlands, Ahaggar, Algeria
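One common wetness-related spectral index, and a threshold rule of the kind the abstract describes, can be sketched as follows; the band combination (McFeeters' NDWI) and the threshold value are assumptions standing in for the paper's indices.

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters): one common
    index for wet surfaces; the paper combines several such indices."""
    return (green - nir) / (green + nir + 1e-10)

# Illustrative 30 m reflectance rasters (random stand-ins for EO-1/Landsat bands)
green = np.random.rand(100, 100)
nir = np.random.rand(100, 100)

index = ndwi(green, nir)
wetland_mask = index > 0.2        # threshold is an assumption
fraction = wetland_mask.mean()    # wetland fraction of land use
print(f"Wetland fraction: {fraction:.1%}")
```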
Procedia PDF Downloads 377
23726 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics
Authors: Janne Engblom, Elias Oikarinen
Abstract:
A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond generalized method of moments (GMM) estimator, which is an extension of the Arellano-Bond model in which past values, and different transformations of past values, of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano-Bover/Blundell-Bond estimator augments Arellano-Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations, the original equation and the transformed one, and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano-Bover/Blundell-Bond estimation technique together with ordinary least squares (OLS). The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. The Arellano-Bover/Blundell-Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data on 14 Finnish cities over 1988-2012, differences in the short-run housing price dynamics estimates were considerable when different models and instrumenting were used. In particular, the use of different instrumental variables caused variation in the model estimates, together with their statistical significance. This was particularly clear when comparing the OLS estimates with those of the different dynamic panel data models. The estimates provided by the dynamic panel data models were more in line with the theory of housing price dynamics.
Keywords: dynamic model, fixed effects, panel data, price dynamics
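The core endogeneity problem the abstract describes can be shown in a few lines. The sketch below simulates an AR(1) panel with fixed effects and contrasts a naive estimate on first differences with a simple instrumental-variable estimate in the Anderson-Hsiao spirit (the precursor that Arellano-Bond and Blundell-Bond generalize with more lags and the system equations); it is an illustration of why instruments help, not the paper's estimator.

```python
import numpy as np

# In y_it = alpha*y_i,t-1 + u_i + e_it, first differencing removes the
# fixed effect u_i but leaves dy_i,t-1 correlated with de_it. Instrumenting
# dy_i,t-1 with the level y_i,t-2 restores consistency.
rng = np.random.default_rng(1)
N, T, alpha = 200, 10, 0.6
y = np.zeros((N, T))
u = rng.normal(size=N)                      # individual fixed effects
for t in range(1, T):
    y[:, t] = alpha * y[:, t - 1] + u + rng.normal(size=N)

dy  = (y[:, 3:] - y[:, 2:-1]).ravel()       # dy_it
dy1 = (y[:, 2:-1] - y[:, 1:-2]).ravel()     # dy_i,t-1 (endogenous regressor)
z   = y[:, 1:-2].ravel()                    # instrument: y_i,t-2

ols = (dy1 @ dy) / (dy1 @ dy1)              # naive, biased estimate
iv  = (z @ dy) / (z @ dy1)                  # simple IV estimate, consistent
print(f"true={alpha}, OLS-on-differences={ols:.3f}, IV={iv:.3f}")
```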
Procedia PDF Downloads 1508
23725 Blockchain-Based Assignment Management System
Authors: Amogh Katti, J. Sai Asritha, D. Nivedh, M. Kalyan Srinivas, B. Somnath Chakravarthi
Abstract:
Today's modern education system uses Learning Management System (LMS) portals for the scoring and grading of student performance and for maintaining student records, and teachers are instructed to accept assignments through online submissions of .pdf, .doc, .ppt, etc. files. There is a risk of data tampering in the traditional portals; we apply a blockchain model instead of this traditional model to avoid data tampering and also to provide a decentralized mechanism for overall fairness. Blockchain technology is a better and recommended model because of the following features: a consensus mechanism, a decentralized system, cryptographic encryption, smart contracts, and the Ethereum blockchain. The proposed system ensures data integrity and tamper-proof assignment submission and grading, which will be helpful for both students and educators.
Keywords: education technology, learning management system, decentralized applications, blockchain
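As a conceptual illustration of why a chain of hashes makes stored submissions tamper-evident, here is a minimal sketch (plain Python, not Ethereum or a smart contract; all field names are invented): each block commits to the previous block's hash, so altering a stored grade or file hash breaks verification.

```python
import hashlib
import json
import time

def make_block(prev_hash, payload):
    """Create a block whose hash commits to payload and previous hash."""
    block = {"prev": prev_hash, "time": time.time(), "payload": payload}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Re-derive every hash and link; any tampering breaks the chain."""
    for prev, cur in zip(chain, chain[1:]):
        body = {k: cur[k] for k in ("prev", "time", "payload")}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if cur["prev"] != prev["hash"] or cur["hash"] != recomputed:
            return False
    return True

genesis = make_block("0" * 64, {"event": "course created"})
chain = [genesis,
         make_block(genesis["hash"],
                    {"student": "S123", "file_sha256": "ab12...", "grade": "A"})]
print(verify(chain))                   # True
chain[1]["payload"]["grade"] = "A+"    # tampering with a stored grade...
print(verify(chain))                   # False: detected
```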
Procedia PDF Downloads 84
23724 Trend Analysis of Africa's Entrepreneurial Framework Conditions
Authors: Sheng-Hung Chen, Grace Mmametena Mahlangu, Hui-Cheng Wang
Abstract:
This study aims to explore the trends of the Entrepreneurial Framework Conditions (EFCs) in the five African regions. The Global Entrepreneurship Monitor (GEM) is the primary source of data. The data drawn were organized into a panel (2000-2021) obtained from the National Expert Survey (NES) databases as harmonized by the GEM. The methodology used is descriptive, relying mainly on charts and tables, in line with the approach used by the GEM. The GEM draws its data from the NES, a survey administered to experts in each country; it collects entrepreneurship data specific to each country and provides information about entrepreneurial ecosystems and their impact on entrepreneurship. The secondary source is the literature review. This study focuses on the following GEM indicators: Financing for Entrepreneurs, Government Support and Policies, Taxes and Bureaucracy, Government Programs, Basic School Entrepreneurial Education and Training, Post-School Entrepreneurial Education and Training, R&D Transfer, Commercial and Professional Infrastructure, Internal Market Dynamics, Internal Market Openness, Physical and Service Infrastructure, and Cultural and Social Norms, based on the GEM Report 2020/21. The limitation of the study is the lack of updated data from some countries: countries have to fund their own regional studies, and African countries do not regularly participate due to a lack of resources.
Keywords: trend analysis, entrepreneurial framework conditions (EFCs), African region, government programs
Procedia PDF Downloads 71
23723 Access to Apprenticeships and the Impact of Individual and School Level Characteristics
Authors: Marianne Dæhlen
Abstract:
Periods of apprenticeship are characteristic of many vocational education and training (VET) systems. In many countries, becoming a skilled worker implies that the journey starts with an application for an apprenticeship at a company or another relevant training establishment. In Norway, where this study is conducted, VET students start their journey with two years of school-based training before applying for a two-year apprenticeship. Previous research has shown that access to apprenticeships differs by family background (socio-economic, immigrant, etc.), gender, school grades, and region. The question we raise in this study is whether the status, reputation, or position of the vocational school contributes to VET students' access to apprenticeships. Data and methods: Register data containing information about schools' and VET students' characteristics will be analyzed in multilevel regression analyses. At the school level, the data will contain information on school size, the shares of immigrant and male/female students, and grade requirements for admission. At the VET-student level, the register contains information on, e.g., gender, school grades, educational program/trade, and whether an apprenticeship was obtained or not. The data set comprises about 3,000 students. Results: The register data are expected to be received in November 2024; consequently, no results are available at the time of this call. The planned article is part of a larger research project funded by the Norwegian Research Council and, according to plan, will start in December 2024.
Keywords: apprenticeships, VET-students' characteristics, vocational schools, quantitative methods
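Ahead of the data delivery, the planned multilevel structure can be sketched: students (level 1) nested in schools (level 2), with a random intercept per school. All simulated variables and effect sizes below are placeholders, and a linear probability model stands in for the multilevel logistic model a binary outcome would normally call for.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: 60 schools, 50 VET students each
rng = np.random.default_rng(0)
n_schools, n_per = 60, 50
school = np.repeat(np.arange(n_schools), n_per)
school_effect = rng.normal(0, 0.1, n_schools)[school]  # school "reputation"
grades = rng.normal(4.0, 0.8, n_schools * n_per)
female = rng.integers(0, 2, n_schools * n_per)
p = np.clip(0.3 + 0.08 * (grades - 4) + 0.05 * female + school_effect, 0, 1)
df = pd.DataFrame({"apprenticeship": rng.binomial(1, p),
                   "grades": grades, "female": female, "school": school})

# Random-intercept model: does the school level add variance beyond
# individual characteristics?
model = smf.mixedlm("apprenticeship ~ grades + female", df,
                    groups=df["school"]).fit()
print(model.summary())
```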
Procedia PDF Downloads 10
23722 Data Acquisition System for Automotive Testing According to the European Directive 2004/104/EC
Authors: Herminio Martínez-García, Juan Gámiz, Yolanda Bolea, Antoni Grau
Abstract:
This article presents an interactive system for data acquisition in vehicle testing according to the test process defined in automotive directive 2004/104/EC. The project has been designed and developed by the authors for the Spanish company Applus-LGAI. The developed project will result in a new process, which will involve the creation of the braking cycle test defined in the aforementioned automotive directive. It will also allow the analysis of new vehicle features that was not feasible before, allowing increased interaction with the vehicle. Potential users of this system in the short term will be vehicle manufacturers, and in the medium term the system can be extended to the testing of other automotive components and EMC tests.
Keywords: automotive process, data acquisition system, electromagnetic compatibility (EMC) testing, European Directive 2004/104/EC
Procedia PDF Downloads 340
23721 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and few data, it may become difficult and prone to errors when big databases of images must be treated, as the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow that, after some time of use and of manually pointing out parts of borehole images that correspond to tension regions and breakout areas, the system will automatically indicate and suggest new candidate regions, with higher accuracy. We suggest the use of different classifier methods in order to achieve different knowledge data set configurations.
Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computer
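The proposed loop can be sketched as follows, with invented curve features standing in for the real segmented-image descriptors: expert-labeled examples train a classifier that then suggests new candidate tension/breakout regions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Invented placeholder features extracted from segmented borehole curves;
# labels would come from an expert marking tension/breakout regions.
rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.normal(1.0, 0.3, n),   # e.g. mean curvature of segmented contour
    rng.normal(0.5, 0.2, n),   # e.g. contour elongation
    rng.normal(0.2, 0.1, n),   # e.g. local grey-level variance
])
y = (X[:, 0] + rng.normal(0, 0.2, n) > 1.1).astype(int)  # 1 = breakout-like

# One of several possible classifier methods; cross-validation scores how
# reliable the automatic suggestions would be
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```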
Procedia PDF Downloads 303
23720 A Review of Spatial Analysis as a Geographic Information Management Tool
Authors: Chidiebere C. Agoha, Armstong C. Awuzie, Chukwuebuka N. Onwubuariri, Joy O. Njoku
Abstract:
Spatial analysis is a field of study that utilizes geographic or spatial information to understand and analyze patterns, relationships, and trends in data. It is characterized by the use of geographic or spatial information, which allows for the analysis of data in the context of its location and surroundings. It differs from non-spatial, or aspatial, techniques, which do not consider the geographic context and may not provide as complete an understanding of the data. Spatial analysis is applied in a variety of fields, including urban planning, environmental science, geosciences, epidemiology, and marketing, to gain insights and make decisions about complex spatial problems. This review paper explores definitions of spatial analysis from various sources, including examples of its application and different analysis techniques such as buffer analysis, interpolation, and kernel density analysis (multi-distance spatial cluster analysis). It also contrasts spatial analysis with non-spatial analysis.
Keywords: aspatial technique, buffer analysis, epidemiology, interpolation
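As a small illustration of the buffer analysis mentioned above, the sketch below builds a 500 m buffer around a point feature and tests a spatial predicate against it; the coordinates and radius are invented, and real data layers and CRS handling (e.g., via geopandas) are omitted.

```python
from shapely.geometry import Point

# Projected coordinates in metres (illustrative values)
clinic = Point(352100.0, 4505300.0)
outbreak_case = Point(352480.0, 4505410.0)

zone = clinic.buffer(500)            # 500 m buffer polygon around the clinic
print(zone.contains(outbreak_case))  # True: the case falls inside the zone
```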
Procedia PDF Downloads 319
23719 IoT Based Agriculture Monitoring Framework for Sustainable Rice Production
Authors: Armanul Hoque Shaon, Md Baizid Mahmud, Askander Nobi, Md. Raju Ahmed, Md. Jiabul Hoque
Abstract:
In the Internet of Things (IoT), devices are linked to the internet through a wireless network, allowing them to collect and transmit data without the need for a human operator. Agriculture relies heavily on wireless sensors, which are a vital component of the IoT. This kind of wireless sensor network monitors physical or environmental variables like temperature, sound, vibration, pressure, or motion without relying on a central location or sink, and collaboratively passes its data across the network to be analyzed. As the primary source of plant nutrients, the soil is critical to the agricultural industry's continued growth. We are excited about the prospect of developing an IoT solution. To arrange the network, the sink node collects groundwater-level readings and sends them to the gateway, which centralizes the data and forwards it to the sensor nodes. The sink node gathers soil moisture data, transmits the mean to the gateway, and then forwards it to the website for dissemination. The web server is in charge of storing the soil moisture data and presenting it to the web application's users. Soil characteristics may be collected using the networked method that we developed to improve rice production. Paddy land is running out as the population of our nation grows; the success of this project will depend on the appropriate use of the existing land base.
Keywords: IoT based agriculture monitoring, intelligent irrigation, communicating network, rice production
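A minimal sketch of the sink-node aggregation step described above: collect soil-moisture readings from field nodes and forward their mean onward. Node names, values, and the packet layout are illustrative; the actual transport (e.g., MQTT or HTTP to the gateway) is left abstract.

```python
import statistics

# Illustrative readings from field sensor nodes (% soil moisture)
readings = {"node-01": 41.2, "node-02": 39.8, "node-03": 44.5}

def sink_aggregate(readings):
    """Aggregate node readings into the packet the gateway would receive."""
    mean = statistics.mean(readings.values())
    return {"metric": "soil_moisture_mean",
            "value": round(mean, 1),
            "n_nodes": len(readings)}  # sent on to gateway/web tier

print(sink_aggregate(readings))
```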
Procedia PDF Downloads 154
23718 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm
Authors: Thanh Noi Phan, Martin Kappas, Jan Degener
Abstract:
The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications due to its fine spatial resolution and multispectral as well as high temporal resolution. So far, there are only a handful of studies using real S2 data for land cover classification. Especially in northern Vietnam, to the best of our knowledge, no studies exist that use S2 data for land cover mapping. The aim of this study is to provide a preliminary result of land cover classification using Sentinel-2 data with a rising state-of-the-art classifier, Random Forest. A case study with heterogeneous land use/cover in the east of Hanoi Capital, Vietnam, was chosen. All 10 spectral bands of 10 and 20 m pixel size of the S2 images were used, with the 10 m bands resampled to 20 m. Among several classification algorithms, the supervised Random Forest classifier (RF) was applied because it has been reported as one of the most accurate methods of satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the land cover classification results. A very high overall accuracy, above 90%, was achieved.
Keywords: classification algorithm, classification, land cover, random forest, Sentinel-2, Vietnam
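A sketch of the classification step under stated assumptions: pixels as rows, the ten resampled S2 band values as columns, and a Random Forest fitted and scored. Random arrays stand in for the real image stack and training labels, so the printed accuracy demonstrates only the workflow, not a result.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Random stand-ins for a labelled Sentinel-2 pixel sample
rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 5000, 10, 6
X = rng.random((n_pixels, n_bands))          # 10 band reflectances per pixel
y = rng.integers(0, n_classes, n_pixels)     # land cover class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0, n_jobs=-1)
rf.fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
# Feature importances hint at which bands (e.g. red-edge, SWIR) drive the map
print("band importances:", rf.feature_importances_.round(3))
```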
Procedia PDF Downloads 388
23717 Constructing the Density of States from the Parallel Wang Landau Algorithm Overlapping Data
Authors: Arman S. Kussainov, Altynbek K. Beisekov
Abstract:
This work focuses on building an efficient, universal procedure to construct a single density of states from the multiple pieces of data provided by a parallel implementation of the Wang-Landau Monte Carlo based algorithm. The Ising and Potts models were used as examples of two-dimensional spin lattices for which the densities of states were constructed. The sampled energy space was distributed between the individual walkers with certain overlaps. This was done to include the latest development of the algorithm, the density-of-states replica exchange technique. Several factors of immediate importance for the seamless stitching process have been considered. These include, but are not limited to, the speed and universality of the initial parallel algorithm implementation, as well as the data post-processing needed to produce the expected smooth density of states.
Keywords: density of states, Monte Carlo, parallel algorithm, Wang Landau algorithm
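The stitching itself can be illustrated compactly: Wang-Landau returns ln g(E) only up to an additive constant per walker, so overlapping pieces are aligned by the mean offset over their shared energies before being joined. This is a minimal sketch of one reasonable stitching rule, not necessarily the authors' exact procedure.

```python
import numpy as np

def stitch(pieces):
    """Join per-walker (energies, ln_g) pieces into one ln g(E) curve.

    Each new piece is shifted by the mean offset over the energies it
    shares with the curve built so far, then merged.
    """
    E, lng = (np.asarray(a, dtype=float) for a in pieces[0])
    for E2, lng2 in pieces[1:]:
        E2, lng2 = np.asarray(E2, dtype=float), np.asarray(lng2, dtype=float)
        common = np.intersect1d(E, E2)
        shift = np.mean(lng[np.isin(E, common)] - lng2[np.isin(E2, common)])
        keep = ~np.isin(E2, E)               # append only new energies
        E = np.concatenate([E, E2[keep]])
        lng = np.concatenate([lng, lng2[keep] + shift])
        order = np.argsort(E)
        E, lng = E[order], lng[order]
    return E, lng

# Two synthetic overlapping windows of a parabolic ln g(E)
true = lambda E: -0.01 * (E - 50.0) ** 2
E1, E2 = np.arange(0, 60), np.arange(50, 100)
E, lng = stitch([(E1, true(E1) + 3.0), (E2, true(E2) - 7.0)])
print(np.allclose(lng - lng[0], true(E) - true(E[0])))  # True
```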
Procedia PDF Downloads 412
23716 Thick Data Analytics for Learning Cataract Severity: A Triplet Loss Siamese Neural Network Model
Authors: Jinan Fiaidhi, Sabah Mohammed
Abstract:
Diagnosing cataract severity is an important factor in deciding whether to undertake surgery. It is usually conducted by an ophthalmologist or by taking a variety of fundus photographs that need to be examined by the ophthalmologist. This paper carries out an investigation using a Siamese neural net that can be trained with small anchor samples to score cataract severity. The model used in this paper is based on a triplet loss function that takes the ophthalmologist's best experience in rating positive and negative anchors on a specific cataract scaling system. This approach, which takes in the heuristics of the ophthalmologist, is generally called the thick data approach, a kind of machine learning that learns from a few shots. Clinical relevance: The lens of the eye is mostly made up of water and proteins. A cataract occurs when these proteins in the eye lens start to clump together and block light, causing impaired vision. This research aims at employing thick data machine learning techniques to rate the severity of the cataract using a Siamese neural network.
Keywords: thick data analytics, Siamese neural network, triplet-loss model, few shot learning
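A minimal sketch of the training signal, assuming a generic convolutional embedding (architecture, tensor sizes, and batch contents are placeholders): the triplet loss pulls an image toward an anchor of the same ophthalmologist-assigned grade and pushes it away from a differently graded one.

```python
import torch
import torch.nn as nn

# Placeholder embedding network mapping images to 64-d vectors
embed = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 64),
)
triplet = nn.TripletMarginLoss(margin=1.0)

anchor   = torch.randn(8, 3, 128, 128)  # expert-rated reference images
positive = torch.randn(8, 3, 128, 128)  # same severity grade as anchor
negative = torch.randn(8, 3, 128, 128)  # different severity grade

loss = triplet(embed(anchor), embed(positive), embed(negative))
loss.backward()   # one optimisation step would follow in training
print(float(loss))
```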
Procedia PDF Downloads 111
23715 Case Study Analysis for Driver's Company in the Transport Sector with the Help of Data Mining
Authors: Diana Katherine Gonzalez Galindo, David Rolando Suarez Mora
Abstract:
In this study, we used data mining as a new alternative solution to evaluate customer comments in order to find patterns that help us determine the behaviors that lead to the deactivation of partners of the LEVEL app. In one of the largest businesses created in recent times, the partners are being affected by an internal process that compensates the customer for a bad experience, even though the comments may be false accusations toward the driver; that is why we carried out an investigation, collecting information to restructure this process. Many partners have been disassociated due to this internal process, and many of them dispute the comments given by the customer. The main methodology used in this case study is observation: we collected information in real time, which gave us the opportunity to see the most common issues and arrive at the most accurate solution. With this new process, helped by data mining, we could obtain a prediction based on customer behavior and some basic data collected, such as age, gender, and others; this could also help us improve other processes in the future. This investigation gives the partner more opportunities to keep his account active even if the customer writes a complaint through the app. The aim is to avoid a loss of drivers in the future by offering process improvements, while at the same time establishing a strategy that benefits both the app's managers and the associated drivers.
Keywords: agent, driver, deactivation, rider
Procedia PDF Downloads 281
23714 Image Compression Using Block Power Method for SVD Decomposition
Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed
Abstract:
In recent decades, the large and fast growth in the development of and demand for multimedia products has contributed to a shortage of device bandwidth and network storage memory. Consequently, the theory of data compression has become more significant for reducing data redundancy in order to save on the transfer and storage of data. In this context, this paper addresses the problem of the lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance compared with existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, which gives, in a short execution time, better image compression.
Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless
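The power-method core of such an approach can be sketched briefly: repeatedly applying AᵀA to a random vector converges to the leading right singular vector, and deflation yields further triplets for a rank-k approximation. This shows the principle only; the paper's block variant iterates on blocks of vectors and differs in detail.

```python
import numpy as np

def top_singular_triplet(A, n_iter=200, seed=0):
    """Power method on A^T A: returns the leading (sigma, u, v)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[1])
    for _ in range(n_iter):
        v = A.T @ (A @ v)
        v /= np.linalg.norm(v)
    sigma = np.linalg.norm(A @ v)
    u = (A @ v) / sigma
    return sigma, u, v

def compress(img, rank):
    """Rank-'rank' approximation via repeated power method + deflation."""
    A = img.astype(float).copy()
    approx = np.zeros_like(A)
    for _ in range(rank):
        s, u, v = top_singular_triplet(A)
        approx += s * np.outer(u, v)
        A -= s * np.outer(u, v)   # deflate, then find the next triplet
    return approx

img = np.random.rand(64, 64)      # stand-in for a grayscale image
for k in (5, 20):
    err = np.linalg.norm(img - compress(img, k)) / np.linalg.norm(img)
    print(f"rank {k}: relative error {err:.3f}")
```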
Procedia PDF Downloads 387
23713 Real-Time Pedestrian Detection Method Based on Improved YOLOv3
Authors: Jingting Luo, Yong Wang, Ying Wang
Abstract:
Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty of this task is to locate and detect pedestrians of different scales in complex scenes accurately. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added to extract the features. Then, six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid used to perform the pedestrian detection task. This method can better characterize pedestrians. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian data set training method is used: pedestrian data are extracted from the VOC data set and trained together with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves 93.57% mAP (mean average precision) and 46.52 f/s (number of frames per second). In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. This method reduces the missed detection rate and the false detection rate, improves the positioning accuracy, and meets the requirements of real-time detection of pedestrian objects.
Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3
Procedia PDF Downloads 142
23712 Multichannel Analysis of the Surface Waves of Earth Materials in Some Parts of Lagos State, Nigeria
Authors: R. B. Adegbola, K. F. Oyedele, L. Adeoti
Abstract:
We present a method that utilizes Multichannel Analysis of Surface Waves (MASW), which was used to measure shear wave velocities with a view to establishing the probable causes of road failure, subsidence, and the weakening of structures in some Local Government Areas of Lagos, Nigeria. MASW data were acquired using a 24-channel seismograph. The acquired data were processed and transformed into a two-dimensional (2-D) structure reflective of the depth and surface wave velocity distribution within a depth of 0-15 m beneath the surface, using the SURFSEIS software. The shear wave velocity data were compared with other geophysical/borehole data acquired along the same profile. The comparison and correlation illustrate the accuracy and consistency of the MASW-derived shear wave velocity profiles. The rigidity modulus and N-value were also generated. The study showed that the low/very low velocities are reflective of organic clay/peat materials and are thus likely responsible for the road failure, subsidence, and weakening of structures within the study areas.
Keywords: seismograph, road failure, rigidity modulus, N-value, subsidence
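The rigidity modulus mentioned above follows directly from the shear wave velocity via G = ρVₛ². A small worked example with illustrative values, not the survey's measurements:

```python
# Small-strain rigidity (shear) modulus from MASW shear wave velocity:
# G = rho * Vs^2. Values below are illustrative placeholders.
rho = 1600.0        # bulk density of soft organic clay/peat, kg/m^3
vs = 90.0           # shear wave velocity from an MASW profile, m/s

G = rho * vs ** 2   # Pa
print(f"G = {G / 1e6:.1f} MPa")  # ~13 MPa: very low stiffness, weak ground
```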
Procedia PDF Downloads 363
23711 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test-N-Values: Case Study of Algiers Area
Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras
Abstract:
Along with the shear wave, many soil parameters are associated with the standard penetration test (SPT) as a dynamic in-situ experiment. SPT-N data and geophysical data do not often both exist in the same area. Statistical analysis of the correlation between these parameters is an alternate method to estimate Vₛ conveniently and without additional investigations or data acquisition. Shear wave velocity is a basic engineering tool required to define the dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data, like standard penetration test (SPT) N values, CPT (cone penetration test) values, etc., to estimate shear wave velocity or dynamic soil parameters. The relation between Vₛ and SPT-N values for the Algiers area is predicted using the collected data, and it is also compared with previously suggested formulas for Vₛ determination by measuring the root mean square error (RMSE) of each model. The Algiers area is situated in a high-seismicity zone (Zone III [RPA 2003: règlement parasismique algérien]); therefore, the study is important for this region. The principal aim of this paper is to compare the field measurements of the down-hole test and the empirical models, to show which of these proposed formulas are applicable for predicting and deducing shear wave velocity values.
Keywords: empirical models, RMSE, shear wave velocity, standard penetration test
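A sketch of the workflow under stated assumptions: fit the usual power-law form Vₛ = a·Nᵇ to paired down-hole/SPT data, then score any competing published formula by RMSE. The data pairs and the alternative model's coefficients below are synthetic placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic stand-ins for paired SPT-N / down-hole Vs measurements
rng = np.random.default_rng(0)
N = rng.uniform(5, 50, 40)
Vs = 80 * N ** 0.33 * rng.lognormal(0, 0.08, 40)   # "measured" velocities, m/s

power_law = lambda n, a, b: a * n ** b
(a, b), _ = curve_fit(power_law, N, Vs, p0=(100, 0.3))

rmse = lambda pred: np.sqrt(np.mean((Vs - pred) ** 2))
print(f"fitted: Vs = {a:.1f} * N^{b:.2f}, "
      f"RMSE = {rmse(power_law(N, a, b)):.1f}")
# Any previously published formula is scored the same way, e.g.
# (hypothetical coefficients):
print("alt model RMSE =", round(rmse(97.0 * N ** 0.314), 1))
```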
Procedia PDF Downloads 338
23710 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites
Authors: Che-Wei Lee, Bay-Erl Lai
Abstract:
A new steganographic method, via the use of numeric data on public websites, with a self-authentication capability is proposed. The proposed technique transforms a secret message into partial shares by Shamir's (k, n)-threshold secret sharing scheme with n = k + 1. The generated k + 1 partial shares are then embedded into selected numeric items on a website as if they were part of the website's numeric content. Afterward, a receiver links to the website and extracts every combination of k shares among the k + 1 from the stego numeric content to compute k + 1 copies of the secret, and the value consistency of the computed k + 1 copies is taken as evidence to determine whether the extracted message is authentic or not, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
Keywords: steganography, data hiding, secret authentication, secret sharing
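The share-and-check mechanism can be sketched end to end: split the secret with a degree-(k-1) polynomial into n = k + 1 shares, recover one copy of the secret from each k-subset, and accept only if all copies agree. This is a generic Shamir implementation over a prime field, not the paper's embedding procedure; the secret value and field size are illustrative.

```python
import random

P = 2 ** 127 - 1   # a Mersenne prime defining the field GF(P)

def make_shares(secret, k):
    """Shamir (k, n)-threshold shares with n = k + 1, as in the scheme."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, k + 2)]    # n = k + 1 shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

k = 3
shares = make_shares(secret=424242, k=k)   # these would hide in numeric items
# Receiver: recover one copy from every k-subset of the k + 1 shares
copies = [reconstruct(shares[:i] + shares[i + 1:]) for i in range(k + 1)]
print(copies, "authentic:", len(set(copies)) == 1)   # all equal -> authentic
```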
Procedia PDF Downloads 243
23709 A Novel Approach to Design of EDDR Architecture for High Speed Motion Estimation Testing Applications
Authors: T. Gangadhararao, K. Krishna Kishore
Abstract:
Motion estimation (ME) plays a critical role in a video coder; testing such a module is therefore of priority concern. While focusing on the testing of the ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, to embed into the ME for video coding testing applications. An error in the processing elements (PEs), i.e., the key components of an ME, can be detected and recovered effectively by using the proposed EDDR design. The proposed EDDR design for ME testing can detect errors and recover data with an acceptable area overhead and timing penalty.
Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code
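A much-simplified software illustration of the detection-plus-recovery idea behind an RQ code: carry the quotient and residue of a value alongside it, flag a PE output that disagrees with their recombination, and recover the original. The modulus and values are illustrative, and the actual design applies RQ coding within the ME's hardware datapath rather than to stored integers.

```python
# Residue-and-quotient (RQ) sketch: for modulus M, encode x as
# (q, r) with q = x // M and r = x % M, so x == q * M + r.
M = 2 ** 8

def rq_encode(x):
    return divmod(x, M)            # (quotient, residue)

def check_and_recover(pe_output, q, r):
    expected = q * M + r
    if pe_output != expected:      # error detected in the PE result
        return expected, True      # recovered value, error flag
    return pe_output, False

q, r = rq_encode(51234)            # protect an ME intermediate value
print(check_and_recover(51234, q, r))   # (51234, False): fault-free
print(check_and_recover(51290, q, r))   # (51234, True): detected + recovered
```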
Procedia PDF Downloads 432
23708 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Data Base
Authors: Omid Khodabakhshi, Amir Rozdel
Abstract:
Cloud computing security research poses a number of challenges because the data center comprises complex private information and always faces various risks of information disclosure through hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving security in virtual machines, but using software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, in this article, a structure is proposed for the implementation of highly isolated, security-sensitive code using secure computing hardware in virtual environments. It also allows remote code validation with inputs and outputs. We provide these security features even in situations where the BIOS, the operating system, and even the hypervisor are infected. To achieve these goals, we use the hardware support provided by the new Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the TCB to the security-sensitive code.
Keywords: code, cloud computing, security, virtual machines
Procedia PDF Downloads 191
23707 Identifying the Factors Affecting the Success of Energy Usage Saving in the Municipality of Tehran
Authors: Rojin Bana Derakhshan, Abbas Toloie
Abstract:
For the purpose of optimizing and developing energy efficiency in buildings, it is required to recognize the key elements of success in the optimization of energy consumption before performing any actions. Principal component analysis, one of the most valuable results of linear algebra, provides a simple, non-parametric method for extracting relevant information from otherwise confusing data sets. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through energy auditing. In this paper, using data mining, the key elements influencing energy saving in buildings are determined. The approach is based on statistical data mining techniques, using feature selection methods and fuzzy logic, converting the data from a massive to a compressed form so as to increase the value of the selected features. In addition, the percentage contribution of each energy-consuming element to energy dissipation is recognized as a separate norm, using the results obtained from the energy auditing and the measurement of all energy-consuming parameters and identified variables. Accordingly, energy-saving solutions are divided into three categories: low-, medium-, and high-expense solutions.
Keywords: energy saving, key elements of success, optimization of energy consumption, data mining
Procedia PDF Downloads 468
23706 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach
Authors: Kwaku Damoah
Abstract:
This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage-change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis
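Of the three methods, the percentage-change view is the easiest to sketch: yearly report counts per active ingredient and their year-over-year change. The counts below are toy numbers, not FAERS data.

```python
import pandas as pd

# Toy yearly AE report counts per active ingredient (not FAERS data)
counts = pd.DataFrame(
    {"drug_A": [120, 150, 180, 160], "drug_B": [80, 78, 90, 130]},
    index=[2020, 2021, 2022, 2023])

# Year-over-year percentage change in reporting
print(counts.pct_change().mul(100).round(1))
```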
Procedia PDF Downloads 48
23705 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures
Authors: Salima Kouici, Abdelkader Khelladi
Abstract:
In this work, we begin with a presentation of the Tθ family of usual similarity measures for multidimensional binary data. Subsequently, some properties of these measures are proposed. Finally, the impact of the use of different inter-element measures on the results of agglomerative hierarchical clustering methods is studied.
Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering
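Similarity measures of this kind are built from the 2x2 contingency counts of two binary vectors: a (both 1), b (1,0), c (0,1), d (both 0). As an illustration of the clustering pipeline, and not of the Tθ measures themselves, the sketch below drives an agglomerative clustering with the classical Jaccard coefficient; a Tθ measure would slot into the same place.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(12, 20))   # 12 items, 20 binary attributes

def jaccard_sim(u, v):
    # Contingency counts a, b, c of the two binary vectors
    a = np.sum((u == 1) & (v == 1))
    b = np.sum((u == 1) & (v == 0))
    c = np.sum((u == 0) & (v == 1))
    return a / (a + b + c) if (a + b + c) else 1.0

n = len(X)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = 1.0 - jaccard_sim(X[i], X[j])  # dissimilarity

Z = linkage(squareform(D), method="average")   # agglomerative (UPGMA)
print(fcluster(Z, t=3, criterion="maxclust"))  # labels for 3 clusters
```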
Procedia PDF Downloads 482
23704 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geological and Its Analog Studies
Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal
Abstract:
Advances in data capture for outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning utilizing LiDAR (light detection and ranging) provides a new method to build outcrop-based reservoir models, which provide a crucial piece of information for understanding heterogeneities in sandstone facies with high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, built by combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR, to develop an intermediate, small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiScan Pro (v1.8.0) was used for digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow to triangulate point clouds of different sets of sandstone facies with well-marked top and bottom boundaries, in conjunction with field sedimentology. This provides a highly accurate, qualitative sandstone facies connectivity model, which is a challenge to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model, which can serve as an analogue for subsurface reservoir studies.
Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model
Procedia PDF Downloads 226