Search results for: motion data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26774

23384 Coupling Static Multiple Light Scattering Technique With the Hansen Approach to Optimize Dispersibility and Stability of Particle Dispersions

Authors: Guillaume Lemahieu, Matthias Sentis, Giovanni Brambilla, Gérard Meunier

Abstract:

Static Multiple Light Scattering (SMLS) has been shown to be a straightforward technique for the characterization of colloidal dispersions without dilution, as multiply scattered light in backscattered and transmitted modes is directly related to the concentration and size of the scatterers present in the sample. Accordingly, the use of SMLS for stability measurement of various dispersion types has already been widely described in the literature. Indeed, starting from a homogeneous dispersion, the variation of backscattered or transmitted light can be attributed to destabilization phenomena such as migration (sedimentation, creaming) or particle size variation (flocculation, aggregation). To investigate the dispersibility of colloidal suspensions further, an experimental set-up for "at the line" SMLS experiments has been developed to understand the impact of formulation parameters on particle size and dispersibility. The SMLS experiment is performed at a high acquisition rate (up to 10 measurements per second), without dilution, and under direct agitation. Using such an experimental device, SMLS detection can be combined with the Hansen approach to optimize the dispersing and stabilizing properties of TiO₂ particles. The dispersibility and stability spheres generated turn out to be clearly separated, indicating that lower stability is not necessarily a consequence of poor dispersibility. Beyond this clarification, this combined SMLS-Hansen approach is a major step toward optimizing the dispersibility and stability of colloidal formulations by finding solvents offering the best compromise between dispersing and stabilizing properties. Such a study can be used to find better dispersion media and greener, cheaper solvents to optimize particle suspensions, to reduce the content of costly stabilizing additives, or to satisfy evolving product regulatory requirements in the various industrial fields that use suspensions (paints and inks, coatings, cosmetics, energy).
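
In the Hansen framework, each solvent and the particle surface are assigned three parameters (δD, δP, δH), and affinity is judged by the Hansen distance Ra relative to an interaction sphere of radius R0. A minimal, illustrative sketch follows; the particle parameters and sphere radius are placeholders, not data from this study:

```python
# Hedged sketch: Hansen distance between a particle's HSP sphere and candidate
# solvents. The particle (dD, dP, dH) values and radius R0 are illustrative
# placeholders, not measurements from the study.
import math

def hansen_distance(p, s):
    """Ra^2 = 4*(dD_p - dD_s)^2 + (dP_p - dP_s)^2 + (dH_p - dH_s)^2."""
    dD, dP, dH = (p[i] - s[i] for i in range(3))
    return math.sqrt(4 * dD**2 + dP**2 + dH**2)

particle = (17.0, 8.0, 9.0)     # hypothetical (dD, dP, dH) for TiO2, MPa^0.5
R0 = 6.0                        # hypothetical interaction sphere radius
solvents = {"ethanol": (15.8, 8.8, 19.4), "toluene": (18.0, 1.4, 2.0)}

for name, hsp in solvents.items():
    Ra = hansen_distance(particle, hsp)
    red = Ra / R0               # RED < 1: inside the sphere (good affinity)
    print(f"{name}: Ra = {Ra:.2f}, RED = {red:.2f}")
```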

Keywords: dispersibility, stability, Hansen parameters, particles, solvents

Procedia PDF Downloads 114
23383 Research on Hangzhou Commercial Center System Based on Point of Interest Data

Authors: Chen Wang, Qiuxiao Chen

Abstract:

With the advent of the information age and the era of big data, urban planning research is no longer satisfied with the analysis and application of traditional data. Given the limitations of traditional research on urban commercial center systems, big data provides new opportunities for urban research. Therefore, based on a quantitative big data evaluation method, the commercial center system of the main city of Hangzhou is analyzed and evaluated, and the scale and hierarchical structure characteristics of the urban commercial center system are studied. To make up for the shortcomings of existing POI extraction methods, this study proposes a POI extraction method based on adaptive adjustment of the search window, which can accurately and efficiently extract commercial-business POI data for the main city of Hangzhou. Through visualization and kernel density analysis of the extracted Point of Interest (POI) data, the current state of the commercial center system in the main city of Hangzhou is evaluated. This is then compared with the commercial center system structure in the 'Hangzhou City Master Plan (2001-2020)' to analyze the problems of the planned urban commercial center system and to provide corresponding suggestions and optimization strategies for the planning of the Hangzhou commercial center system. The following conclusions are drawn: the commercial center system in the main city of Hangzhou currently comprises one first-level main center, one second-level main center, three third-level sub-centers, and multiple community-level business centers. Generally speaking, the construction of the main centers in the commercial center system is basically up to standard, while there is still a large gap in the construction of the sub-centers and regional-level commercial centers, where further construction is needed. The study therefore proposes an optimized hierarchical functional system that organizes commercial centers in an orderly manner, strengthens the radiation of centers to drive surrounding areas, implements construction guidance for the centers, effectively promotes cluster development, and further improves the commercial center system structure of the main city of Hangzhou.
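
Kernel density analysis of this kind estimates a continuous intensity surface from discrete POI locations, whose peaks indicate candidate commercial centers. A minimal sketch with synthetic coordinates (not Hangzhou data):

```python
# Hedged sketch: kernel density estimation over commercial POI coordinates,
# the kind of analysis the study applies to extracted POI data. The points
# below are synthetic, not Hangzhou data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic POI locations (projected x/y in metres): two commercial clusters.
poi = np.vstack([rng.normal((0, 0), 300, (200, 2)),
                 rng.normal((2000, 1500), 500, (100, 2))])

kde = gaussian_kde(poi.T)                    # expects shape (n_dims, n_points)

# Evaluate density on a grid to reveal candidate commercial centers.
gx, gy = np.mgrid[-1000:3000:100j, -1000:3000:100j]
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print("peak density cell:", np.unravel_index(density.argmax(), density.shape))
```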

Keywords: business center system, business format, main city of Hangzhou, POI extraction method

Procedia PDF Downloads 141
23382 Stakeholder Analysis of Agricultural Drone Policy: A Case Study of the Agricultural Drone Ecosystem of Thailand

Authors: Thanomsin Chakreeves, Atichat Preittigun, Ajchara Phu-ang

Abstract:

This paper presents a stakeholder analysis of agricultural drone policies that meet the government's goal of building an agricultural drone ecosystem in Thailand. First, case studies from other countries are reviewed. The stakeholder analysis method and qualitative data from interviews are then presented, including data from the Institute of Innovation and Management, the Office of National Higher Education Science Research and Innovation Policy Council, agricultural entrepreneurs, and farmers. Study and interview data are then employed to describe the current ecosystem and to guide the implementation of agricultural drone policies suitable for the Thai ecosystem. Finally, policy recommendations are made for the Thai government to adopt in the future.

Keywords: drone public policy, drone ecosystem, policy development, agricultural drone

Procedia PDF Downloads 151
23381 Molecular Migration in Polyvinyl Acetate Matrix: Impact of Compatibility, Number of Migrants and Stress on Surface and Internal Microstructure

Authors: O. Squillace, R. L. Thompson

Abstract:

Migration of small molecules to, and across, the surface of polymer matrices is a little-studied problem with important industrial applications. Tackifiers in adhesives, flavors in foods, and binding agents in paints all present situations where the function of a product depends on the ability of small molecules to migrate through a polymer matrix to achieve desired properties such as softness or dispersion of fillers, and to deliver an effect that is felt (or tasted) at a surface. It has been shown that the chemical and molecular structure, surface free energies, phase behavior, close environment, and compatibility of the system influence the migrants' motion. When differences in behavior are observed, such as the occurrence or absence of segregation to the surface, it is of crucial importance to identify and better understand the driving forces involved in molecular migration. To this end, experiment is allied with theory in order to deliver a validated theoretical and computational toolkit to describe and predict these phenomena. The systems chosen for this study address the effect of polarity mismatch between the migrants and the polymer matrix, and the effect of a second migrant on the first. As a non-polar resin polymer, polyvinyl acetate (PVAc) is used as the material to which more or less polar migrants (sorbitol, carvone, octanoic acid (OA), triacetin) are added. Contact angle measurements show a surface excess for sorbitol (polar) mixed with PVAc, as the surface energy is lowered compared to that of pure PVAc. This effect is increased by the addition of carvone or triacetin (non-polar). Surface microstructures are also evidenced by atomic force microscopy (AFM). Ion beam analysis (Nuclear Reaction Analysis), supplemented by neutron reflectometry, can accurately characterize the self-organization of surfactants, oligomers, and aromatic molecules in polymer films in order to relate the macroscopic behavior to length scales that are amenable to simulation. The nuclear reaction analysis (NRA) data for 20% deuterated OA show evidence of a surface excess, which is enhanced after annealing. The addition of 10% triacetin as a second migrant results in the formation of an underlying layer enriched in triacetin below the surface excess of OA. The results show that molecules in polarity mismatch with the matrix tend to segregate to the surface, and that this is favored by the addition of a second migrant of the same polarity as the matrix. As studies have so far been restricted to model supported films under static conditions, we also wish to address the more challenging conditions of materials under controlled stress or strain. To achieve this, a simple rig and PDMS cell have been designed to stretch the material to a defined strain and to probe these mechanical effects by ion beam analysis and atomic force microscopy. This will be a significant step towards exploring the influence of extensional strain on surface segregation and flavor release in cross-linked rubbers.

Keywords: polymers, surface segregation, thin films, molecular migration

Procedia PDF Downloads 134
23380 Study and Analysis of Optical Intersatellite Links

Authors: Boudene Maamar, Xu Mai

Abstract:

Optical Intersatellite Links (OISLs) are wireless communication links that use optical signals to interconnect satellites. They are expected to be the next-generation wireless communication technology owing to inherent characteristics such as increased bandwidth, high data rates, data transmission security, immunity to interference, and an unregulated spectrum. Optical space links are the best choice over classical communication schemes due to their distinctive properties: high frequency, small antenna diameter, and low transmitted power, which are critical factors in defining a space communication system. This paper discusses the development of free-space technology and analyses the parameters and factors needed to establish a reliable intersatellite link that uses an optical signal to exchange data between satellites.
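
The core of such an analysis is the link budget, which relates transmitted power, aperture gains, and free-space path loss to received power. A minimal textbook sketch; every parameter value below is an illustrative assumption, not a figure from the paper:

```python
# Hedged sketch: a textbook free-space optical link budget of the kind the
# paper analyses. All parameter values are illustrative assumptions.
import math

Pt_dBm = 30.0            # transmit power: 1 W
lam = 1550e-9            # wavelength (m), common for OISLs
d = 45_000e3             # intersatellite distance: 45,000 km (GEO-GEO class)
D_tx = D_rx = 0.25       # telescope aperture diameters (m)
eta = 0.8                # combined optics efficiency (assumed)

# Ideal aperture gain G = (pi * D / lambda)^2, expressed in dB.
G_tx = 10 * math.log10((math.pi * D_tx / lam) ** 2)
G_rx = 10 * math.log10((math.pi * D_rx / lam) ** 2)

# Free-space path loss: 20 * log10(4 * pi * d / lambda) in dB.
fspl = 20 * math.log10(4 * math.pi * d / lam)

Pr_dBm = Pt_dBm + G_tx + G_rx + 10 * math.log10(eta) - fspl
print(f"received power ~ {Pr_dBm:.1f} dBm")
```

The high aperture gains at optical wavelengths are exactly why small telescopes and low transmitted power suffice, as the abstract notes.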

Keywords: optical intersatellite links, optical wireless communications, free space optical communications, next generation wireless communication

Procedia PDF Downloads 449
23379 Sunshine Hour as a Factor to Maintain the Circadian Rhythm of Heart Rate: Analysis of Ambulatory ECG and Weather Big Data

Authors: Emi Yuda, Yutaka Yoshida, Junichiro Hayano

Abstract:

A distinct circadian rhythm of activity, i.e., high activity during the day and deep rest at night, is a typical feature of a healthy lifestyle. Exposure to skylight is thought to be an important factor in increasing arousal level and maintaining a normal circadian rhythm. To examine whether sunshine hours influence the day-night contrast of activity, we analyzed the relationship between 24-hour heart rate (HR) and the weather data of the recording day. We analyzed data from 36,500 males and 49,854 females in the Allostatic State Mapping by Ambulatory ECG Repository (ALLSTAR) database in Japan. Median (IQR) sunshine duration was 5.3 (2.8-7.9) hr. While sunshine hours had only modest effects in increasing 24-hour average HR in both genders (P = 0.0282 and 0.0248 for males and females) and no significant effect on nighttime HR in either gender, they increased daytime HR (P = 0.0007 and 0.0015) and the day-night HR difference in both genders (P < 0.0001 for both), even after adjusting for the effects of average temperature, atmospheric pressure, and humidity. Our observations support the hypothesis that longer sunshine hours enhance the circadian rhythm of activity.
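
A minimal sketch of the kind of adjusted regression described: daytime HR regressed on sunshine hours while controlling for the weather covariates. The CSV file and every column name are hypothetical stand-ins, not the ALLSTAR data:

```python
# Hedged sketch of the adjusted analysis described above. The file and column
# names are hypothetical placeholders, not the ALLSTAR/weather data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hr_weather.csv")   # hypothetical merged ECG + weather table

model = smf.ols(
    "daytime_hr ~ sunshine_hours + avg_temperature + pressure + humidity",
    data=df,
).fit()
print(model.summary())               # inspect the sunshine_hours coefficient
```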

Keywords: big data, circadian rhythm, heart rate, sunshine

Procedia PDF Downloads 166
23378 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization for assessing the performance of its processes. It guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend of digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the research conducted, no such system has been developed in academia or in industry. In this context, this paper covers a variety of methodologies for performance measurement, overviews the major AI and Big Data applications in the banking sector, and compiles an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, into the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. This classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt, and to help them meet their business objectives by understanding the potential impact of such solutions on the entire organization.

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 199
23377 Hybridized Approach for Distance Estimation Using K-Means Clustering

Authors: Ritu Vashistha, Jitender Kumar

Abstract:

Clustering using the K-means algorithm is a very common way to understand and analyze obtained output data; grouping similar objects is the basis of clustering. There are K objects to be grouped into C clusters, where C is always supposed to be less than K and each cluster has its own centroid; the major problem is how to identify whether a cluster is correct based on the data. Forming the clusters is not a single pass over each tuple, row, record, or entity but an iterative process: every record is checked and its similarity or dissimilarity examined. This iterative process is therefore very lengthy and may fail to give an optimal output, both for the clusters found and for the time taken to find them. To overcome this drawback, we propose a formula to find the clusters at run time, so that the approach can give optimal results. The proposed approach uses the Euclidean distance formula to find the minimum distance between slots, which we technically call clusters, and the same approach is also applied to the Ant Colony Optimization (ACO) algorithm, which results in the production of two- and multi-dimensional matrices.
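
For reference, the baseline the abstract builds on is standard K-means with Euclidean distance. A minimal NumPy sketch on synthetic data, making the distance computation and the iterative reassignment explicit:

```python
# Hedged sketch: plain K-means with Euclidean distance, the baseline the
# abstract builds on. Pure NumPy so each step is visible; data are synthetic.
import numpy as np

def kmeans(X, n_clusters, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Euclidean distance from every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                        else centroids[k] for k in range(n_clusters)])
        if np.allclose(new, centroids):
            break                     # converged: assignments stable
        centroids = new
    return labels, centroids

X = np.random.default_rng(1).normal(size=(300, 2))
labels, centroids = kmeans(X, n_clusters=3)
print(centroids)
```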

Keywords: ant colony optimization, data clustering, centroids, data mining, k-means

Procedia PDF Downloads 129
23376 Digital Twin for University Campus: Workflow, Applications and Benefits

Authors: Frederico Fialho Teixeira, Islam Mashaly, Maryam Shafiei, Jurij Karlovsek

Abstract:

The ubiquity of data gathering and smart technologies, advancements in virtual technologies, and the development of the Internet of Things (IoT) have created urgent demands for frameworks and efficient workflows for data collection, visualisation, and analysis. A digital twin, at scales ranging from the city down to the building, allows data from different sources to be brought together to generate fundamental and illuminating insights for the management of current facilities and the lifecycle of amenities, as well as for improving the performance of current and future designs. Over the past two decades, there has been growing interest in digital twins and their applications at city and building scales. Most such studies look at the urban environment through a homogeneous or generalist lens and lack specificity in the particular characteristics and identities that define an urban university campus. Bridging this knowledge gap, this paper offers a framework for developing a digital twin for a university campus that, with some modifications, could provide insights for any large-scale digital twin setting, such as towns and cities. It showcases how currently unused data could be purposefully combined, interpolated, and visualised to produce analysis-ready data (such as flood or energy simulations or functional and occupancy maps), highlighting the potential applications of such a framework for campus planning and policymaking. The research integrates campus-level data layers into one spatial information repository and casts light on the critical data clusters for a digital twin at the campus level. The paper also raises insightful and directive questions on how a campus digital twin can be extrapolated to a city-scale digital twin. The outcomes of the paper thus inform future projects for the development of large-scale digital twins, as well as urban and architectural researchers, on potential applications of digital twins in future design, management, and sustainable planning to predict problems, calculate risks, decrease management costs, and improve performance.

Keywords: digital twin, smart campus, framework, data collection, point cloud

Procedia PDF Downloads 71
23375 Impact of Job Burnout on Job Satisfaction and Job Performance of Front Line Employees in Bank: Moderating Role of Hope and Self-Efficacy

Authors: Huma Khan, Faiza Akhtar

Abstract:

The present study investigates the effects of burnout on job performance and job satisfaction, with hope and self-efficacy as moderators. Findings from 310 frontline employees of Pakistani commercial banks (Lahore, Karachi, and Islamabad) disclosed that burnout has significant negative effects on job performance and job satisfaction. A simple random sampling technique was used to collect the data, and inferential statistics were applied to analyze them. However, the results disclosed no moderating effect of hope on the relationships between burnout and job performance or job satisfaction. Moreover, the data significantly supported the moderating effect of self-efficacy. The study further sheds light on the development of psychological capital. The importance and implications of the current findings are discussed.

Keywords: burnout, hope, job performance, job satisfaction, psychological capital, self-efficacy

Procedia PDF Downloads 142
23374 Obstacle Classification Method Based on 2D LIDAR Database

Authors: Moohyun Lee, Soojung Hur, Yongwan Park

Abstract:

This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type, by establishing a database for classifying obstacles based on LIDAR. In recognizing obstructions for an autonomous vehicle, the existing LIDAR system has advantages in terms of accuracy and short recognition time. However, it was difficult to determine the type of obstacle, so accurate path planning based on obstacle type was not possible. To overcome this problem, a method of classifying obstacle type using existing LIDAR and the width of obstacle materials was previously proposed. However, width measurement alone was not sufficient to improve accuracy. In this research, the width data are used for a first classification; a database of LIDAR intensity data for the four major obstacle materials found on the road is created; the LIDAR intensity data of actual obstacle materials are compared against this database; and the obstacle type is determined by finding the entry with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data decline in quality compared to 3D LIDAR, it is possible to classify obstacle materials using 2D LIDAR.
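
One plausible reading of the second stage in code: match a measured intensity profile against stored material profiles and pick the most similar. The four materials, profile lengths, and all numbers below are illustrative assumptions, not the paper's database:

```python
# Hedged sketch of the similarity-matching stage: compare a measured 2D-LIDAR
# intensity profile against a database of material profiles. All values are
# illustrative placeholders.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical database: mean intensity profiles per obstacle material,
# resampled to a fixed number of beams.
database = {
    "concrete": np.array([0.82, 0.80, 0.78, 0.79, 0.81]),
    "metal":    np.array([0.95, 0.97, 0.96, 0.94, 0.95]),
    "wood":     np.array([0.55, 0.57, 0.54, 0.56, 0.55]),
    "plastic":  np.array([0.40, 0.42, 0.41, 0.39, 0.40]),
}

measured = np.array([0.93, 0.96, 0.95, 0.93, 0.94])   # profile of an obstacle

scores = {m: cosine_similarity(measured, p) for m, p in database.items()}
best = max(scores, key=scores.get)
print(best, scores[best])              # material with the highest similarity
```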

Keywords: obstacle, classification, database, LIDAR, segmentation, intensity

Procedia PDF Downloads 355
23373 Body Farming in India and Asia

Authors: Yogesh Kumar, Adarsh Kumar

Abstract:

A body farm is a research facility where research is done on forensic investigation and medico-legal disciplines such as forensic entomology, forensic pathology, forensic anthropology, forensic archaeology, and related areas of forensic veterinary science. The research collects data on the rate of decomposition (animal and human) and on forensically important insects to assist in crime detection. The data collected are used by forensic pathologists, forensic experts, and other specialists for the investigation of crime cases and for further research. The research work covers the different conditions of a dead body, such as fresh, bloating, decay, dry, and skeletonized, and gathers data on local insects, which depend on the climatic conditions of the local areas of each country. It is therefore timely to collect appropriate data under managed conditions with a proper set-up in every country, and it is the duty of the scientific community of every country to establish or propose such facilities for justice and social management. Body farms are also used for the training of police, military, investigative dogs, and other agencies. At present, only four countries, viz. the U.S., Australia, Canada, and the Netherlands, have body farms and related facilities organised in a systematic manner, and there is no body farm in Asia. In India, we have been trying to establish a body farm in the A&N Islands, which are near Singapore, Malaysia, and some other Asian countries. In view of the above, it becomes imperative to discuss the matter with Asian countries so as to collect decomposition data properly by establishing a body farm. We can also share data, knowledge, and expertise, collaborate with one another to make such facilities better, and maintain good scientific relations to promote science and explore ways of investigation at the world level.

Keywords: body farm, rate of decomposition, forensically important flies, time since death

Procedia PDF Downloads 88
23372 The Impact of Inflation Rate and Interest Rate on Islamic and Conventional Banking in Afghanistan

Authors: Tareq Nikzad

Abstract:

Since the first bank was established in 1933, Afghanistan's banking sector has seen a number of changes but has not been able to grow to its full potential because of the civil war. This study investigates the operation of dual banking in Afghanistan in relation to the effects of inflation and interest rates. The research draws on World Bank Data (WBD) covering a period of nineteen years. Inflation, the general rise in the prices of goods and services over time, presents considerable difficulties for the banking sector. The objectives of this research are to analyze the effects of inflation and interest rates on conventional and Islamic banks in Afghanistan, to identify potential differences between these two banking models, and to provide insights for policymakers and practitioners. A mixed-methods approach is used to analyze quantitative data and to qualitatively examine the unique difficulties that banks encounter in Afghanistan's economic environment. The findings contribute to the understanding of the relationship between the interest rate, the inflation rate, and the performance of both banking systems in Afghanistan. The paper concludes with recommendations for policymakers and banking institutions to enhance the stability and growth of the banking sector in Afghanistan. From an Islamic perspective, interest is described as "a prefixed rate for the use or borrowing of money"; this "prefixed rate," known in Islamic economics as "riba," is considered undesirable. Applying a time series regression technique to annual data from 2003 to 2021, this research examines the effect of the CPI inflation rate and the interest rate on banking in Afghanistan.

Keywords: inflation, Islamic banking, conventional banking, interest, Afghanistan, impact

Procedia PDF Downloads 73
23371 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The cultural park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetland areas that must be preserved and managed in both time and space. Managing such a large area is complex and requires large amounts of data, most of which are spatially localized (DEMs, satellite images, socio-economic information, etc.), making the use of conventional and traditional methods quite difficult. Remote sensing, given its efficiency in environmental applications, has become an indispensable solution for this kind of study, and remotely sensed imaging data have proved very useful over the last decade: they can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we attempt to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites, both of which carry multispectral imagers with a 30 m resolution. We used images acquired over several areas of interest in the National Park of Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The obtained results show that wetland areas can be accurately distinguished from other land use themes through careful exploitation of the spectral indices.
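
The abstract does not name its spectral indices; as one plausible example of a band-combination index for water and wetland mapping, a minimal NDWI sketch (McFeeters) with synthetic arrays standing in for the EO-1/Landsat bands:

```python
# Hedged sketch: NDWI as an example spectral index for wetland extraction.
# The green/NIR arrays are synthetic stand-ins for real band reflectances.
import numpy as np

green = np.random.default_rng(0).uniform(0.02, 0.3, (100, 100))
nir = np.random.default_rng(1).uniform(0.02, 0.5, (100, 100))

ndwi = (green - nir) / (green + nir + 1e-9)   # NDWI = (G - NIR) / (G + NIR)
wetland_mask = ndwi > 0.0                     # threshold is scene-dependent
print("wetland fraction:", wetland_mask.mean())
```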

Keywords: multispectral data, EO1, landsat, wetlands, Ahaggar, Algeria

Procedia PDF Downloads 378
23370 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics

Authors: Janne Engblom, Elias Oikarinen

Abstract:

A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models. A special class of panel data models is dynamic in nature. A complication of a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates; several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond Generalized Method of Moments (GMM) estimator, an extension of the Arellano-Bond model in which past values, and different transformations of past values, of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano-Bover/Blundell-Bond estimator augments Arellano-Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations, the original equation and the transformed one, and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano-Bover/Blundell-Bond estimation technique together with ordinary OLS. The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. The Arellano-Bover/Blundell-Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, i.e., correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data for 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics differed considerably when different models and instruments were used. In particular, the use of different instrumental variables caused the model estimates and their statistical significance to vary, which was especially clear when comparing OLS estimates with those of different dynamic panel data models. Estimates provided by the dynamic panel data models were more in line with the theory of housing price dynamics.
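
In standard notation, the dynamic model and the two sets of moment conditions that system GMM stacks can be sketched as follows (schematic, textbook form rather than the paper's exact specification):

```latex
% Dynamic panel model:
y_{it} = \alpha\, y_{i,t-1} + x_{it}'\beta + \eta_i + \varepsilon_{it}

% Arellano-Bond: lagged levels instrument the differenced equation
\mathbb{E}\left[\, y_{i,t-s}\,\Delta\varepsilon_{it} \,\right] = 0, \qquad s \ge 2

% Blundell-Bond addition: lagged differences instrument the level equation,
% under the assumption that first differences are uncorrelated with \eta_i
\mathbb{E}\left[\, \Delta y_{i,t-1}\,(\eta_i + \varepsilon_{it}) \,\right] = 0
```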

Keywords: dynamic model, fixed effects, panel data, price dynamics

Procedia PDF Downloads 1512
23369 Blockchain-Based Assignment Management System

Authors: Amogh Katti, J. Sai Asritha, D. Nivedh, M. Kalyan Srinivas, B. Somnath Chakravarthi

Abstract:

Today's modern education systems use Learning Management System (LMS) portals for scoring and grading student performance and for maintaining student records, and teachers are instructed to accept assignments through online submission of .pdf, .doc, .ppt, etc. files. There is a risk of data tampering in these traditional portals; we apply a blockchain model in place of the traditional model to avoid data tampering and to provide a decentralized mechanism for overall fairness. Blockchain technology is a better and recommended model because of the following features: a consensus mechanism, a decentralized system, cryptographic encryption, smart contracts, and the Ethereum blockchain. The proposed system ensures data integrity and tamper-proof assignment submission and grading, which will be helpful for both students and educators.
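
To see why a hash-linked ledger resists tampering, consider this conceptual Python sketch; it illustrates the principle only and is not the authors' Ethereum implementation (records and hashes are dummies):

```python
# Hedged conceptual sketch: a hash-chained submission log showing why changing
# one assignment record invalidates every later block.
import hashlib
import json

def block_hash(record, prev_hash):
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

chain = []
prev = "0" * 64                                   # genesis hash
for record in [
    {"student": "s1", "file_sha256": "ab12...", "grade": None},
    {"student": "s2", "file_sha256": "cd34...", "grade": None},
]:
    h = block_hash(record, prev)
    chain.append({"record": record, "prev": prev, "hash": h})
    prev = h

def verify(chain):
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

print(verify(chain))                  # True
chain[0]["record"]["grade"] = "A+"    # tamper with a stored grade
print(verify(chain))                  # False: the integrity check fails
```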

Keywords: education technology, learning management system, decentralized applications, blockchain

Procedia PDF Downloads 85
23368 Trend Analysis of Africa’s Entrepreneurial Framework Conditions

Authors: Sheng-Hung Chen, Grace Mmametena Mahlangu, Hui-Cheng Wang

Abstract:

This study aims to explore the trends of the Entrepreneurial Framework Conditions (EFCs) in the five African regions. The Global Entrepreneurship Monitor (GEM) is the primary source of data. The data drawn were organized into a panel (2000-2021) and obtained from the National Expert Survey (NES) databases as harmonized by the GEM. The methodology used is descriptive, relying mainly on charts and tables, in line with the approach used by the GEM. The GEM draws its data from the National Expert Survey (NES), which is administered to experts in each country. The GEM collects entrepreneurship data specific to each country and provides information about entrepreneurial ecosystems and their impact on entrepreneurship. The secondary source is the literature review. This study focuses on the following GEM indicators: Financing for Entrepreneurs, Government Support and Policies, Taxes and Bureaucracy, Government Programs, Basic School Entrepreneurial Education and Training, Post-School Entrepreneurial Education and Training, R&D Transfer, Commercial and Professional Infrastructure, Internal Market Dynamics, Internal Market Openness, Physical and Service Infrastructure, and Cultural and Social Norms, based on the GEM Report 2020/21. The limitation of the study is the lack of updated data from some countries: countries have to fund their own regional studies, and African countries do not participate regularly due to a lack of resources.

Keywords: trend analysis, entrepreneurial framework conditions (EFCs), African region, government programs

Procedia PDF Downloads 74
23367 Access to Apprenticeships and the Impact of Individual and School Level Characteristics

Authors: Marianne Dæhlen

Abstract:

Periods of apprenticeship are characteristic of many vocational education and training (VET) systems. In many countries, becoming a skilled worker implies that the journey starts with an application for an apprenticeship at a company or another relevant training establishment. In Norway, where this study is conducted, VET students start with two years of school-based training before applying for two years of apprenticeship. Previous research has shown that access to apprenticeships differs by family background (socio-economic, immigrant, etc.), gender, school grades, and region. The question we raise in this study is whether the status, reputation, or position of the vocational school also contributes to VET students' access to apprenticeships. Data and methods: Register data containing information about schools' and VET students' characteristics will be analyzed in multilevel regression analyses. At the school level, the data will contain information on school size, the share of immigrant students, the share of male/female students, and grade requirements for admission. At the VET-student level, the register contains information on, e.g., gender, school grades, educational program/trade, and whether an apprenticeship was obtained. The data set comprises about 3,000 students. Results: The register data are expected to be received in November 2024; consequently, no results are available at the time of this call. The planned article is part of a larger research project funded by the Norwegian Research Council and will, according to plan, start in December 2024.
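
A minimal sketch of the kind of multilevel model described, with students nested in schools; the data file and every column name are hypothetical stand-ins for the register variables:

```python
# Hedged sketch of a random-intercept multilevel model: students (level 1)
# nested in schools (level 2). File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vet_register.csv")  # hypothetical student-level extract

# The binary outcome is modelled linearly here for simplicity; a mixed-effects
# logit would be the more natural choice for obtained_apprenticeship.
model = smf.mixedlm(
    "obtained_apprenticeship ~ grades + gender + school_size + immigrant_share",
    data=df,
    groups=df["school_id"],           # school-level random intercept
).fit()
print(model.summary())
```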

Keywords: apprenticeships, VET-students’ characteristics, vocational schools, quantitative methods

Procedia PDF Downloads 14
23366 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline Maria Ribeiro Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and little data, it may become difficult and prone to error when large databases of images must be treated, since the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). We previously developed and proposed a novel strategy, based on segmented images, capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. With these classifiers, after a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually annotated, the system will automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classifier methods in order to achieve different configurations of the knowledge data set.
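
One plausible reading of that loop in code: fit a classifier on the expert-labelled regions and rank the remaining segmented regions as suggestions. The features, labels, and choice of Random Forest below are illustrative assumptions, not the paper's configuration:

```python
# Hedged sketch: train on expert-labelled regions, then rank unlabelled
# regions as candidate suggestions. Features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(60, 8))      # features of expert-labelled regions
y_labeled = rng.integers(0, 3, size=60)   # 0=normal, 1=tension, 2=breakout
X_unlabeled = rng.normal(size=(500, 8))   # regions not yet reviewed

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_labeled, y_labeled)

proba = clf.predict_proba(X_unlabeled)
# Suggest the unlabelled regions most confidently flagged as breakout.
breakout_rank = np.argsort(proba[:, 2])[::-1]
print("top candidate regions:", breakout_rank[:10])
```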

Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computer

Procedia PDF Downloads 304
23365 A Review of Spatial Analysis as a Geographic Information Management Tool

Authors: Chidiebere C. Agoha, Armstong C. Awuzie, Chukwuebuka N. Onwubuariri, Joy O. Njoku

Abstract:

Spatial analysis is a field of study that utilizes geographic or spatial information to understand and analyze patterns, relationships, and trends in data. It is characterized by the use of geographic or spatial information, which allows data to be analyzed in the context of their location and surroundings. It differs from non-spatial, or aspatial, techniques, which do not consider the geographic context and may not provide as complete an understanding of the data. Spatial analysis is applied in a variety of fields, including urban planning, environmental science, the geosciences, epidemiology, and marketing, to gain insights and make decisions about complex spatial problems. This review paper explores definitions of spatial analysis from various sources, including examples of its application and different analysis techniques such as buffer analysis, interpolation, and kernel density analysis (multi-distance spatial cluster analysis). It also contrasts spatial analysis with non-spatial analysis.
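
As a concrete taste of the first technique named above, a minimal buffer-analysis sketch; the clinic/incident framing and all coordinates are illustrative, assumed to be in a projected CRS (metres):

```python
# Hedged sketch: buffer analysis with geopandas/shapely. Synthetic points in a
# projected CRS; the epidemiological framing is just an example.
import geopandas as gpd
from shapely.geometry import Point
from shapely.ops import unary_union

clinics = gpd.GeoSeries([Point(0, 0), Point(1200, 800)])
incidents = gpd.GeoSeries([Point(100, 50), Point(900, 900), Point(5000, 5000)])

buffers = clinics.buffer(500)                        # 500 m buffer polygons
covered = unary_union(list(buffers))                 # merge overlapping buffers
print(sum(covered.contains(p) for p in incidents))   # incidents within 500 m
```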

Keywords: aspatial technique, buffer analysis, epidemiology, interpolation

Procedia PDF Downloads 326
23364 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm

Authors: Thanh Noi Phan, Martin Kappas, Jan Degener

Abstract:

The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications, due to its fine spatial resolution, multispectral coverage, and high temporal resolution. So far, only a handful of studies have used real S2 data for land cover classification, and in northern Vietnam especially, to the best of our knowledge, no studies have used S2 data for land cover mapping. The aim of this study is to provide preliminary results of land cover classification using Sentinel-2 data with a state-of-the-art classifier, Random Forest. A case study with heterogeneous land use/cover in the east of Hanoi Capital, Vietnam, was chosen. All ten spectral bands of 10 and 20 m pixel size were used, with the 10 m bands resampled to 20 m. Among several classification algorithms, the supervised Random Forest (RF) classifier was applied because it has been reported as one of the most accurate methods for satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the classification results, and a very high overall accuracy, above 90%, was achieved.
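
A minimal sketch of the pixel-level setup described: each sample holds the ten band values of a labelled pixel, and a Random Forest is trained and scored. Both the spectra and the class labels below are synthetic placeholders:

```python
# Hedged sketch: Random Forest on pixel spectra, mirroring the study's setup.
# Spectra and labels are synthetic, not Sentinel-2 training data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(5000, 10))      # 10 spectral bands per pixel
y = rng.integers(0, 6, size=5000)           # 6 land cover classes (assumed)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
# Band importances indicate which bands (e.g., red-edge, SWIR) matter most.
print("band importances:", rf.feature_importances_.round(3))
```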

Keywords: classify algorithm, classification, land cover, random forest, sentinel 2, Vietnam

Procedia PDF Downloads 390
23363 Constructing the Density of States from the Parallel Wang Landau Algorithm Overlapping Data

Authors: Arman S. Kussainov, Altynbek K. Beisekov

Abstract:

This work focuses on building an efficient universal procedure to construct a single density of states from the multiple pieces of data provided by the parallel implementation of the Wang-Landau Monte Carlo based algorithm. The Ising and Potts models were used as examples of two-dimensional spin lattices for constructing their densities of states. The sampled energy space was distributed between the individual walkers with certain overlaps; this was done to accommodate the latest development of the algorithm, the density-of-states replica exchange technique. Several factors of immediate importance for the seamless stitching process have been considered. These include, but are not limited to, the speed and universality of the initial parallel algorithm implementation, as well as the data post-processing needed to produce the expected smooth density of states.
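
Since each parallel walker returns ln g(E) only up to an additive constant per energy window, stitching amounts to offsetting each piece by its mean difference from the previous one over the shared energies. A minimal sketch of that post-processing step (illustrative, not the authors' exact procedure):

```python
# Hedged sketch: join overlapping ln g(E) pieces by matching offsets in the
# overlap region. Each piece is defined only up to an additive constant.
import numpy as np

def stitch(segments):
    """segments: list of (E, ln_g) pairs on a common energy-grid spacing,
    ordered by energy window, each overlapping its predecessor."""
    E_out, g_out = list(segments[0][0]), list(segments[0][1])
    for E, ln_g in segments[1:]:
        overlap = np.isin(E, E_out)                  # energies already covered
        idx = [E_out.index(e) for e in E[overlap]]
        prev_vals = np.array([g_out[i] for i in idx])
        ln_g = ln_g + (prev_vals - ln_g[overlap]).mean()   # align the offset
        for e, g in zip(E[~overlap], ln_g[~overlap]):
            E_out.append(e); g_out.append(g)
    return np.array(E_out), np.array(g_out)

E1, E2 = np.arange(0, 11), np.arange(8, 21)
E, ln_g = stitch([(E1, 0.5 * E1), (E2, 0.5 * E2 + 7.0)])  # 7.0: spurious offset
print(np.allclose(ln_g, 0.5 * E))                          # True: offset removed
```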

Keywords: density of states, Monte Carlo, parallel algorithm, Wang Landau algorithm

Procedia PDF Downloads 414
23362 Social Enterprise Concept in Sustaining Agro-Industry Development in Indonesia: Case Study of Yourgood Social Business

Authors: Koko Iwan Agus Kurniawan, Dwi Purnomo, Anas Bunyamin, Arif Rahman Jaya

Abstract:

The Fruters model is a concept of technopreneurship based on empowerment, in which technology research results are designed to create high-value-added products and implemented as a locomotive of collaborative empowerment, so that the impact spreads widely. This model still needs to be inventoried and validated with respect to the variables that influence the business growth process. Model validation, accompanied by mapping, was required so that the model could be applied to Small and Medium Enterprises (SMEs) in agro-industry based on sustainable social business and existing real cases. This research describes the empowerment model of Yourgood, an SME that emphasizes empowering farmers/breeders in rural areas (Cipageran, Cimahi) and housewives in urban areas (Bandung, West Java, Indonesia). The research reviews the literature discussing agro-industrial development associated with empowerment and the social business process, and derives a distinctive business model picture built on a social business platform. The mapped business model yields several advantages, such as technology acquisition, independence, capital generation, good investment growth, strengthened collaboration, and improved social impact, which can be replicated in other businesses. This research used an analytical-descriptive method consisting of qualitative analysis with a design thinking approach and quantitative analysis with the AHP (Analytic Hierarchy Process). Based on the results, the development of the enterprise's process was most strongly affected by supplying farmers, with a score of 0.248 out of 1, the most valuable factor for the existence of the enterprise. It was followed by university (0.178), supplying farmers (0.153), business actors (0.128), government (0.100), distributor (0.092), techno-preneurship laboratory (0.069), banking (0.033), and Non-Governmental Organization (NGO) (0.031).
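
AHP weights of this kind come from the principal eigenvector of a reciprocal pairwise comparison matrix. A toy sketch follows; the 3×3 matrix is illustrative, not the study's expert judgements:

```python
# Hedged sketch of the AHP weighting step: priority weights from the
# principal eigenvector of a reciprocal pairwise comparison matrix.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, eigvals.real.argmax()].real
weights = principal / principal.sum()          # normalise to sum to 1
print(weights.round(3))

# Consistency index CI = (lambda_max - n) / (n - 1); small CI means the
# pairwise judgements are close to consistent.
n = A.shape[0]
print("CI:", (eigvals.real.max() - n) / (n - 1))
```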

Keywords: agro-industry, small medium enterprises, empowerment, design thinking, AHP, business model canvas, social business

Procedia PDF Downloads 171
23361 Thick Data Analytics for Learning Cataract Severity: A Triplet Loss Siamese Neural Network Model

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Diagnosing cataract severity is an important factor in deciding whether to undertake surgery. It is usually done by an ophthalmologist, or by taking a variety of fundus photographs that the ophthalmologist then examines. This paper carries out an investigation using a Siamese neural net that can be trained with a small number of anchor samples to score cataract severity. The model used in this paper is based on a triplet loss function that takes the ophthalmologist's best experience in rating positive and negative anchors on a specific cataract scaling system. This approach, which captures the heuristics of the ophthalmologist, is generally called the thick data approach, a kind of machine learning that learns from a few shots. Clinical relevance: The lens of the eye is mostly made up of water and proteins. A cataract occurs when these proteins in the eye lens start to clump together and block light, causing impaired vision. This research aims to employ thick data machine learning techniques to rate the severity of cataracts using a Siamese neural network.
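
A minimal PyTorch sketch of the training signal described: a shared-weight encoder with torch's built-in triplet margin loss. The tiny network and random tensors are placeholders for the fundus-image encoder and the ophthalmologist-chosen anchor/positive/negative examples:

```python
# Hedged sketch: triplet-loss training of a shared-weight (Siamese) encoder.
# The network and dummy tensors stand in for the fundus-image pipeline.
import torch
import torch.nn as nn

embed = nn.Sequential(             # shared-weight encoder ("Siamese" branch)
    nn.Conv2d(3, 16, 3, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 64),
)
loss_fn = nn.TripletMarginLoss(margin=1.0)
opt = torch.optim.Adam(embed.parameters(), lr=1e-4)

# One training step on a dummy batch of anchor/positive/negative images.
anchor, positive, negative = (torch.randn(8, 3, 128, 128) for _ in range(3))
loss = loss_fn(embed(anchor), embed(positive), embed(negative))
opt.zero_grad(); loss.backward(); opt.step()
print(float(loss))

# At inference, severity can be scored by embedding distance to rated anchors.
```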

Keywords: thick data analytics, siamese neural network, triplet-loss model, few shot learning

Procedia PDF Downloads 114
23360 Case Study Analysis for Driver's Company in the Transport Sector with the Help of Data Mining

Authors: Diana Katherine Gonzalez Galindo, David Rolando Suarez Mora

Abstract:

In this study, we used data mining as a new alternative solution to evaluate customer comments in order to find patterns that help determine the behaviors behind the deactivation of partners of the LEVEL app. In one of the largest businesses created in recent times, partners are being affected by an internal process that compensates the customer for a bad experience, even though the comments could be false accusations against the driver; this is why we conducted an investigation and collected information to restructure this process. Many partners have been disassociated due to this internal process, and many of them dispute the comments given by the customer. The main methodology used in this case study is observation: we collected information in real time, which gave us the opportunity to see the most common issues and to arrive at the most accurate solution. The new process, helped by data mining, can produce a prediction based on customer behaviors and some basic data collected, such as age, gender, and others; this could also help us improve other processes in the future. This investigation gives the partner more opportunities to keep their account active even if the customer writes a complaint through the app. The aim is to avoid a future decline in drivers by offering process improvements while establishing a strategy that benefits both the app's managers and the associated drivers.

Keywords: agent, driver, deactivation, rider

Procedia PDF Downloads 283
23359 Image Compression Using Block Power Method for SVD Decomposition

Authors: El Asnaoui Khalid, Chawki Youness, Aksasse Brahim, Ouanan Mohammed

Abstract:

In recent decades, the rapid growth in the development of, and demand for, multimedia products has contributed to a shortage of network bandwidth and device storage memory. Consequently, the theory of data compression has become more significant for reducing data redundancy in order to save on data transfer and storage. In this context, this paper addresses the problem of lossless and near-lossless compression of images. The proposed method is based on the Block SVD Power Method, which overcomes the disadvantages of Matlab's SVD function. The experimental results show that the proposed algorithm has better compression performance than existing compression algorithms that use Matlab's SVD function. In addition, the proposed approach is simple and can provide different degrees of error resilience, giving, in a short execution time, better image compression.
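
To illustrate the underlying idea (not the paper's Block SVD Power Method itself), a sketch that computes the top-k singular triplets of an image by power iteration with deflation and reconstructs a low-rank approximation:

```python
# Hedged sketch: rank-k image compression via power iteration with deflation.
# This illustrates the SVD-compression idea, not the paper's block algorithm.
import numpy as np

def top_k_svd(A, k, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    A = A.astype(float).copy()
    U, S, Vt = [], [], []
    for _ in range(k):
        v = rng.normal(size=A.shape[1])
        for _ in range(n_iter):                 # power iteration on A^T A
            v = A.T @ (A @ v)
            v /= np.linalg.norm(v)
        sigma = np.linalg.norm(A @ v)
        u = A @ v / sigma
        U.append(u); S.append(sigma); Vt.append(v)
        A -= sigma * np.outer(u, v)             # deflate dominant component
    return np.column_stack(U), np.array(S), np.vstack(Vt)

img = np.random.default_rng(1).uniform(0, 255, (256, 256))  # stand-in image
U, S, Vt = top_k_svd(img, k=20)
approx = U @ np.diag(S) @ Vt                     # rank-20 compressed image
print("relative error:", np.linalg.norm(img - approx) / np.linalg.norm(img))
```

Storing U, S, and Vt for rank k costs k(m + n + 1) numbers instead of mn, which is where the compression comes from.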

Keywords: image compression, SVD, block SVD power method, lossless compression, near lossless

Procedia PDF Downloads 388
23358 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty lies in accurately locating and detecting pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. First, a deep residual network is added to extract features. Then six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid that performs the pedestrian detection task. This method can better characterize pedestrians. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian data set training method is used: pedestrian data are extracted from the VOC data set and trained together with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves 93.57% mAP (mean average precision) at 46.52 f/s (frames per second). In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. The method reduces the missed detection rate and the false detection rate, improves the positioning accuracy, and meets the requirements of real-time detection of pedestrians.

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 143
23357 Investigation of the Jupiter’s Galilean Moons

Authors: Revaz Chigladze

Abstract:

The purpose of the research is to investigate the surfaces of Jupiter's Galilean moons: which moon has the most uniform surface among them, what the differences are between the front (in the direction of motion) and rear sides of each moon's surface, and what temporal variations the moons exhibit. Since 1981, the E. Kharadze National Astrophysical Observatory of Georgia has been conducting polarimetric (P) and photometric (M) observations of Jupiter's Galilean moons with telescopes of different diameters (40 cm and 125 cm), using the polarimeter ASEP-78 in combination with them and, more recently, a latest-generation photometer with a polarimeter and the modern light receiver SBIG. As the analysis of the observed material shows, the parameters P and M depend on α, the phase angle of the moon (satellite); L, the orbital longitude of the moon; λ, the wavelength; and t, the period of observation, i.e., P = P(α, L, λ, t) and, similarly, M = M(α, L, λ, t). Based on the analysis of the observed material, the following were studied for Jupiter's Galilean moons: the dependence of the degree of linear polarization on magnitude and phase angle for different wavelengths; the dependence of the degree of polarization on orbital longitude; the dependence of the degree of polarization on wavelength; the time dependence of the degree of polarization; and the relationship between the photometric and polarimetric characteristics (including establishing correlations). From the analysis of the obtained results we find that the degree of polarization of Jupiter's Galilean moons near opposition differs significantly from zero. Europa appears to have the most uniform surface, and Callisto the least uniform. Time variations are most characteristic of Io, which confirms the presence of volcanic activity on its surface. The observed material shows that the intensity of light reflected from the front hemispheres of the first three moons, Io, Europa, and Ganymede, is less than the intensity of light reflected from their rear hemispheres, while for Callisto it is the opposite. The paper provides a convincing (natural, real) explanation of this fact.

Keywords: Galilean moons, polarization, degree of polarization, photometry, front and rear hemispheres

Procedia PDF Downloads 104
23356 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification

Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos

Abstract:

Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancers in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist's visual assessment. However, different brain tumor types do not always show clear differences in visual appearance, and using conventional MRI alone to provide a definite diagnosis could potentially lead to inaccurate results; histopathological examination of biopsy samples is therefore currently considered the gold standard for obtaining a definite diagnosis. Machine learning is the study of computational algorithms that can use mathematical relationships and patterns, complex or not, drawn from empirical and scientific data to make reliable decisions. Accordingly, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are, or could potentially be, useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR, or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients, by determining the glioma type and grade and differentiating between different brain tissue types. A further purpose is to present an alternative way of achieving quick and accurate diagnosis in order to save time and resources in the daily medical workflow. Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas, and 20 healthy children. The MR sequences used for every patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice, manually traced by two expert radiologists, was used. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed in a number of steps, including noise reduction, bias-field correction, thresholding, co-registration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of candidate features were chosen, including age, tumor shape characteristics, image intensity characteristics, and texture features. After selecting the features that achieve the highest accuracy with the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree, and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA and MATLAB platforms, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in progress.
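
For illustration, a Python analogue of the feature-selection-plus-classifier pipeline described (the study itself uses WEKA and MATLAB): synthetic vectors stand in for the intensity/texture/shape features, and scikit-learn's DecisionTreeClassifier stands in for C4.5.

```python
# Hedged sketch paralleling the described pipeline: feature selection followed
# by three of the study's classifiers under cross-validation. All data are
# synthetic placeholders; DecisionTreeClassifier is a stand-in for C4.5.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 40))   # e.g., intensity/texture/shape features + age
y = rng.integers(0, 4, size=80) # ependymoma/astrocytoma/medulloblastoma/healthy

for clf in (KNeighborsClassifier(5), SVC(kernel="rbf"),
            DecisionTreeClassifier(random_state=0)):
    pipe = make_pipeline(SelectKBest(f_classif, k=10), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(type(clf).__name__, scores.mean().round(3))
```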

Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology

Procedia PDF Downloads 150
23355 Multichannel Analysis of the Surface Waves of Earth Materials in Some Parts of Lagos State, Nigeria

Authors: R. B. Adegbola, K. F. Oyedele, L. Adeoti

Abstract:

We present a method utilizing Multichannel Analysis of Surface Waves (MASW), which was used to measure shear wave velocities with a view to establishing the probable causes of road failure, subsidence, and weakening of structures in some Local Government Areas of Lagos, Nigeria. The MASW data were acquired using a 24-channel seismograph. The acquired data were processed and transformed into a two-dimensional (2-D) structure reflecting the depth and surface wave velocity distribution within 0-15 m beneath the surface using SURFSEIS software. The shear wave velocity data were compared with other geophysical/borehole data acquired along the same profile. The comparison and correlation illustrate the accuracy and consistency of the MASW-derived shear wave velocity profiles. The rigidity modulus and N-values were also generated. The study showed that the low-velocity/very-low-velocity zones reflect organic clay/peat materials and are thus likely responsible for the failure, subsidence, and weakening of structures within the study areas.

Keywords: seismograph, road failure, rigidity modulus, N-value, subsidence

Procedia PDF Downloads 366