Search results for: data combining
23646 Exploiting Kinetic and Kinematic Data to Plot Cyclograms for Managing the Rehabilitation Process of BKAs by Applying Neural Networks
Authors: L. Parisi
Abstract:
Kinematic data correlate vector quantities in space with scalar parameters in time to assess the degree of symmetry between the intact limb and the amputated limb with respect to a normal model derived from the gait of control-group participants. Furthermore, these data allow a doctor to make a preliminary evaluation of the usefulness of a given rehabilitation therapy. Kinetic curves allow the analysis of ground reaction forces (GRFs) to assess the appropriateness of human motion. Electromyography (EMG) allows the analysis of the fundamental lower-limb force contributions to quantify the level of gait asymmetry. However, this technological tool is expensive to use and requires the patient's hospitalization. This research work suggests overcoming the above limitations by applying artificial neural networks.
Keywords: kinetics, kinematics, cyclograms, neural networks, transtibial amputation
Procedia PDF Downloads 443
23645 Urbanization and Built Environment: Impacts of Squatter Slums on Degeneration of Urban Built Environment, a Case Study of Karachi
Authors: Mansoor Imam, Amber Afshan, Sumbul Mujeeb, Kamran Gill
Abstract:
An investigative approach has been taken to study the quality of living prevailing in the squatter slums of Karachi, which is influencing urbanization trends and the environmental degeneration of the built environment. The paper identifies the issues and aspects that have directly and indirectly driven this degeneration, owing to inadequate basic infrastructural amenities, substandard housing, overcrowding, poor ventilation in homes and workplaces, and non-compliance with building bye-laws and regulations. Primarily, secondary data were critically examined and analyzed, including but not limited to census data, demographic and socioeconomic data, official documents, and other relevant material obtained from the existing literature and GIS. It is observed that poor, sub-standard housing and living quality have serious adverse impacts on the environment and the health of city residents. Hence, strategies for improving the quality of the built environment for sustainable living are mandated. It is, therefore, imperative to check and prevent further degradation and to promote harmonious living and sustainable urbanization.
Keywords: squatter slums, urbanization, degenerations, living quality, built environment
Procedia PDF Downloads 390
23644 Assessment of the Contribution of Geographic Information System Technology in Non Revenue Water: Case Study Dar Es Salaam Water and Sewerage Authority Kawe - Mzimuni Street
Authors: Victor Pesco Kassa
Abstract:
This research assesses the contribution of GIS technology to NRW (non-revenue water) management. It was conducted in Dar es Salaam, at Kawe Mzimuni Street. Data were collected from an existing source, DAWASA headquarters, and interpreted using ArcGIS software. The data reveal good coverage of DAWASA's water network at Mzimuni Street, where most residents are connected to DAWASA's customer service. They also reveal that using GIS has improved DAWASA's customer geodatabase: through GIS, a customer location map can be prepared for site surveying, and this map can show the different types of customer connected to DAWASA's water service. This is a clear contribution of GIS technology to addressing and managing the problem of NRW in DAWASA. Finally, the study recommends that the same study be conducted in other DAWASA zones, such as Temeke, Boko, and Bagamoyo, not only at Kawe Mzimuni Street. Through this study, it is observed that ArcGIS software offers powerful tools for managing and processing information geographically in water and sanitation authorities such as DAWASA.
Keywords: DAWASA, NRW, Esri, EURA, ArcGIS
Procedia PDF Downloads 81
23643 Robustified Asymmetric Logistic Regression Model for Global Fish Stock Assessment
Authors: Osamu Komori, Shinto Eguchi, Hiroshi Okamura, Momoko Ichinokawa
Abstract:
Long time-series data on population assessments are essential for global ecosystem assessment because the temporal change of biomass in such a database properly reflects the status of the global ecosystem. However, the available assessment data usually have limited sample sizes, and the ratio of populations with low abundance of biomass (collapsed) to those with high abundance (non-collapsed) is highly imbalanced. To allow for the imbalance and uncertainty involved in the ecological data, we propose a binary regression model with mixed effects for inferring ecosystem status through an asymmetric logistic model. In the estimating equation, the weights for the non-collapsed populations are relatively reduced, which in turn puts more importance on the small number of observations of collapsed populations. Moreover, we extend the asymmetric logistic regression model using propensity scores to allow for the sample biases observed in the labeled and unlabeled datasets. This robustified the estimation procedure and improved the model fit.
Keywords: double robust estimation, ecological binary data, mixed effect logistic regression model, propensity score
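The down-weighting idea can be illustrated with a minimal sketch. This is hypothetical toy data and a plain class-weighted logistic regression fitted by gradient ascent, not the authors' exact asymmetric model with mixed effects or propensity scores; it only shows how shrinking the weight of the abundant (non-collapsed) class makes the few collapsed observations dominate the estimating equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced ecological data: 1 = collapsed (rare), 0 = non-collapsed.
n0, n1 = 900, 100
X = np.vstack([rng.normal(0.0, 1.0, (n0, 2)), rng.normal(1.5, 1.0, (n1, 2))])
y = np.concatenate([np.zeros(n0), np.ones(n1)])

# Asymmetric weighting: shrink the weight of the abundant class so the few
# collapsed populations dominate the weighted score equation.
w = np.where(y == 1, 1.0, n1 / n0)

def fit_weighted_logit(X, y, w, lr=0.1, iters=2000):
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend an intercept column
    beta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))
        beta += lr * (Xb.T @ (w * (y - p))) / len(y)  # weighted score step
    return beta

beta = fit_weighted_logit(X, y, w)
p = 1.0 / (1.0 + np.exp(-(np.hstack([np.ones((len(X), 1)), X]) @ beta)))
print("coefficients:", beta.round(2))
```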
Procedia PDF Downloads 264
23642 Research on Hangzhou Commercial Center System Based on Point of Interest Data
Authors: Chen Wang, Qiuxiao Chen
Abstract:
With the advent of the information age and the era of big data, urban planning research is no longer satisfied with the analysis and application of traditional data. Given the limitations of traditional research on urban commercial center systems, big data provides new opportunities for urban research. Therefore, based on a quantitative big-data evaluation method, the commercial center system of the main city of Hangzhou is analyzed and evaluated, and the scale and hierarchical structure of the urban commercial center system are studied. To make up for the shortcomings of existing POI extraction methods, the study proposes a POI extraction method based on adaptive adjustment of the search window, which can accurately and efficiently extract the commercial-business POI data of the main city of Hangzhou. Through visualization and kernel density analysis of the extracted Point of Interest (POI) data, the current state of the commercial center system in the main city of Hangzhou is evaluated. The study then compares it with the commercial center system structure of the 'Hangzhou City Master Plan (2001-2020)', analyzes the problems in the planned urban commercial center system, and provides corresponding suggestions and optimization strategies for the planning of the Hangzhou commercial center system. It reaches the following conclusions: the commercial center system in the main city of Hangzhou currently presents one first-level main center, second-level main centers, three third-level sub-centers, and multiple community-level business centers. Generally speaking, the construction of the main center in the commercial center system is basically up to standard, while there is still a big gap in the construction of the sub-centers and the regional-level commercial centers, where further construction is needed.
The study therefore proposes an optimized hierarchical functional system that organizes commercial centers in an orderly manner, strengthens central radiation to drive surrounding areas, and implements construction guidance for the centers, effectively promoting group formation and further improving the commercial center system structure of the main city of Hangzhou.
Keywords: business center system, business format, main city of Hangzhou, POI extraction method
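The kernel density step can be sketched as follows. The POI coordinates below are hypothetical stand-ins for the extracted Hangzhou data (two synthetic clusters playing the role of a main center and a sub-center); the real analysis would run on the adaptively extracted POIs in a GIS environment.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Hypothetical commercial POI coordinates (projected x/y, e.g., in km):
# a dense cluster standing in for a main center, a looser one for a sub-center.
center_a = rng.normal([0, 0], 0.5, (300, 2))
center_b = rng.normal([5, 5], 1.0, (150, 2))
pois = np.vstack([center_a, center_b])

# Kernel density estimate over the POI locations.
kde = gaussian_kde(pois.T)

# Evaluate the density on a grid; peaks indicate commercial-center candidates.
xs, ys = np.meshgrid(np.linspace(-3, 8, 60), np.linspace(-3, 8, 60))
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)

peak = np.unravel_index(density.argmax(), density.shape)
print("densest grid cell:", xs[peak], ys[peak])
```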
Procedia PDF Downloads 139
23641 Stakeholder Analysis of Agricultural Drone Policy: A Case Study of the Agricultural Drone Ecosystem of Thailand
Authors: Thanomsin Chakreeves, Atichat Preittigun, Ajchara Phu-ang
Abstract:
This paper presents a stakeholder analysis of agricultural drone policies that meet the government's goal of building an agricultural drone ecosystem in Thailand. Firstly, case studies from other countries are reviewed. The stakeholder analysis method and qualitative data from the interviews are then presented, including data from the Institute of Innovation and Management, the Office of National Higher Education Science Research and Innovation Policy Council, agricultural entrepreneurs, and farmers. Study and interview data are then employed to describe the current ecosystem and to guide the implementation of agricultural drone policies that are suitable for the ecosystem of Thailand. Finally, policy recommendations are made that the Thai government should adopt in the future.
Keywords: drone public policy, drone ecosystem, policy development, agricultural drone
Procedia PDF Downloads 145
23640 Study and Analysis of Optical Intersatellite Links
Authors: Boudene Maamar, Xu Mai
Abstract:
Optical Intersatellite Links (OISLs) are wireless communications that use optical signals to interconnect satellites. OISLs are expected to be the next-generation wireless communication technology owing to inherent characteristics such as increased bandwidth, high data rates, secure data transmission, immunity to interference, and an unregulated spectrum. Optical space links are the best choice among the classical communication schemes due to their distinctive properties: high frequency, small antenna diameter, and low transmitted power, which are critical factors in defining a space communication system. This paper discusses the development of free-space technology and analyses the parameters and factors needed to establish a reliable intersatellite link using an optical signal to exchange data between satellites.
Keywords: optical intersatellite links, optical wireless communications, free space optical communications, next generation wireless communication
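A rough sense of why small apertures and low transmitted power suffice at optical frequencies comes from the standard Friis transmission equation with diffraction-limited telescope gain. All parameter values below are illustrative assumptions, not figures from the paper:

```python
import math

# Illustrative free-space optical link budget (Friis transmission equation).
# Every number here is a hypothetical, round-figure assumption.
wavelength = 1550e-9   # m, a common telecom laser wavelength
distance = 45_000e3    # m, a long intersatellite range
p_tx_w = 1.0           # transmitted optical power, W
aperture_d = 0.25      # telescope aperture diameter, m (same at tx and rx)

# Diffraction-limited antenna (telescope) gain: G = (pi * D / lambda)^2.
# At optical wavelengths even a 25 cm aperture yields enormous gain.
gain = (math.pi * aperture_d / wavelength) ** 2

# Free-space path loss factor: (lambda / (4 * pi * d))^2
path = (wavelength / (4 * math.pi * distance)) ** 2

p_rx_w = p_tx_w * gain * gain * path
p_rx_dbm = 10 * math.log10(p_rx_w * 1e3)
print(f"received power ~ {p_rx_dbm:.1f} dBm")
```

Despite roughly 170 dB of free-space loss over this distance, the two telescope gains recover a detectable received power, which is the core argument for small-aperture, low-power optical links.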
Procedia PDF Downloads 445
23639 Sunshine Hour as a Factor to Maintain the Circadian Rhythm of Heart Rate: Analysis of Ambulatory ECG and Weather Big Data
Authors: Emi Yuda, Yutaka Yoshida, Junichiro Hayano
Abstract:
A distinct circadian rhythm of activity, i.e., high activity during the day and deep rest at night, is a typical feature of a healthy lifestyle. Exposure to skylight is thought to be an important factor in increasing arousal level and maintaining a normal circadian rhythm. To examine whether sunshine hours influence the day-night contrast of activity, we analyzed the relationship between 24-hour heart rate (HR) and the weather data of the recording day. We analyzed data from 36,500 males and 49,854 females in the Allostatic State Mapping by Ambulatory ECG Repository (ALLSTAR) database in Japan. Median (IQR) sunshine duration was 5.3 (2.8-7.9) hr. While sunshine hours had only modest effects in increasing 24-hour average HR in both genders (P = 0.0282 and 0.0248 for males and females) and no significant effect on nighttime HR in either gender, they increased daytime HR (P = 0.0007 and 0.0015) and the day-night HF difference in both genders (P < 0.0001 for both), even after adjusting for the effects of average temperature, atmospheric pressure, and humidity. Our observations support the hypothesis that longer sunshine hours enhance the circadian rhythm of activity.
Keywords: big data, circadian rhythm, heart rate, sunshine
Procedia PDF Downloads 161
23638 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector
Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau
Abstract:
A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend toward the digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the research conducted, no such system has been developed in either academia or industry. In this context, this paper covers a variety of performance measurement methodologies, overviews the major AI and Big Data applications in the banking sector, and compiles an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. This classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt, and to help them meet their business objectives through an understanding of the potential impact of such solutions on the entire organization.
Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement
Procedia PDF Downloads 197
23637 Promoting Students' Worldview Through Integrative Education in the Process of Teaching Biology in Grades 11 and 12 of High School
Authors: Saule Shazhanbayeva, Denise van der Merwe
Abstract:
Study hypothesis: the Biology teachers of Nazarbayev Intellectual School of Kyzylorda can use STEM-integrated learning to improve students' problem-solving ability and their sense of responsibility as global citizens. The significance of this study is to indicate how the use of STEM integrative learning during Biology lessons could contribute to forming globally minded students who are responsible community members. For the purposes of this study, worldview is defined as a view broader than the country of Kazakhstan, allowing students to see the significance of their scientific contributions to the world as global citizens. The context of worldview is specific because most students have never traveled outside of their city or region within Kazakhstan. To broaden student understanding, it is imperative that students be exposed to different worldviews and contrasting ideas within the educational setting of Biology, the science used for this research. This exposure promotes students' understanding of their significance as global citizens, alongside the obligations that rest on scientifically minded global citizens. Integrative learning combines Biological Science with Technology and Engineering, in the form of problem-solving, and with Mathematics, to develop improved problem-solving skills in the students of Nazarbayev Intellectual School (NIS) of Kyzylorda. The school's vision is to allow students to realise their role as global citizens and become responsible community members. STEM allows integration by combining four subject skills to solve topical problems designed by educators. The methods used are based on qualitative analysis of students' performance during a problem-solution scenario and on interviews with Biology teachers to ascertain their understanding of STEM implementation and their willingness to integrate it into current lessons.
The research indicated that NIS is ready for a shift into STEM lessons to promote globally responsible students. The only additional need is proper training for teachers in STEM integrative lesson methods.
Keywords: global citizen, STEM, Biology, high-school
Procedia PDF Downloads 70
23636 Hybridized Approach for Distance Estimation Using K-Means Clustering
Authors: Ritu Vashistha, Jitender Kumar
Abstract:
Clustering with the K-means algorithm is a very common way to understand and analyze obtained output data; grouping similar objects together is the basis of clustering. A set of objects is partitioned into a number of clusters, where the number of clusters is always supposed to be smaller than the number of objects and each cluster has its own centroid; the major problem is how to identify whether a cluster is correct based on the data. Forming the clusters is not a one-pass task over every tuple, row, record, or entity: it is done by an iterative process in which each record, tuple, and entity is checked and examined for similarity and dissimilarity. This iterative process is very lengthy and often unable to give an optimal output for the clusters within an acceptable running time. To overcome this drawback, we propose a formula to find the clusters at run time, so that the approach can give optimal results. The proposed approach uses the Euclidean distance formula together with a heuristic to find the minimum distance between slots, which we technically call clusters, and the same approach has also been applied to the Ant Colony Optimization (ACO) algorithm, resulting in the production of two- and multi-dimensional matrices.
Keywords: ant colony optimization, data clustering, centroids, data mining, k-means
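The baseline iterative procedure described above can be sketched as standard K-means with Euclidean distances on toy data; the paper's run-time formula and its ACO hybrid are not reproduced here.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]  # random initial centroids
    for _ in range(iters):
        # Euclidean distance from every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)  # assign each point to its nearest centroid
        # Recompute each centroid as the mean of its assigned points.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):  # converged: assignments are stable
            break
        centroids = new
    return labels, centroids

# Two well-separated toy blobs.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids.round(2))
```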
Procedia PDF Downloads 127
23635 Digital Twin for University Campus: Workflow, Applications and Benefits
Authors: Frederico Fialho Teixeira, Islam Mashaly, Maryam Shafiei, Jurij Karlovsek
Abstract:
The ubiquity of data gathering and smart technologies, advancements in virtual technologies, and the development of the Internet of Things (IoT) have created urgent demand for frameworks and efficient workflows for data collection, visualisation, and analysis. A digital twin, at scales ranging from the city down to the building, allows data from different sources to be brought together to generate fundamental and illuminating insights for the management of current facilities and the lifecycle of amenities, as well as for improving the performance of current and future designs. Over the past two decades, there has been growing interest in digital twins and their applications at city and building scales. Most such studies look at the urban environment through a homogeneous or generalist lens and lack specificity in the particular characteristics or identities that define an urban university campus. Bridging this knowledge gap, this paper offers a framework for developing a digital twin for a university campus that, with some modifications, could provide insights for any large-scale digital twin setting, such as towns and cities. It showcases how currently unused data could be purposefully combined, interpolated, and visualised to produce analysis-ready data (such as flood or energy simulations or functional and occupancy maps), highlighting the potential applications of such a framework for campus planning and policymaking. The research integrates campus-level data layers into one spatial information repository and casts light on critical data clusters for the digital twin at the campus level. The paper also raises insightful and directive questions on how a campus digital twin can be extrapolated to a city-scale digital twin.
The outcomes of the paper thus inform future projects for the development of large-scale digital twins, as well as urban and architectural researchers, on potential applications of digital twins in future design, management, and sustainable planning: to predict problems, calculate risks, decrease management costs, and improve performance.
Keywords: digital twin, smart campus, framework, data collection, point cloud
Procedia PDF Downloads 66
23634 Impact of Job Burnout on Job Satisfaction and Job Performance of Front Line Employees in Bank: Moderating Role of Hope and Self-Efficacy
Authors: Huma Khan, Faiza Akhtar
Abstract:
The present study investigates the effects of burnout on job performance and job satisfaction, with hope and self-efficacy as moderators. Findings from 310 frontline employees of Pakistani commercial banks (Lahore, Karachi, and Islamabad) disclosed that burnout has significant negative effects on job performance and job satisfaction. A simple random sampling technique was used to collect the data, and inferential statistics were applied to analyze them. However, the results disclosed no moderating effect of hope on the relationship of burnout with job performance or job satisfaction, whereas the data significantly supported the moderating effect of self-efficacy. The study further sheds light on the development of psychological capital, and the importance and implications of the current findings are discussed.
Keywords: burnout, hope, job performance, job satisfaction, psychological capital, self-efficacy
Procedia PDF Downloads 139
23633 A Gold-Based Nanoformulation for Delivery of the CRISPR/Cas9 Ribonucleoprotein for Genome Editing
Authors: Soultana Konstantinidou, Tiziana Schmidt, Elena Landi, Alessandro De Carli, Giovanni Maltinti, Darius Witt, Alicja Dziadosz, Agnieszka Lindstaedt, Michele Lai, Mauro Pistello, Valentina Cappello, Luciana Dente, Chiara Gabellini, Piotr Barski, Vittoria Raffa
Abstract:
CRISPR/Cas9 technology has gained the interest of researchers in the field of biotechnology for genome editing. Since its discovery as a microbial adaptive immune defense, this system has been widely adopted and is acknowledged for having a variety of applications. However, critical barriers related to safety and delivery persist. Here, we propose a new concept of genome engineering based on a nano-formulation of Cas9. The Cas9 enzyme was conjugated to a gold nanoparticle (AuNP-Cas9). The AuNP-Cas9 maintained its cleavage efficiency in vitro, to the same extent as the ribonucleoprotein containing the non-conjugated Cas9 enzyme, and showed high gene editing efficiency in vivo in zebrafish embryos. Since CRISPR/Cas9 technology is extensively used in cancer research, melanoma was selected as a validation target, and cell studies were performed in A375 human melanoma cells. The particles per se had no impact on cell metabolism or proliferation. Intriguingly, the AuNP-Cas9 internalized spontaneously in cells and localized as single particles in the cytoplasm and organelles; more importantly, the AuNP-Cas9 showed a high nuclear localization signal. The AuNP-Cas9, overcoming the delivery difficulties of Cas9, could be used in cellular biology and localization studies. Taking advantage of the plasmonic properties of gold nanoparticles, this technology could potentially be a bio-tool for combining gene editing and photothermal therapy in cancer cells. Further work will focus on the intracellular interactions of the nano-formulation and the characterization of its optical properties.
Keywords: CRISPR/Cas9, gene editing, gold nanoparticles, nanotechnology
Procedia PDF Downloads 98
23632 Obstacle Classification Method Based on 2D LIDAR Database
Authors: Moohyun Lee, Soojung Hur, Yongwan Park
Abstract:
This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type, by establishing a database for classifying obstacles based on LIDAR. The existing LIDAR system, in recognizing obstructions for an autonomous vehicle, has advantages in terms of accuracy and short recognition time. However, it has been difficult to determine the type of obstacle, so accurate path planning based on obstacle type was not possible. To overcome this problem, a method of classifying obstacle type based on existing LIDAR, using the width of the obstacle material, was previously proposed; however, width measurement was not sufficient to improve accuracy. In this research, the width data are used for a first classification; a database of LIDAR intensity data for the four major obstacle materials found on the road is created; the LIDAR intensity data of actual obstacle materials are compared against it; and the obstacle type is determined by finding the entry with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data decline in quality in comparison to 3D LIDAR, it is possible to classify obstacle materials using 2D LIDAR.
Keywords: obstacle, classification, database, LIDAR, segmentation, intensity
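The similarity-matching step can be sketched as follows. The material "signatures" below are made-up placeholders (e.g., normalized intensity histograms) standing in for the paper's measured 2D-LIDAR intensity database, and cosine similarity is one plausible choice of similarity value:

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two intensity signatures.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical intensity signatures for four common road-obstacle materials;
# real values would come from the LIDAR intensity database the paper builds.
database = {
    "metal":    np.array([0.05, 0.10, 0.25, 0.60]),
    "plastic":  np.array([0.20, 0.40, 0.30, 0.10]),
    "wood":     np.array([0.30, 0.35, 0.25, 0.10]),
    "concrete": np.array([0.50, 0.30, 0.15, 0.05]),
}

def classify(measurement):
    # Pick the material whose stored signature is most similar to the scan.
    scores = {m: cosine(measurement, sig) for m, sig in database.items()}
    return max(scores, key=scores.get), scores

scan = np.array([0.06, 0.12, 0.22, 0.58])  # noisy measurement of a metal pole
material, scores = classify(scan)
print(material, round(scores[material], 3))
```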
Procedia PDF Downloads 347
23631 Body Farming in India and Asia
Authors: Yogesh Kumar, Adarsh Kumar
Abstract:
A body farm is a research facility where research is done on forensic investigation and medico-legal disciplines such as forensic entomology, forensic pathology, forensic anthropology, forensic archaeology, and related areas of forensic veterinary science. The research is done to collect data on the rate of decomposition (animal and human) and on forensically important insects, to assist in crime detection. The data collected are used by forensic pathologists, forensic experts, and other specialists for the investigation of crime cases and for further research. The research work covers the different conditions of a dead body (fresh, bloated, decayed, dry, and skeletonized) and data on local insects, which depend on the climatic conditions of the local areas of each country. It is therefore a need of the time to collect appropriate data under managed conditions, with a proper set-up, in every country; hence, it is the duty of the scientific community of every country to establish or propose such facilities for justice and social management. Body farms are also used for the training of police, the military, investigative dogs, and other agencies. At present, only four countries, viz. the U.S., Australia, Canada, and the Netherlands, have body farms and related facilities in an organised manner, and there is no body farm in Asia. In India, we have been trying to establish a body farm in the A&N Islands, which are near Singapore, Malaysia, and some other Asian countries. In view of the above, it becomes imperative to discuss the matter with Asian countries so as to collect decomposition data in a proper manner by establishing body farms. We can also share data, knowledge, and expertise, collaborate with one another to make such facilities better, and maintain good scientific relations to promote science and explore ways of investigation at the world level.
Keywords: body farm, rate of decomposition, forensically important flies, time since death
Procedia PDF Downloads 86
23630 The Impact of Inflation Rate and Interest Rate on Islamic and Conventional Banking in Afghanistan
Authors: Tareq Nikzad
Abstract:
Since the first bank was established in 1933, Afghanistan's banking sector has seen a number of variations but has not been able to grow to its full potential because of the civil war. This study investigates the implementation of dual banking in Afghanistan in relation to the effects of inflation and interest rates, using data from World Bank Data (WBD) covering a period of nineteen years. Inflation, the general rise in prices of goods and services over time, presents considerable difficulties for the banking sector. The objectives of this research are to analyze the effect of inflation and interest rates on conventional and Islamic banks in Afghanistan, identify potential differences between these two banking models, and provide insights for policymakers and practitioners. A mixed-methods approach is used to analyze quantitative data and to qualitatively examine the unique difficulties that banks encounter in Afghanistan's economic environment. The findings contribute to the understanding of the relationship between the interest rate, the inflation rate, and the performance of both banking systems in Afghanistan. The paper concludes with recommendations for policymakers and banking institutions to enhance the stability and growth of the banking sector in Afghanistan. From an Islamic perspective, interest is described as 'a prefixed rate for use or borrowing of money'; this prefixed rate, known in Islamic economics as 'riba', has been described as undesirable. Applying a time-series regression technique to annual data from 2003 to 2021, this research examines the effect of the CPI inflation rate and the interest rate on banking in Afghanistan.
Keywords: inflation, Islamic banking, conventional banking, interest, Afghanistan, impact
Procedia PDF Downloads 71
23629 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria
Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur
Abstract:
The cultural park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetlands that must be preserved and managed in both time and space. Managing such a large area is complex and requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information...), so the use of conventional and traditional methods is quite difficult. Remote sensing, through its efficiency in environmental applications, has become an indispensable solution for this kind of study. Remotely sensed imaging data have enabled very interesting applications over the last decade; they can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we try to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites, both of which carry multispectral imagers with 30 m resolution. We have used images acquired over several areas of interest in the National Park of Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained by combining different spectral bands, to extract the wetland fraction of land use. The obtained results show that wetland areas can be accurately distinguished from other land-use themes through fine exploitation of the spectral indices.
Keywords: multispectral data, EO1, landsat, wetlands, Ahaggar, Algeria
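As one example of such a spectral index, McFeeters' Normalized Difference Water Index (NDWI) combines the green and near-infrared bands; wet surfaces reflect strongly in green and weakly in NIR, so NDWI > 0 flags them. The reflectance values below are hypothetical, and the study's actual choice of indices and thresholds may differ:

```python
import numpy as np

# Toy 3x3 "scene": reflectance in the green and near-infrared bands.
# With Landsat/EO-1 data these would be the corresponding 30 m band rasters.
green = np.array([[0.10, 0.12, 0.30],
                  [0.11, 0.35, 0.32],
                  [0.09, 0.33, 0.31]])
nir   = np.array([[0.40, 0.38, 0.10],
                  [0.42, 0.08, 0.09],
                  [0.45, 0.07, 0.11]])

# McFeeters' NDWI: (green - NIR) / (green + NIR); water/wet pixels are > 0.
ndwi = (green - nir) / (green + nir)
wet_mask = ndwi > 0.0

wet_fraction = wet_mask.mean()
print(ndwi.round(2))
print(f"wetland fraction of the scene: {wet_fraction:.0%}")
```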
Procedia PDF Downloads 375
23628 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics
Authors: Janne Engblom, Elias Oikarinen
Abstract:
A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special case of panel data models are dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is endogeneity bias of estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond Generalized method of moments (GMM) estimator which is an extension of the Arellano-Bond model where past values and different transformations of past values of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano–Bover/Blundell–Bond estimator augments Arellano–Bond by making an additional assumption that first differences of instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations—the original equation and the transformed one—and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically by using the Arellano–Bover/Blundell–Bond estimation technique together with ordinary OLS. The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. 
The Arellano–Bover/Blundell–Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data for 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics varied considerably across models and instrument sets. In particular, the choice of instrumental variables affected both the model estimates and their statistical significance. This was particularly clear when comparing OLS estimates with those of the different dynamic panel data models. Estimates provided by the dynamic panel data models were more in line with the theory of housing price dynamics. Keywords: dynamic model, fixed effects, panel data, price dynamics
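The endogeneity problem and the instrumenting idea behind these estimators can be demonstrated on simulated data. The sketch below uses an Anderson-Hsiao-style IV (the precursor of the Arellano-Bond/Blundell-Bond GMM estimators, which stack many such lagged instruments); the sample sizes and the true coefficient rho = 0.5 are assumptions for the demonstration, not values from the study.

```python
import numpy as np

# Simulate a dynamic panel y_it = rho * y_{i,t-1} + alpha_i + eps_it
# and compare biased pooled OLS with a lag-instrumented estimator.
rng = np.random.default_rng(0)
N, T, rho = 2000, 10, 0.5          # individuals, periods, true AR coefficient

alpha = rng.normal(0.0, 1.0, N)    # fixed individual effects
y = np.zeros((N, T))
y[:, 0] = alpha + rng.normal(0.0, 1.0, N)
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + alpha + rng.normal(0.0, 1.0, N)

# Pooled OLS of y_t on y_{t-1} is biased: alpha_i sits in the error term
# and is correlated with the lagged dependent variable.
ylag, ycur = y[:, :-1].ravel(), y[:, 1:].ravel()
rho_ols = np.sum(ylag * ycur) / np.sum(ylag ** 2)

# First-differencing removes alpha_i; instrument the endogenous regressor
# (the lagged difference) with the deeper lag y_{t-2}, which is
# uncorrelated with the differenced error.
dy    = y[:, 2:] - y[:, 1:-1]      # dependent variable:   delta y_t
dylag = y[:, 1:-1] - y[:, :-2]     # endogenous regressor: delta y_{t-1}
z     = y[:, :-2]                  # instrument:           y_{t-2}
rho_iv = np.sum(z * dy) / np.sum(z * dylag)
```

System GMM extends this idea by combining the differenced equation with the level equation and using many lagged instruments simultaneously, which is what the Arellano-Bover/Blundell-Bond estimator does.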
Procedia PDF Downloads 1504
23627 Blockchain-Based Assignment Management System
Authors: Amogh Katti, J. Sai Asritha, D. Nivedh, M. Kalyan Srinivas, B. Somnath Chakravarthi
Abstract:
Today's education systems use learning management system (LMS) portals for scoring and grading student performance and for maintaining student records, and teachers accept assignments through online submissions of .pdf, .doc, .ppt, etc. files. Traditional portals carry a risk of data tampering; we apply a blockchain model in place of this traditional model to avoid data tampering and to provide a decentralized mechanism for overall fairness. Blockchain technology is a better, recommended model because of the following features: a consensus mechanism, a decentralized system, cryptographic encryption, and smart contracts on the Ethereum blockchain. The proposed system ensures data integrity and tamper-proof assignment submission and grading, which is helpful for both students and educators. Keywords: education technology, learning management system, decentralized applications, blockchain
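The tamper-evidence property the abstract relies on can be illustrated with a minimal hash chain standing in for a full Ethereum smart contract. The field names (student, doc) and the SHA-256 chaining are illustrative assumptions, not the paper's implementation.

```python
import hashlib
import json

# Minimal sketch of a tamper-evident submission ledger: each block hashes
# its own contents plus the previous block's hash, so any later edit to a
# stored submission breaks the chain on verification.
def make_block(prev_hash, student, doc_hash):
    block = {"prev": prev_hash, "student": student, "doc": doc_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain):
    """Recompute every hash; any edited block invalidates the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Genesis block plus one further submission, each chained to the last.
chain = [make_block("0" * 64, "alice", hashlib.sha256(b"essay1").hexdigest())]
chain.append(make_block(chain[-1]["hash"], "bob",
                        hashlib.sha256(b"essay2").hexdigest()))
```

On Ethereum, the same guarantee would come from storing the document hashes in a smart contract; the chain structure above is only the underlying idea.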
Procedia PDF Downloads 81
23626 Trend Analysis of Africa’s Entrepreneurial Framework Conditions
Authors: Sheng-Hung Chen, Grace Mmametena Mahlangu, Hui-Cheng Wang
Abstract:
This study explores trends in the Entrepreneurial Framework Conditions (EFCs) across the five African regions. The Global Entrepreneurship Monitor (GEM) is the primary source of data: the data were organized into a panel (2000-2021) and obtained from the National Expert Survey (NES) databases as harmonized by the GEM. The NES is administered to experts in each country, and the GEM collects entrepreneurship data specific to each country, providing information about entrepreneurial ecosystems and their impact on entrepreneurship. The methodology is descriptive, relying mainly on charts and tables, in line with the approach used by the GEM; the secondary source is the literature review. This study focuses on the following GEM indicators: Financing for Entrepreneurs, Government Support and Policies, Taxes and Bureaucracy, Government Programs, Basic School Entrepreneurial Education and Training, Post-School Entrepreneurial Education and Training, R&D Transfer, Commercial and Professional Infrastructure, Internal Market Dynamics, Internal Market Openness, Physical and Service Infrastructure, and Cultural and Social Norms, based on the GEM Report 2020/21. The limitation of the study is the lack of updated data from some countries: countries must fund their own regional studies, and African countries do not participate regularly due to a lack of resources. Keywords: trend analysis, entrepreneurial framework conditions (EFCs), African region, government programs
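The descriptive trend analysis described above amounts to averaging an EFC indicator per region per year and tabulating or charting the result. A minimal pandas sketch follows; the scores, region names, and column name are made up, as the real values come from the GEM NES panel.

```python
import pandas as pd

# Sketch: average one EFC indicator per region per year, the kind of
# region-by-year table the GEM-style charts are built from.
nes = pd.DataFrame({
    "region": ["North", "North", "West", "West", "West", "East"],
    "year":   [2019, 2020, 2019, 2019, 2020, 2020],
    "financing_for_entrepreneurs": [4.1, 4.3, 3.2, 3.6, 3.5, 3.9],
})

trend = (nes.groupby(["region", "year"])["financing_for_entrepreneurs"]
            .mean()
            .unstack("year"))   # regions as rows, years as columns
```

The resulting table (with NaN where a region has no data for a year) can be plotted directly or exported for the report's tables.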
Procedia PDF Downloads 70
23625 Access to Apprenticeships and the Impact of Individual and School Level Characteristics
Authors: Marianne Dæhlen
Abstract:
Periods of apprenticeship are characteristic of many vocational education and training (VET) systems. In many countries, becoming a skilled worker starts with an application for an apprenticeship at a company or another relevant training establishment. In Norway, where this study is conducted, VET students begin with two years of school-based training before applying for a two-year apprenticeship. Previous research has shown that access to apprenticeships differs by family background (socio-economic status, immigrant background, etc.), gender, school grades, and region. The question we raise in this study is whether the status, reputation, or position of the vocational school also contributes to VET students’ access to apprenticeships. Data and methods: Register data containing information about schools’ and VET students’ characteristics will be analyzed in multilevel regression analyses. At the school level, the data contain information on school size, the shares of immigrant and male/female students, and grade requirements for admission. At the VET-student level, the register contains information on, e.g., gender, school grades, educational program/trade, and whether an apprenticeship was obtained. The dataset comprises about 3,000 students. Results: The register data are expected to be received in November 2024; consequently, no results are available at the time of this call. The planned article is part of a larger research project funded by the Norwegian Research Council and will, according to plan, start in December 2024. Keywords: apprenticeships, VET-students’ characteristics, vocational schools, quantitative methods
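The planned multilevel regression, students (level 1) nested in vocational schools (level 2) with a random school intercept, can be sketched on simulated data. Everything below is invented for illustration (variable names, effect sizes, a linear-probability-style outcome); the real analysis will use the Norwegian register data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate students nested in schools: apprenticeship access depends on
# individual grades plus a school-level random effect.
rng = np.random.default_rng(42)
n_schools, n_per = 50, 40
school = np.repeat(np.arange(n_schools), n_per)
school_eff = rng.normal(0.0, 0.15, n_schools)[school]  # school-level variation
grade = rng.normal(4.0, 0.8, n_schools * n_per)        # individual grades

# Linear-probability-style outcome: chance of obtaining an apprenticeship.
p = np.clip(0.2 + 0.1 * grade + school_eff, 0, 1)
apprenticeship = rng.binomial(1, p)

df = pd.DataFrame({"apprenticeship": apprenticeship,
                   "grade": grade, "school": school})
# Random-intercept (multilevel) model: fixed grade effect, school grouping.
model = smf.mixedlm("apprenticeship ~ grade", df, groups=df["school"])
result = model.fit()
```

For a binary outcome a mixed-effects logistic model would normally be preferred; the linear random-intercept fit above is only the simplest runnable stand-in for the nested design.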
Procedia PDF Downloads 8
23624 Data Acquisition System for Automotive Testing According to the European Directive 2004/104/EC
Authors: Herminio Martínez-García, Juan Gámiz, Yolanda Bolea, Antoni Grau
Abstract:
This article presents an interactive system for data acquisition in vehicle testing according to the test process defined in automotive Directive 2004/104/EC. The system was designed and developed by the authors for the Spanish company Applus-LGAI. The project results in a new process involving the creation of the braking cycle test defined in the aforementioned directive. It also allows the analysis of new vehicle features that was not previously feasible, enabling increased interaction with the vehicle. In the short term, the potential users of this system are vehicle manufacturers; in the medium term, the system can be extended to testing other automotive components and to EMC tests. Keywords: automotive process, data acquisition system, electromagnetic compatibility (EMC) testing, European Directive 2004/104/EC
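The core of such a system is an acquisition loop that samples vehicle signals at a fixed rate during the braking cycle and logs timestamped rows. The sketch below is a hedged stand-in: the 100 Hz rate, the simulated constant-deceleration speed signal, and the CSV layout are assumptions, not requirements of the directive or of the actual system.

```python
import csv
import io

# Sketch of a braking-cycle data logger: sample a simulated vehicle-speed
# signal at a fixed rate and write timestamped CSV rows.
SAMPLE_HZ = 100          # assumed sampling rate
DURATION_S = 2.0         # assumed braking-cycle length

def braking_cycle_speed(t, v0=27.8, decel=5.0):
    """Vehicle speed (m/s) under constant deceleration, floored at 0."""
    return max(v0 - decel * t, 0.0)

buf = io.StringIO()      # stands in for the log file
writer = csv.writer(buf)
writer.writerow(["t_s", "speed_mps"])
n_samples = int(SAMPLE_HZ * DURATION_S)
for i in range(n_samples):
    t = i / SAMPLE_HZ
    writer.writerow([f"{t:.3f}", f"{braking_cycle_speed(t):.3f}"])
```

A real implementation would read the signals from the vehicle bus or DAQ hardware instead of a simulated function, but the sample-timestamp-log structure is the same.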
Procedia PDF Downloads 338
23623 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient with small images and little data, it becomes difficult and error-prone when large databases of images must be processed, and the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). We previously developed and proposed a novel strategy, based on segmented images, capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics. In this work, we propose including data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow that, after a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually labeled, the system automatically indicates and suggests new candidate regions, with higher accuracy. We suggest the use of different classifier methods in order to obtain different knowledge-dataset configurations. Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computer
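The label-then-suggest workflow above can be sketched as a standard supervised-classification step: features extracted from segmented curves, labeled examples, a trained classifier flagging new regions. The two features (curve curvature, mean pixel intensity) and the class statistics below are invented for the demonstration, not the paper's actual feature set.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Sketch: train a classifier on manually labeled segmented-curve features,
# then score held-out regions as a proxy for suggesting new candidates.
rng = np.random.default_rng(7)
n = 400
curvature = np.concatenate([rng.normal(0.8, 0.2, n), rng.normal(0.3, 0.2, n)])
intensity = np.concatenate([rng.normal(90, 15, n), rng.normal(140, 15, n)])
X = np.column_stack([curvature, intensity])
y = np.array([1] * n + [0] * n)      # 1 = tension/breakout region, 0 = normal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)     # how well new regions would be flagged
```

Swapping in other classifier methods (k-NN, SVM, random forests) at the `clf` line is exactly the kind of comparison the abstract proposes.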
Procedia PDF Downloads 302
23622 The Design of a Phase I/II Trial of Neoadjuvant RT with Interdigitated Multiple Fractions of Lattice RT for Large High-grade Soft-Tissue Sarcoma
Authors: Georges F. Hatoum, Thomas H. Temple, Silvio Garcia, Xiaodong Wu
Abstract:
Soft tissue sarcomas (STS) represent a diverse group of malignancies with heterogeneous clinical and pathological features. The treatment of extremity STS aims to achieve optimal local tumor control, improved survival, and preservation of limb function. The National Comprehensive Cancer Network guidelines, based on the accumulated clinical data, recommend radiation therapy (RT) in conjunction with limb-sparing surgery for large, high-grade STS measuring greater than 5 cm. Such a treatment strategy can offer a cure. However, when recurrence occurs (in nearly half of patients), the prognosis is poor, with a median survival of 12 to 15 months and only palliative treatment options available. Spatially fractionated radiotherapy (SFRT), a non-mainstream technique with a long history of treating bulky tumors, has gained new attention in recent years due to its unconventional therapeutic effects, such as bystander/abscopal effects. Combining a single fraction of GRID, the original form of SFRT, with conventional RT was shown to marginally increase the rate of pathological necrosis, which has been recognized to correlate positively with overall survival. In an effort to consistently raise the pathological necrosis rate above 90%, multiple fractions of Lattice RT (LRT), a newer form of 3D SFRT, interdigitated with standard RT as neoadjuvant therapy, were evaluated in a preliminary clinical setting. With favorable results, a necrosis rate of over 95% in a small cohort of patients, a phase I/II clinical study was proposed to examine the safety and feasibility of this new strategy. The design of that clinical study is presented herein.
In this single-arm, two-stage phase I/II clinical trial, the primary objectives are for >80% of patients to achieve >90% tumor necrosis and to evaluate toxicity; the secondary objectives are to evaluate local control, disease-free survival, and overall survival (OS), as well as the correlation between clinical response and relevant biomarkers. The study plans to accrue patients over a span of two years. All patients will be treated with the new neoadjuvant RT regimen, in which one of every five fractions of conventional RT is replaced by an LRT fraction, with vertices receiving doses ≥10 Gy while keeping the tumor periphery at or close to 2 Gy per fraction. Surgical removal of the tumor is planned 6 to 8 weeks after the completion of radiation therapy. The study will employ a Pocock-style early stopping boundary to ensure patient safety, and patients will be followed and monitored for a period of five years. Despite much effort, the rarity of the disease has limited novel therapeutic breakthroughs. Although a higher rate of treatment-induced tumor necrosis has been associated with improved OS, with current techniques only 20% of patients with large, high-grade tumors achieve a tumor necrosis rate exceeding 50%. If this new neoadjuvant strategy proves effective, an appreciable improvement in clinical outcome without added toxicity can be anticipated. Given the rarity of the disease, it is hoped that such a study could be orchestrated in a multi-institutional setting. Keywords: lattice RT, necrosis, SFRT, soft tissue sarcoma
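A Pocock-style safety boundary of the kind mentioned above can be sketched numerically: at each interim look, stop if the observed toxicity count would be improbably high under an acceptable toxicity rate. The acceptable rate p0 = 0.2, the look schedule, and the constant nominal level per look are illustrative assumptions, not this trial's actual monitoring parameters.

```python
from scipy.stats import binom

# Sketch: smallest toxicity count at each interim look that triggers
# stopping, using a constant (Pocock-style) nominal level at every look.
def pocock_style_boundary(looks, p0=0.2, alpha=0.05):
    """For each look of n patients, find the smallest count x with
    P(X >= x | n, p0) < alpha, i.e. the stopping threshold."""
    bounds = []
    for n in looks:
        for x in range(n + 1):
            if binom.sf(x - 1, n, p0) < alpha:   # sf(x-1) = P(X >= x)
                bounds.append(x)
                break
    return bounds

# Interim looks after 5, 10, 15, and 20 enrolled patients (assumed schedule).
boundaries = pocock_style_boundary([5, 10, 15, 20])
```

In practice the per-look level would be calibrated so that the overall type I error across all looks is controlled, which is the refinement the Pocock design formalizes.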
Procedia PDF Downloads 58
23621 A Review of Spatial Analysis as a Geographic Information Management Tool
Authors: Chidiebere C. Agoha, Armstong C. Awuzie, Chukwuebuka N. Onwubuariri, Joy O. Njoku
Abstract:
Spatial analysis is a field of study that utilizes geographic or spatial information to understand and analyze patterns, relationships, and trends in data. It is characterized by the use of geographic or spatial information, which allows data to be analyzed in the context of their location and surroundings. It differs from non-spatial (aspatial) techniques, which do not consider the geographic context and may not provide as complete an understanding of the data. Spatial analysis is applied in a variety of fields, including urban planning, environmental science, the geosciences, epidemiology, and marketing, to gain insights and make decisions about complex spatial problems. This review paper explores definitions of spatial analysis from various sources, including examples of its application and different analysis techniques such as buffer analysis, interpolation, kernel density analysis, and multi-distance spatial cluster analysis. It also contrasts spatial analysis with non-spatial analysis. Keywords: aspatial technique, buffer analysis, epidemiology, interpolation
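One of the techniques named above, buffer analysis, reduces to selecting features within a fixed distance of a location. A minimal planar sketch follows; the coordinates and the 2 km radius are assumptions for illustration, and real work would use a GIS library (e.g. shapely/GeoPandas) with a projected coordinate system.

```python
import numpy as np

# Sketch of buffer analysis: which facilities fall inside a 2 km buffer
# around a point of interest? Coordinates are planar, in kilometres.
poi = np.array([3.0, 4.0])                 # point of interest
facilities = np.array([[3.5, 4.2],         # candidate features
                       [6.0, 8.0],
                       [2.1, 3.4],
                       [9.0, 1.0]])
radius_km = 2.0

dists = np.linalg.norm(facilities - poi, axis=1)   # Euclidean distances
inside_buffer = facilities[dists <= radius_km]     # features in the buffer
```

The same selection is what a GIS performs when intersecting a feature layer with a buffer polygon; on geographic (lat/lon) coordinates a geodesic distance would replace the Euclidean one.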
Procedia PDF Downloads 315
23620 IoT Based Agriculture Monitoring Framework for Sustainable Rice Production
Authors: Armanul Hoque Shaon, Md Baizid Mahmud, Askander Nobi, Md. Raju Ahmed, Md. Jiabul Hoque
Abstract:
In the Internet of Things (IoT), devices are linked to the internet through a wireless network, allowing them to collect and transmit data without the need for a human operator. Agriculture relies heavily on wireless sensors, which are a vital component of the IoT. Such a wireless sensor network monitors physical or environmental variables like temperature, sound, vibration, pressure, or motion without relying on a central location or sink, and collaboratively passes its data across the network to be analyzed. As the primary source of plant nutrients, the soil is critical to the agricultural industry's continued growth, and we propose an IoT solution for monitoring it. In the proposed network, the sink node collects groundwater-level and soil-moisture readings from the sensor nodes, computes the means, and transmits them to the gateway, which centralizes the data and forwards it to the website for dissemination. The web server is in charge of storing the soil moisture data and presenting it to the web application's users. We developed this networked method of collecting soil characteristics to improve rice production: paddy land is running out as the population of our nation grows, and the success of this project will depend on the appropriate use of the existing land base. Keywords: IoT based agriculture monitoring, intelligent irrigation, communicating network, rice production
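The node-to-sink-to-gateway flow described above can be sketched as three small steps: nodes report soil-moisture samples, the sink averages them, and the gateway packages the mean for the web server. The payload fields, node names, and JSON format are assumptions; a real deployment would carry these messages over a radio link and MQTT/HTTP.

```python
import json
import statistics

# Simulated per-node soil-moisture samples (% volumetric water content).
sensor_readings = {
    "node-1": [31.2, 30.8, 31.5],
    "node-2": [28.9, 29.4, 29.1],
    "node-3": [33.0, 32.6, 32.8],
}

def sink_aggregate(readings):
    """Sink node: average each node's samples, then the field-wide mean."""
    node_means = {n: statistics.mean(v) for n, v in readings.items()}
    return node_means, statistics.mean(node_means.values())

def gateway_payload(field_mean):
    """Gateway: wrap the mean in the message sent on to the web server."""
    return json.dumps({"metric": "soil_moisture_pct",
                       "value": round(field_mean, 2)})

node_means, field_mean = sink_aggregate(sensor_readings)
payload = gateway_payload(field_mean)
```

The web application would then store and chart these payloads, closing the loop from field sensor to farmer's browser.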
Procedia PDF Downloads 152
23619 Land Cover Classification Using Sentinel-2 Image Data and Random Forest Algorithm
Authors: Thanh Noi Phan, Martin Kappas, Jan Degener
Abstract:
The recently launched Sentinel-2 (S2) satellite (June 2015) brings great potential and opportunities for land use/cover mapping applications, due to its fine spatial resolution, multispectral coverage, and high temporal resolution. So far, only a handful of studies have used real S2 data for land cover classification, and in northern Vietnam, to the best of our knowledge, no studies have used S2 data for land cover mapping. The aim of this study is to provide a preliminary land cover classification using Sentinel-2 data with a rising state-of-the-art classifier, Random Forest. A case study with heterogeneous land use/cover in the east of the Hanoi capital region, Vietnam, was chosen. All ten spectral bands of 10 and 20 m pixel size of the S2 images were used, with the 10 m bands resampled to 20 m. Among several classification algorithms, the supervised Random Forest (RF) classifier was applied because it has been reported as one of the most accurate methods for satellite image classification. The results showed that the red-edge and shortwave infrared (SWIR) bands play an important role in the classification results, and a very high overall accuracy, above 90%, was achieved. Keywords: classify algorithm, classification, land cover, random forest, sentinel 2, Vietnam
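The classification step itself, a Random Forest trained on per-pixel band reflectances, can be sketched with scikit-learn. The four bands, class spectra, and sample sizes below are simulated stand-ins for the real S2 training pixels; `feature_importances_` is the mechanism by which the band contributions (e.g. red-edge, SWIR) can be ranked.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Simulate training pixels for three land-cover classes over four bands.
rng = np.random.default_rng(1)

def pixels(mean_refl, n=300):
    """n pixels scattered around a class-mean reflectance vector."""
    return rng.normal(mean_refl, 0.02, size=(n, len(mean_refl)))

# Crude class signatures: [red, red-edge, NIR, SWIR] reflectances.
water  = pixels([0.05, 0.04, 0.02, 0.01])
forest = pixels([0.04, 0.15, 0.35, 0.12])
urban  = pixels([0.20, 0.22, 0.25, 0.28])

X = np.vstack([water, forest, urban])
y = np.repeat([0, 1, 2], 300)                 # 0=water, 1=forest, 2=urban
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
importances = clf.feature_importances_        # which bands drive the split
```

On real imagery, each scene pixel's band vector is passed to `clf.predict` to produce the land cover map, and accuracy is assessed against held-out reference samples.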
Procedia PDF Downloads 384
23618 Gene Expression Signature-Based Chemical Genomic to Identify Potential Therapeutic Compounds for Colorectal Cancer
Authors: Yen-Hao Su, Wan-Chun Tang, Ya-Wen Cheng, Peik Sia, Chi-Chen Huang, Yi-Chao Lee, Hsin-Yi Jiang, Ming-Heng Wu, I-Lu Lai, Jun-Wei Lee, Kuen-Haur Lee
Abstract:
A wide range of drugs and combinations has been investigated and/or approved over the last decade to treat colorectal cancer (CRC), but the 5-year survival rate remains poor at stages II–IV. Therefore, new, more efficient drugs still need to be developed that will hopefully be included in first-line therapy, or that overcome resistance when it appears, as part of second- or third-line treatments in the near future. In this study, we revealed that heat shock protein 90 (Hsp90) inhibitors have high therapeutic potential in CRC, according to a combined analysis of NCBI's Gene Expression Omnibus (GEO) repository and the chemical genomic database of the Connectivity Map (CMap). We found that the second-generation Hsp90 inhibitor NVP-AUY922 significantly downregulated the activities of a broad spectrum of kinases involved in regulating cell growth arrest and death of NVP-AUY922-sensitive CRC cells. To overcome the NVP-AUY922-induced upregulation of survivin expression, which causes drug insensitivity, we found that combining berberine (BBR), a herbal medicine with potency in inhibiting survivin expression, with NVP-AUY922 resulted in synergistic antiproliferative effects in NVP-AUY922-sensitive and -insensitive CRC cells. Furthermore, we demonstrated that treating NVP-AUY922-insensitive CRC cells with the combination of NVP-AUY922 and BBR caused cell growth arrest through inhibition of CDK4 expression and induction of microRNA-296-5p (miR-296-5p)-mediated suppression of the Pin1–β-catenin–cyclin D1 signaling pathway. Finally, we found that the expression level of Hsp90 in CRC tumor tissues was positively correlated with CDK4 and Pin1 expression levels. Taken together, these results indicate that combined NVP-AUY922 and BBR therapy can inhibit multiple oncogenic signaling pathways in CRC. Keywords: berberine, colorectal cancer, connectivity map, heat shock protein 90 inhibitor
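The CMap-style reasoning above, finding compounds whose perturbation profiles oppose a disease signature, can be illustrated with a toy score. The gene values below are invented, and the real CMap score uses Kolmogorov-Smirnov-based enrichment of up/down gene sets rather than the simple correlation used here.

```python
import numpy as np

# Toy connectivity scoring: a compound whose expression profile
# anticorrelates with the disease signature is a reversal (therapeutic)
# candidate; one that correlates with it mimics the disease.
disease_sig = np.array([ 2.1,  1.8, -1.5, -2.0,  0.9, -1.1])  # CRC vs normal
drug_a      = np.array([-1.9, -1.5,  1.2,  1.8, -0.7,  1.0])  # reverses
drug_b      = np.array([ 1.7,  1.4, -1.0, -1.6,  0.8, -0.9])  # mimics

def connectivity(disease, drug):
    """Pearson correlation; strongly negative = therapeutic candidate."""
    return float(np.corrcoef(disease, drug)[0, 1])

score_a = connectivity(disease_sig, drug_a)   # expected strongly negative
score_b = connectivity(disease_sig, drug_b)   # expected strongly positive
```

Ranking a compound library by such scores against a GEO-derived CRC signature is, in spirit, how candidates like Hsp90 inhibitors surface from the CMap database.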
Procedia PDF Downloads 303
23617 Constructing the Density of States from the Parallel Wang Landau Algorithm Overlapping Data
Authors: Arman S. Kussainov, Altynbek K. Beisekov
Abstract:
This work focuses on building an efficient, universal procedure to construct a single density of states from the multiple pieces of data provided by a parallel implementation of the Wang-Landau Monte Carlo based algorithm. The Ising and Potts models were used as examples of two-dimensional spin lattices for which to construct the densities of states. The sampled energy space was distributed between the individual walkers with certain overlaps. This was done to accommodate the latest development of the algorithm, the density-of-states replica exchange technique. Several factors of immediate importance for a seamless stitching process have been considered, including, but not limited to, the speed and universality of the initial parallel algorithm implementation, as well as the data post-processing required to produce the expected smooth density of states. Keywords: density of states, Monte Carlo, parallel algorithm, Wang Landau algorithm
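The stitching step itself can be sketched concretely: each walker returns ln g(E) on its energy window, defined only up to an additive constant, so successive windows are aligned by the mean offset over their overlap and then merged. The windows below are synthetic slices of a known ln g(E) with arbitrary per-window shifts; a real post-processing step would match in the flattest part of the overlap rather than averaging over all of it.

```python
import numpy as np

# Synthetic reference ln g(E) on 21 energy bins, cut into three
# overlapping windows with arbitrary additive shifts (one per walker).
true_lng = np.linspace(0, 10, 21) ** 1.5
windows = [(0, 9), (6, 15), (12, 21)]              # overlapping index ranges
pieces = [true_lng[a:b] + shift
          for (a, b), shift in zip(windows, [5.0, -3.0, 12.0])]

def stitch(pieces, windows):
    """Shift each piece to agree with the running result on the overlap,
    then append its non-overlapping tail."""
    lo, hi = windows[0]
    merged = pieces[0].copy()
    for (a, b), piece in zip(windows[1:], pieces[1:]):
        offset = np.mean(merged[a - lo:] - piece[:hi - a])  # overlap bins
        merged = np.concatenate([merged, piece[hi - a:] + offset])
        hi = b
    return merged - merged[0]          # normalize so ln g starts at 0

lng = stitch(pieces, windows)
```

Because each piece carries an arbitrary constant, only offsets (not scales) need matching here; with noisy Wang-Landau data the offset would be a fit rather than an exact mean.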
Procedia PDF Downloads 410