Search results for: audiological data
23918 Road Condition Monitoring Using Built-in Vehicle Technology Data, Drones, and Deep Learning
Authors: Judith Mwakalonge, Geophrey Mbatta, Saidi Siuhi, Gurcan Comert, Cuthbert Ruseruka
Abstract:
Transportation agencies worldwide continuously monitor their roads' conditions to minimize road maintenance costs and maintain public safety and ride quality. Existing methods for carrying out road condition surveys involve manual observation of roads using standard survey forms, done by qualified road condition surveyors or engineers either on foot or by vehicle. Automated road condition survey vehicles exist; however, they are very expensive, since they require special vehicles equipped with sensors for data collection together with data processing and computing devices. The manual methods are expensive, time-consuming, infrequent, and can hardly provide real-time information on road conditions. This study contributes to this arena by utilizing built-in vehicle technologies, drones, and deep learning to automate road condition surveys using low-cost technology. A single model is trained to capture flexible pavement distresses (potholes, rutting, cracking, and raveling), thereby providing a more cost-effective and efficient road condition monitoring approach that can also report road conditions in real time. Additionally, data fusion is employed to enhance the road condition assessment with data from vehicles and drones.
Keywords: road conditions, built-in vehicle technology, deep learning, drones
Procedia PDF Downloads 127
23917 Enhancing Student Learning Outcomes Using Engineering Design Process: Case Study in Physics Course
Authors: Thien Van Ngo
Abstract:
The engineering design process is a systematic approach to solving problems. It involves identifying a problem, brainstorming solutions, prototyping and testing solutions, and evaluating the results. The engineering design process can be used to teach students how to solve problems in a creative and innovative way. The aim of this study was to investigate the effectiveness of using the engineering design process to enhance student learning outcomes in a physics course. A mixed-methods design was used. The quantitative data were collected using a pretest-posttest control group design: the pretest was administered to both groups at the beginning of the study and the posttest at the end. The sample was 150 first-year students in the Department of Mechanical Engineering Technology at Cao Thang Technical College in Vietnam in the 2022-2023 school year. The qualitative data were collected through semi-structured interviews with a sample of eight students in the experimental group, conducted after the posttest. The quantitative data were analyzed using independent-samples t-tests, and the qualitative data were analyzed using thematic analysis. The quantitative results showed that students in the experimental group, who were taught using the engineering design process, had significantly higher post-test scores on physics problem-solving than students in the control group, who were taught using the conventional method. The qualitative data showed that students in the experimental group were more motivated and engaged in the learning process than students in the control group, and that they found the engineering design process to be a more effective way of learning physics. The findings of this study suggest that the engineering design process can be an effective way of enhancing student learning outcomes in physics courses: it engages students in the learning process and helps them develop problem-solving skills.
Keywords: engineering design process, problem-solving, learning outcome of physics, students' physics competencies, deep learning
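The group comparison described above is a standard independent-samples t-test. The following is a minimal sketch in Python; the score arrays are illustrative placeholders, not the study's data.

# Minimal sketch of the pretest-posttest group comparison described above.
# The score arrays are illustrative placeholders, not the study's data.
import numpy as np
from scipy import stats

experimental = np.array([72, 80, 68, 85, 77, 74, 81, 79])  # engineering design process group
control = np.array([65, 70, 62, 74, 68, 66, 71, 69])       # conventional method group

t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Post-test scores differ significantly between groups.")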
Procedia PDF Downloads 66
23916 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank
Authors: Jalal Haghighat Monfared, Zahra Akbari
Abstract:
Today, business executives need useful information to make better decisions. Banks have likewise adopted information tools so that, with the help of business intelligence, they can rapidly extract information from sources and direct the decision-making process toward their desired goals. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these components and their relationships are measured by a questionnaire. The statistical population of the study consists of all managers and experts of Mellat Bank's general departments (190 people) who use business intelligence reports. The sample size of 123 was determined randomly by statistical methods. Relevant statistical inference was used for data analysis and hypothesis testing. In the first stage, the normality of the data was investigated using the Kolmogorov-Smirnov test; in the next stage, the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis. Finally, the research hypotheses were tested using structural equation modeling and Pearson's correlation coefficient. The results confirmed the existence of a positive relationship between decision quality and business intelligence capabilities in Mellat Bank. Among the various capabilities, including data quality, integration with other systems, user access, flexibility, and risk management support, the flexibility of the business intelligence system was the most strongly correlated with the dependent variable of the present research. This shows that Mellat Bank needs to pay more attention to choosing business intelligence systems with high flexibility, in terms of the ability to submit custom-formatted reports. Subsequently, the quality of data in business intelligence systems showed the strongest relationship with quality of decision making. Therefore, improving data quality, including the source of the data (internal or external), the type of data (quantitative or qualitative), the credibility of the data, and the perceptions of those who use the business intelligence system, improves the quality of decision making in Mellat Bank.
Keywords: business intelligence, business intelligence capability, decision making, decision quality
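Two of the analysis steps named above (the Kolmogorov-Smirnov normality check and the Pearson correlation) can be sketched with standard tooling. In the sketch below the questionnaire scores are simulated stand-ins; the study's confirmatory factor analysis and structural equation modeling are not reproduced.

# Sketch of two analysis steps from the abstract: a Kolmogorov-Smirnov
# normality check followed by a Pearson correlation. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
bi_capability = rng.normal(3.5, 0.6, 123)                        # scale scores (illustrative)
decision_quality = 0.6 * bi_capability + rng.normal(0, 0.5, 123)

# KS test against a normal distribution fitted to the sample
ks_stat, ks_p = stats.kstest(bi_capability, "norm",
                             args=(bi_capability.mean(), bi_capability.std(ddof=1)))
print(f"KS statistic = {ks_stat:.3f}, p = {ks_p:.3f}")  # p > 0.05 suggests normality

r, p = stats.pearsonr(bi_capability, decision_quality)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")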
Procedia PDF Downloads 113
23915 Modelling of Geotechnical Data Using Geographic Information System and MATLAB for Eastern Ahmedabad City, Gujarat
Authors: Rahul Patel
Abstract:
Ahmedabad, a city located in western India, is experiencing rapid growth due to urbanization and industrialization. It is projected to become a metropolitan city in the near future, resulting in various construction activities. Soil testing is necessary before construction can commence, requiring construction companies and contractors to conduct soil testing periodically. The focus of this study is the process of creating a digitally formatted spatial database that integrates geotechnical data with a Geographic Information System (GIS). Building a comprehensive geotechnical (geo-)database involves three steps: collecting borehole data from reputable sources, verifying the accuracy and redundancy of the data, and standardizing and organizing the geotechnical information for integration into the database. Once the database is complete, it is integrated with GIS, allowing users to visualize, analyze, and interpret geotechnical information spatially. Using the Topo to Raster interpolation process in GIS, estimated values are assigned to all locations based on the sampled geotechnical data values. The study area was contoured for SPT N-values, soil classification, Φ-values, and bearing capacity (t/m²). Various interpolation techniques were cross-validated to ensure the accuracy of the information. The resulting GIS map enables the calculation of SPT N-values, Φ-values, and bearing capacities for different footing widths at various depths. This study highlights the potential of GIS to provide an efficient solution to complex problems that would otherwise be tedious to address by other means. Not only does GIS offer greater accuracy, but it also generates valuable information that can be used as input for correlation analysis. Furthermore, the system serves as a decision support tool for geotechnical engineers.
Keywords: ArcGIS, borehole data, geographic information system, geo-database, interpolation, SPT N-value, soil classification, Φ-value, bearing capacity
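The study's interpolation step used a GIS tool; a comparable open-source sketch of interpolating sampled borehole values onto a grid is shown below. The borehole coordinates and SPT N-values are entirely hypothetical, and scipy's gridded interpolation is a stand-in for the GIS workflow, not the authors' method.

# Sketch of interpolating sampled SPT N-values onto a regular grid,
# analogous to the GIS interpolation above. Borehole data are hypothetical.
import numpy as np
from scipy.interpolate import griddata

# (easting, northing) of boreholes and their SPT N-values at one depth
points = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [80.0, 80.0], [60.0, 30.0]])
spt_n = np.array([12.0, 18.0, 9.0, 25.0, 15.0])

xi = np.linspace(0, 100, 101)
yi = np.linspace(0, 100, 101)
grid_x, grid_y = np.meshgrid(xi, yi)

# Cubic interpolation inside the convex hull; nearest-neighbour elsewhere
surface = griddata(points, spt_n, (grid_x, grid_y), method="cubic")
fallback = griddata(points, spt_n, (grid_x, grid_y), method="nearest")
surface = np.where(np.isnan(surface), fallback, surface)

print("Estimated SPT N-value at (40, 40):", surface[40, 40].round(1))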
Procedia PDF Downloads 74
23914 Using TRACE and SNAP Codes to Establish the Model of Maanshan PWR for SBO Accident
Authors: B. R. Shen, J. R. Wang, J. H. Yang, S. W. Chen, C. Shih, Y. Chiang, Y. F. Chang, Y. H. Huang
Abstract:
In this research, the TRACE code, with the interface code SNAP, was used to simulate and analyze a station blackout (SBO) accident in the Maanshan pressurized water reactor (PWR) nuclear power plant (NPP). There were four main steps. First, the SBO accident data of the Maanshan NPP were collected. Second, the TRACE/SNAP model of the Maanshan NPP was established using these data. Third, this TRACE/SNAP model was used to perform the simulation and analysis of the SBO accident. Finally, the simulation and analysis of the SBO with mitigation equipment was performed. The analysis results of TRACE are consistent with the data of the Maanshan NPP, and according to the TRACE predictions, the mitigation equipment can maintain the safety of Maanshan during an SBO.
Keywords: pressurized water reactor (PWR), TRACE, station blackout (SBO), Maanshan
Procedia PDF Downloads 194
23913 A Comparative and Doctrinal Analysis towards the Investigation of a Right to Be Forgotten in Hong Kong
Authors: Jojo Y. C. Mo
Abstract:
Memories are good. They remind us of people, places and experiences that we cherish. But memories cannot be changed, and there may well be memories that we do not want to remember. This is particularly true of information which causes us embarrassment and humiliation, or simply because it is private – we all want to erase or delete such information. This desire to delete was recently recognised by the Court of Justice of the European Union in the 2014 case of Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González, in which the court ordered Google to remove links to some information about the complainant which he wished to be removed. This so-called 'right to be forgotten' has received serious attention, and significantly, the European Council and the European Parliament enacted the General Data Protection Regulation (GDPR) to provide a more structured and normative framework for implementing the right to be forgotten across the EU. This development in data protection laws will undoubtedly have a significant impact on companies and corporations, not just within the EU but outside as well. Hong Kong, being one of the world's leading financial and commercial centres as well as one of the first jurisdictions in Asia to implement a comprehensive piece of data protection legislation, is therefore a jurisdiction worth looking into. This article aims to investigate: a) whether there is a right to be forgotten under the existing Hong Kong data protection legislation, and b) if not, whether such a provision is necessary and why. The article utilises a comparative methodology based on a study of primary and secondary resources, including scholarly articles, government and law commission reports, working papers, and relevant international treaties, constitutional documents, case law and legislation. The author primarily engages in literature and case-law review as well as comparative and doctrinal analyses. The completed article will provide privacy researchers with more concrete principles and data to conduct further research on privacy and data protection in Hong Kong and internationally, and will provide a basis for policy makers in assessing the rationale and need for a right to be forgotten in Hong Kong.
Keywords: privacy, right to be forgotten, data protection, Hong Kong
Procedia PDF Downloads 191
23912 Damage Assessment Based on Full-Polarimetric Decompositions in the 2017 Colombia Landslide
Authors: Hyeongju Jeon, Yonghyun Kim, Yongil Kim
Abstract:
Synthetic Aperture Radar (SAR) is an effective tool for assessing damage induced by disasters due to its all-weather and day/night acquisition capability. In this paper, the 2017 Colombia landslide was observed using full-polarimetric ALOS/PALSAR-2 data. Polarimetric decompositions, including the Freeman-Durden decomposition and the Cloude decomposition, are utilized to analyze the changes in scattering mechanisms before and after the landslide. These analyses are used to detect the areas damaged by the landslide. Experimental results validate the efficiency of the full-polarimetric SAR data, since the damaged areas can be well discriminated. Thus, we conclude that the proposed method using full-polarimetric data has great potential for damage assessment of landslides.
Keywords: Synthetic Aperture Radar (SAR), polarimetric decomposition, damage assessment, landslide
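As a rough illustration of the eigenvalue-based (Cloude) decomposition mentioned above, the following sketch computes the polarimetric entropy and mean alpha angle for a single pixel from its 3x3 coherency matrix. The matrix values are invented for illustration; the paper's full per-pixel processing and the Freeman-Durden decomposition are not reproduced.

# Sketch of the Cloude (eigenvalue) decomposition for one pixel: entropy H
# and mean alpha angle from a 3x3 coherency matrix T. Values are invented.
import numpy as np

T = np.array([[1.00, 0.30 + 0.10j, 0.05],
              [0.30 - 0.10j, 0.60, 0.02 + 0.01j],
              [0.05, 0.02 - 0.01j, 0.20]])

eigvals, eigvecs = np.linalg.eigh(T)          # Hermitian eigendecomposition, ascending
eigvals = eigvals[::-1]
eigvecs = eigvecs[:, ::-1]

p = eigvals / eigvals.sum()                   # pseudo-probabilities
entropy = -np.sum(p * np.log(p) / np.log(3))  # H in [0, 1]

# alpha_i comes from the first component of each unit eigenvector
alpha_i = np.arccos(np.abs(eigvecs[0, :]))
mean_alpha = np.degrees(np.sum(p * alpha_i))

print(f"Entropy H = {entropy:.3f}, mean alpha = {mean_alpha:.1f} deg")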
Procedia PDF Downloads 393
23911 Using Historical Data for Stock Prediction
Authors: Sofia Stoica
Abstract:
In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies – Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models – a linear regression model, K-Nearest Neighbors (KNN), and a sequential neural network – and algorithms – Multiplicative Weight Update and AdaBoost. We found that the sequential neural network performed best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that using historical data is enough to obtain high accuracy, and that a simple algorithm like linear regression achieves performance similar to more sophisticated models while taking less time and fewer resources to implement.
Keywords: finance, machine learning, opening price, stock market
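The linear baseline reported above is straightforward to reproduce in spirit: fit a linear regressor on lagged closing prices and measure a percentage test error. The sketch below uses synthetic prices and an assumed five-day lag window, not the study's dataset or exact setup.

# Sketch of a linear-regression stock baseline: predict the next closing
# price from the previous few. Prices are synthetic, not the dataset.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
prices = 100 + np.cumsum(rng.normal(0.1, 1.0, 1260))  # ~5 years of daily closes

lags = 5
X = np.column_stack([prices[i:len(prices) - lags + i] for i in range(lags)])
y = prices[lags:]

split = int(0.8 * len(y))                    # chronological train/test split
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])

mape = np.mean(np.abs((y[split:] - pred) / y[split:])) * 100
print(f"Testing error (MAPE): {mape:.2f}%")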
Procedia PDF Downloads 196
23910 Supervised Learning for Cyber Threat Intelligence
Authors: Jihen Bennaceur, Wissem Zouaghi, Ali Mabrouk
Abstract:
The major aim of cyber threat intelligence (CTI) is to provide sophisticated knowledge about cybersecurity threats to ensure internal and external safeguards against modern cyberattacks. Inaccurate, incomplete, outdated, and low-value threat intelligence is the main problem; therefore, data analysis based on AI algorithms is one of the emergent solutions to overcome threat information-sharing issues. In this paper, we propose a supervised machine learning-based algorithm to improve threat information sharing by providing a sophisticated classification of cyber threats and data. Extensive simulations investigate the accuracy, precision, recall, f1-score, and support overall, to validate the designed algorithm and to compare it with several supervised machine learning algorithms.
Keywords: threat information sharing, supervised learning, data classification, performance evaluation
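The evaluation described above (accuracy, precision, recall, f1-score, and support) maps directly onto standard tooling. A minimal sketch with a generic classifier follows; the features, threat labels, and choice of random forest are placeholders, not the paper's algorithm.

# Sketch of training a supervised classifier on threat records and reporting
# the metrics named above. Features and labels are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 12))                        # engineered threat features
y = rng.choice(["malware", "phishing", "ddos"], 1000)  # threat classes (illustrative)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Accuracy, precision, recall, f1-score, and support per class
print(classification_report(y_te, clf.predict(X_te)))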
Procedia PDF Downloads 150
23909 Methodologies, Findings, Discussion, and Limitations in Global, Multi-Lingual Research: We Are All Alone - Chinese Internet Drama
Authors: Patricia Portugal Marques de Carvalho Lourenco
Abstract:
A three-phase methodological multi-lingual path was designed, constructed, and carried out using the 2020 Chinese internet drama series We Are All Alone as a case study. Phase one, the backbone of the research, comprised secondary data analysis, providing the structure on which the next two phases would be built. It incorporated a Google Scholar and a Baidu Index analysis, the Star Network Influence Index, and Mydramalist.com's top two drama reviews, along with an article written about the drama and scrutiny of related Chinese blogs and websites. Phase two was field research carried out across Latin Europe, and phase three was focused on social media, taking into account that perceptions are memory-conditioned, based on the recall of past ideas. Overall, the research has shown the poor cultural expression of Chinese entertainment in Latin Europe and demonstrated the absence of Chinese content among French, Italian, Portuguese and Spanish business-to-consumer retailers; a reflection of its low significance in Latin European markets and of the short life cycle of entertainment products in general – bubble-gum, disposable goods without a mid- to long-term effect on consumers' lives. The process of conducting comprehensive international research was complex and time-consuming: data were not always available in Mandarin, and the work was constrained by the researcher's linguistic deficiency, limited knowledge of Chinese culture, and issues of cultural equivalence. Despite the steps taken to minimize them, theoretical limitations concerning Latin Europe and China still occurred. Data accuracy was disputable; sampling and data collection/analysis methods were heterogeneous; and ascertaining the data requirements and the method of analysis needed to achieve construct equivalence was challenging and laborious to operationalize. Secondary data were also often not readily available in Mandarin; yet, in spite of the array of limitations, the research was done, and results were produced.
Keywords: research methodologies, international research, primary data, secondary data, research limitations, online dramas, China, Latin Europe
Procedia PDF Downloads 68
23908 Node Insertion in Coalescence Hidden-Variable Fractal Interpolation Surface
Authors: Srijanani Anurag Prasad
Abstract:
The Coalescence Hidden-variable Fractal Interpolation Surface (CHFIS) is built by combining interpolation data through an Iterated Function System (IFS). When a single point is inserted, the interpolation data of a CHFIS acquire a row and/or column of uncertain values. Alternatively, a row and/or column of additional points can be placed in the given interpolation data to demonstrate the node-inserted CHFIS. There are three techniques for inserting new points, corresponding to the row and/or column of nodes inserted, and each method is further classified into four types based on the values of the inserted nodes. As a result, numerous forms of node insertion can be found in a CHFIS.
Keywords: fractal, interpolation, iterated function system, coalescence, node insertion, knot insertion
Procedia PDF Downloads 101
23907 Optimizing the Efficiency of Measuring Instruments in Ouagadougou-Burkina Faso
Authors: Moses Emetere, Marvel Akinyemi, S. E. Sanni
Abstract:
At the moment, the AERONET and AMMA databases show a large volume of data loss. With only about 47% of the data set available to scientists, it is evident that accurate nowcasts or forecasts cannot be guaranteed. The calibration constants of most radiosondes or weather stations are not compatible with the atmospheric conditions of the West African climate. A dispersion model was developed to incorporate salient mathematical representations, such as a unified number derived to describe the turbulence of aerosol transport in the frictional layer of the lower atmosphere. Fourteen years of data from the Multi-angle Imaging SpectroRadiometer (MISR) were tested using the dispersion model. A yearly estimation of the atmospheric constants over Ouagadougou was obtained with the model at about 87.5% accuracy. It further revealed that the average atmospheric constants for Ouagadougou, Burkina Faso are a_1 = 0.626 and a_2 = 0.7999, with tuning constants n_1 = 0.09835 and n_2 = 0.266. The yearly atmospheric constants also affirmed that the lower atmosphere of Ouagadougou is very dynamic. Hence, it is recommended that radiosonde and weather station manufacturers constantly review the atmospheric constants over a geographical location to enable about eighty percent data retrieval.
Keywords: aerosols retention, aerosols loading, statistics, analytical technique
Procedia PDF Downloads 315
23906 Modern Imputation Technique for Missing Data in Linear Functional Relationship Model
Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, Rahmatullah Imon
Abstract:
The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, evaluated with three performance indicators: the mean absolute error (MAE), the root mean square error (RMSE), and the estimated bias (EB). In this study, we applied these methods to imputing missing values in the LFRM. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.
Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators
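The paper's EM and EMB algorithms are not reproduced here, but the flavour of iterative imputation scored with MAE and RMSE can be sketched as follows. scikit-learn's IterativeImputer is an EM-like stand-in under invented data, not the authors' implementation.

# Sketch of iterative (EM-like) imputation scored with MAE and RMSE.
# IterativeImputer is a stand-in for the paper's EM/EMB algorithms.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(3)
x = rng.normal(10, 2, 200)
data = np.column_stack([x, 2.0 * x + rng.normal(0, 1, 200)])  # linear relationship

mask = rng.random(data.shape) < 0.1      # knock out ~10% of values
truth = data.copy()
data[mask] = np.nan

imputed = IterativeImputer(max_iter=50, random_state=0).fit_transform(data)

mae = np.mean(np.abs(imputed[mask] - truth[mask]))
rmse = np.sqrt(np.mean((imputed[mask] - truth[mask]) ** 2))
print(f"MAE = {mae:.3f}, RMSE = {rmse:.3f}")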
Procedia PDF Downloads 400
23905 Imputing Missing Data in Electronic Health Records: A Comparison of Linear and Non-Linear Imputation Models
Authors: Alireza Vafaei Sadr, Vida Abedi, Jiang Li, Ramin Zand
Abstract:
Missing data is a common challenge in medical research and can lead to biased or incomplete results. When data bias leaks into models, it further exacerbates health disparities; biased algorithms can lead to misclassification and reduced resource allocation and monitoring as part of prevention strategies for certain minorities and vulnerable segments of patient populations, which in turn further reduces the data footprint from the same populations – a vicious cycle. This study compares the performance of six imputation techniques, grouped into linear and non-linear models, on two different real-world electronic health record (EHR) datasets representing 17,864 patient records. The mean absolute percentage error (MAPE) and root mean squared error (RMSE) are used as performance metrics, and the results show that the linear models outperformed the non-linear models on both metrics. These results suggest that linear models can sometimes be the optimal choice for imputing laboratory variables, in terms of imputation efficiency and the uncertainty of predicted values.
Keywords: EHR, machine learning, imputation, laboratory variables, algorithmic bias
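A minimal way to run the linear-versus-non-linear comparison described above is sketched below, pitting a linear-regression-based imputer against a KNN imputer and scoring both by MAPE and RMSE. The correlated laboratory-like values are synthetic; neither the EHR data nor the paper's exact six techniques are reproduced.

# Sketch comparing a linear and a non-linear imputer by MAPE and RMSE,
# in the spirit of the comparison above. Data are synthetic.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer, KNNImputer
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(4)
base = rng.normal(100, 15, (500, 1))
labs = np.hstack([base, base * 0.8 + rng.normal(0, 5, (500, 1)),
                  base * 1.2 + rng.normal(0, 5, (500, 1))])  # correlated lab values

mask = rng.random(labs.shape) < 0.15
truth, labs_missing = labs.copy(), labs.copy()
labs_missing[mask] = np.nan

def score(imputer):
    filled = imputer.fit_transform(labs_missing)
    err = filled[mask] - truth[mask]
    mape = np.mean(np.abs(err / truth[mask])) * 100
    return mape, np.sqrt(np.mean(err ** 2))

for name, imp in [("linear", IterativeImputer(estimator=BayesianRidge(), random_state=0)),
                  ("non-linear (KNN)", KNNImputer(n_neighbors=5))]:
    mape, rmse = score(imp)
    print(f"{name:>16}: MAPE = {mape:.2f}%, RMSE = {rmse:.3f}")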
Procedia PDF Downloads 85
23904 Shape Management Method of Large Structure Based on Octree Space Partitioning
Authors: Gichun Cha, Changgil Lee, Seunghee Park
Abstract:
The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology has only recently been attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures; it provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. Such point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and the maintenance of civil infrastructure. TLS produces a huge number of points, and the registration, extraction, and visualization of the data require the processing of a massive amount of scan data. The octree can be applied to the shape management of large structures because the scan data are reduced in size while the data attributes are maintained. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University, the scanned structure is a steel-frame bridge, and the TLS used was a Leica ScanStation C10/C5. The scan data were condensed by 92%, and the octree model was constructed at a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, creating a basis for the shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning
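The recursive eight-way subdivision described above is compact when written out. The following is a toy octree that splits a cubic voxel until each leaf holds at most a few points; it is illustrative only, not the authors' processing pipeline, and the point cloud is random.

# Toy octree: recursively split a cubic voxel into eight sub-voxels until
# each leaf holds at most `capacity` points. Illustrative, not the pipeline.
import numpy as np

def build_octree(points, center, half, capacity=4, depth=0, max_depth=8):
    node = {"center": center, "half": half, "points": points, "children": []}
    if len(points) <= capacity or depth == max_depth:
        return node
    # Octant code 0..7 from the sign of each coordinate offset
    offsets = points >= center
    codes = offsets[:, 0] * 4 + offsets[:, 1] * 2 + offsets[:, 2]
    for code in range(8):
        sel = codes == code
        if not sel.any():
            continue
        signs = np.array([1 if code & 4 else -1,
                          1 if code & 2 else -1,
                          1 if code & 1 else -1])
        child_center = center + 0.5 * half * signs
        node["children"].append(build_octree(points[sel], child_center, 0.5 * half,
                                             capacity, depth + 1, max_depth))
    node["points"] = None  # points now live in the children
    return node

def count_leaves(node):
    return 1 if not node["children"] else sum(count_leaves(c) for c in node["children"])

pts = np.random.default_rng(5).uniform(-1, 1, (5000, 3))  # stand-in for a TLS point cloud
tree = build_octree(pts, center=np.zeros(3), half=1.0)
print("leaf voxels:", count_leaves(tree))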
Procedia PDF Downloads 297
23903 Exploiting Kinetic and Kinematic Data to Plot Cyclograms for Managing the Rehabilitation Process of BKAs by Applying Neural Networks
Authors: L. Parisi
Abstract:
Kinematic data correlate vector quantities in space with scalar parameters in time to assess the degree of symmetry between the intact limb and the amputated limb with respect to a normal model derived from the gait of control-group participants. Furthermore, these data allow a doctor to preliminarily evaluate the usefulness of a given rehabilitation therapy. Kinetic curves allow the analysis of ground reaction forces (GRFs) to assess the appropriateness of human motion. Electromyography (EMG) allows the analysis of the fundamental lower-limb force contributions to quantify the level of gait asymmetry. However, the use of this technological tool is expensive and requires the patient's hospitalization. This research work suggests overcoming the above limitations by applying artificial neural networks.
Keywords: kinetics, kinematics, cyclograms, neural networks, transtibial amputation
Procedia PDF Downloads 446
23902 Urbanization and Built Environment: Impacts of Squatter Slums on Degeneration of Urban Built Environment, a Case Study of Karachi
Authors: Mansoor Imam, Amber Afshan, Sumbul Mujeeb, Kamran Gill
Abstract:
An investigative approach was taken to study how the quality of living prevailing in the squatter slums of Karachi is influencing urbanization trends and the environmental degeneration of the built environment. The paper identifies the issues and aspects that have directly and indirectly impacted this degeneration, owing to inadequate basic infrastructural amenities, substandard housing, overcrowding, poor ventilation in homes and workplaces, and noncompliance with building bye-laws and regulations. Secondary data were critically examined and analyzed, including, but not limited to, census data, demographic and socioeconomic data, official documents, and other relevant secondary data obtained from the existing literature and GIS. It is observed that poor and substandard housing and living quality have serious adverse impacts on the environment and the health of city residents. Hence, strategies for improving the quality of the built environment for sustainable living are mandated. It is therefore imperative to check and prevent further degradation and to promote harmonious living and sustainable urbanization.
Keywords: squatter slums, urbanization, degenerations, living quality, built environment
Procedia PDF Downloads 394
23901 Assessment of the Contribution of Geographic Information System Technology in Non Revenue Water: Case Study Dar Es Salaam Water and Sewerage Authority Kawe - Mzimuni Street
Authors: Victor Pesco Kassa
Abstract:
This research assesses the contribution of GIS technology to addressing non-revenue water (NRW). It was conducted in Dar es Salaam, at Kawe Mzimuni Street. The data were obtained from an existing source, DAWASA HQ, and interpreted using ArcGIS software. The collected data reveal good coverage of DAWASA's water network at Mzimuni Street: most residents are connected to DAWASA's customer service. The data also reveal that, through GIS, DAWASA's customer geodatabase has been improved. With GIS, a customer location map can be prepared for site surveying; this map can also show the different types of customers connected to DAWASA's water service. This is a clear contribution of GIS technology to addressing and managing the problem of NRW in DAWASA. Finally, the study recommends that the same study be conducted in other DAWASA zones, such as Temeke, Boko and Bagamoyo, not only at Kawe Mzimuni Street. Through this study, it is observed that ArcGIS software offers powerful tools for managing and processing information geographically in water and sanitation authorities such as DAWASA.
Keywords: DAWASA, NRW, Esri, EURA, ArcGIS
Procedia PDF Downloads 83
23900 Robustified Asymmetric Logistic Regression Model for Global Fish Stock Assessment
Authors: Osamu Komori, Shinto Eguchi, Hiroshi Okamura, Momoko Ichinokawa
Abstract:
Long time-series data on population assessments are essential for global ecosystem assessment because the temporal change of biomass in such a database properly reflects the status of the global ecosystem. However, the available assessment data usually have limited sample sizes, and the ratio of populations with low abundance of biomass (collapsed) to those with high abundance (non-collapsed) is highly imbalanced. To allow for the imbalance and uncertainty involved in ecological data, we propose a binary regression model with mixed effects for inferring ecosystem status through an asymmetric logistic model. In the estimation equation, we observe that the weights for the non-collapsed populations are relatively reduced, which in turn puts more importance on the small number of observations of collapsed populations. Moreover, we extend the asymmetric logistic regression model using propensity scores to allow for the sample biases observed in the labeled and unlabeled datasets. This robustified the estimation procedure and improved the model fit.
Keywords: double robust estimation, ecological binary data, mixed effect logistic regression model, propensity score
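The authors' asymmetric logistic model with mixed effects and propensity-score correction is specialized machinery that is not reproduced here; the sketch below only conveys the underlying weighting idea, using an ordinary class-weighted logistic regression on simulated imbalanced data as a stand-in, not the paper's estimator.

# Simplified stand-in for the weighting idea above: a class-weighted logistic
# regression that puts more weight on the rare "collapsed" class.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(6)
n = 1000
X = rng.normal(size=(n, 3))                        # stock-level covariates (illustrative)
logits = -2.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1]      # rare positive (collapsed) class
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)
print("collapsed fraction:", y.mean().round(3))

plain = LogisticRegression().fit(X, y)
weighted = LogisticRegression(class_weight="balanced").fit(X, y)

for name, m in [("plain", plain), ("class-weighted", weighted)]:
    print(name, "balanced accuracy:", round(balanced_accuracy_score(y, m.predict(X)), 3))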
Procedia PDF Downloads 268
23899 Research on Hangzhou Commercial Center System Based on Point of Interest Data
Authors: Chen Wang, Qiuxiao Chen
Abstract:
With the advent of the information age and the era of big data, urban planning research is no longer satisfied with the analysis and application of traditional data. Given the limitations of traditional research on urban commercial center systems, big data provides new opportunities for urban research. Therefore, based on quantitative big data evaluation methods, the commercial center system of the main city of Hangzhou is analyzed and evaluated, and the scale and hierarchical structure characteristics of the urban commercial center system are studied. To make up for the shortcomings of existing POI extraction methods, a POI extraction method based on adaptive adjustment of the search window is proposed, which can accurately and efficiently extract commercial POI data for the main city of Hangzhou. Through visualization and kernel density analysis of the extracted Point of Interest (POI) data, the current state of the commercial center system in the main city of Hangzhou is evaluated. This is then compared with the commercial center system structure of the 'Hangzhou City Master Plan (2001-2020)', the problems existing in the planned urban commercial center system are analyzed, and corresponding suggestions and an optimization strategy are provided for the planning of the Hangzhou commercial center system. The following conclusions are drawn: the current commercial center system in the main city of Hangzhou presents one first-level main center, one second-level main center, three third-level sub-centers, and multiple community-level business centers. Generally speaking, the construction of the main centers in the commercial center system is basically up to standard, while there is still a big gap in the construction of the sub-centers and the regional-level commercial centers, where further construction is needed. The paper therefore proposes an optimized hierarchical functional system that organizes commercial centers in an orderly manner, strengthens the radiating influence of the centers to drive surrounding areas, and implements construction guidance for the centers, effectively promoting the development of group formation and further improving the commercial center system structure of the main city of Hangzhou.
Keywords: business center system, business format, main city of Hangzhou, POI extraction method
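The kernel density step used to evaluate the commercial center system can be sketched with generic tooling. The coordinates below are random stand-ins for the extracted commercial POIs, and the bandwidth is an assumed value; the adaptive search-window extraction itself is not reproduced.

# Sketch of kernel density analysis applied to commercial POI data.
# Coordinates and bandwidth are stand-ins, not the study's values.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(7)
# Two artificial commercial clusters plus scattered background POIs
cluster_a = rng.normal([120.15, 30.25], 0.01, (300, 2))
cluster_b = rng.normal([120.20, 30.20], 0.015, (200, 2))
background = rng.uniform([120.05, 30.10], [120.30, 30.35], (100, 2))
pois = np.vstack([cluster_a, cluster_b, background])  # (lon, lat)

kde = KernelDensity(bandwidth=0.01).fit(pois)

# Evaluate density on a grid; peaks indicate candidate commercial centers
lon, lat = np.meshgrid(np.linspace(120.05, 120.30, 100), np.linspace(30.10, 30.35, 100))
grid = np.column_stack([lon.ravel(), lat.ravel()])
density = np.exp(kde.score_samples(grid)).reshape(lon.shape)
peak = np.unravel_index(density.argmax(), density.shape)
print("densest cell near lon/lat:", lon[peak].round(3), lat[peak].round(3))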
Procedia PDF Downloads 140
23898 Stakeholder Analysis of Agricultural Drone Policy: A Case Study of the Agricultural Drone Ecosystem of Thailand
Authors: Thanomsin Chakreeves, Atichat Preittigun, Ajchara Phu-ang
Abstract:
This paper presents a stakeholder analysis of agricultural drone policies intended to meet the government's goal of building an agricultural drone ecosystem in Thailand. Firstly, case studies from other countries are reviewed. The stakeholder analysis method and qualitative data from the interviews are then presented, including data from the Institute of Innovation and Management, the Office of National Higher Education Science Research and Innovation Policy Council, agricultural entrepreneurs, and farmers. The study and interview data are then employed to describe the current ecosystem and to guide the implementation of agricultural drone policies that are suitable for the ecosystem of Thailand. Finally, policy recommendations are made that the Thai government should adopt in the future.
Keywords: drone public policy, drone ecosystem, policy development, agricultural drone
Procedia PDF Downloads 149
23897 Study and Analysis of Optical Intersatellite Links
Authors: Boudene Maamar, Xu Mai
Abstract:
Optical Intersatellite Links (OISLs) are wireless communication links that use optical signals to interconnect satellites. They are expected to be the next-generation wireless communication technology owing to their inherent characteristics: increased bandwidth, high data rate, data transmission security, immunity to interference, and an unregulated spectrum, among others. Optical space links are the best choice over classical communication schemes due to their distinctive properties – high frequency, small antenna diameter, and low transmitted power – which are critical factors in defining a space communication system. This paper discusses the development of free-space technology and analyses the parameters and factors needed to establish reliable intersatellite links that use an optical signal to exchange data between satellites.
Keywords: optical intersatellite links, optical wireless communications, free space optical communications, next generation wireless communication
Procedia PDF Downloads 447
23896 Sunshine Hour as a Factor to Maintain the Circadian Rhythm of Heart Rate: Analysis of Ambulatory ECG and Weather Big Data
Authors: Emi Yuda, Yutaka Yoshida, Junichiro Hayano
Abstract:
A distinct circadian rhythm of activity, i.e., high activity during the day and deep rest at night, is a typical feature of a healthy lifestyle. Exposure to daylight is thought to be an important factor in increasing arousal level and maintaining a normal circadian rhythm. To examine whether sunshine hours influence the day-night contrast of activity, we analyzed the relationship between 24-hour heart rate (HR) and the weather data of the recording day. We analyzed data from 36,500 males and 49,854 females in the Allostatic State Mapping by Ambulatory ECG Repository (ALLSTAR) database in Japan. Median (IQR) sunshine duration was 5.3 (2.8-7.9) hr. While sunshine hours had only modest increasing effects on 24-hour average HR in both genders (P = 0.0282 and 0.0248 for males and females) and no significant effects on nighttime HR in either gender, they increased daytime HR (P = 0.0007 and 0.0015) and the day-night HR difference in both genders (P < 0.0001 for both), even after adjusting for the effects of average temperature, atmospheric pressure, and humidity. Our observations support the hypothesis that longer sunshine hours enhance the circadian rhythm of activity.
Keywords: big data, circadian rhythm, heart rate, sunshine
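The covariate adjustment described above (sunshine hours predicting an HR measure while controlling for temperature, pressure, and humidity) corresponds to an ordinary multiple regression. The sketch below uses simulated recordings and invented coefficients, not the ALLSTAR data.

# Sketch of the adjusted analysis above: regress a day-night HR difference on
# sunshine hours while controlling for weather covariates. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 5000
sunshine = rng.uniform(0, 12, n)                 # hours
temperature = rng.normal(16, 8, n)
pressure = rng.normal(1013, 8, n)
humidity = rng.uniform(30, 90, n)
hr_diff = 8 + 0.4 * sunshine - 0.05 * temperature + rng.normal(0, 4, n)

X = sm.add_constant(np.column_stack([sunshine, temperature, pressure, humidity]))
fit = sm.OLS(hr_diff, X).fit()
print(fit.summary(xname=["const", "sunshine", "temp", "pressure", "humidity"]))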
Procedia PDF Downloads 165
23895 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector
Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau
Abstract:
A performance measurement framework (PMF) is an essential tool in any organization for assessing the performance of its processes; it guides businesses to stay on track with their objectives and to benchmark themselves against the market. With the growing trend toward the digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or industry. In this context, this paper covers a variety of performance measurement methodologies, overviews the major AI and Big Data applications in the banking sector, and covers an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, drawn from market leaders, on the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure for intelligent digital solutions. This classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application, arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. The proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications to adopt, and to help them meet their business objectives by understanding the potential impact of such solutions on the entire organization.
Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement
Procedia PDF Downloads 198
23894 Hybridized Approach for Distance Estimation Using K-Means Clustering
Authors: Ritu Vashistha, Jitender Kumar
Abstract:
Clustering with the K-means algorithm is a very common way to understand and analyze output data: grouping similar objects is the basis of clustering. Given a set of objects to be grouped into K clusters, each cluster has its own centroid, and the major problem is how to identify whether a cluster is correct based on the data. The formulation of clusters is not a single pass over every tuple, row, record, or entity, but an iterative process: each record is checked and examined, and similarity and dissimilarity are evaluated. This iterative process is very lengthy and unable to give an optimal output for the clusters or for the time taken to find them. To overcome this drawback, we propose a formula to find the clusters at run time, so that this approach can give optimal results. The proposed approach uses the Euclidean distance formula as well as melanosis to find the minimum distance between slots, as we technically call clusters. We have also applied the same approach to the Ant Colony Optimization (ACO) algorithm, which results in the production of two- and multi-dimensional matrices.
Keywords: ant colony optimization, data clustering, centroids, data mining, k-means
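The iterative assign-and-update loop that the abstract refers to is compact when written out. Below is a minimal K-means with Euclidean distance on synthetic points; it illustrates the baseline algorithm only, not the proposed run-time formula or the ACO variant.

# Minimal K-means with Euclidean distance: assign each point to the nearest
# centroid, then recompute centroids, until assignments stabilize.
import numpy as np

def kmeans(points, k, max_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(max_iter):
        # Euclidean distance from every point to every centroid
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centroids = np.array([points[labels == j].mean(axis=0)
                                  if np.any(labels == j) else centroids[j]
                                  for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(c, 0.3, (100, 2)) for c in ([0, 0], [3, 3], [0, 3])])
centers, labels = kmeans(pts, k=3)
print(np.round(centers, 2))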
Procedia PDF Downloads 128
23893 Digital Twin for University Campus: Workflow, Applications and Benefits
Authors: Frederico Fialho Teixeira, Islam Mashaly, Maryam Shafiei, Jurij Karlovsek
Abstract:
The ubiquity of data gathering and smart technologies, advancements in virtual technologies, and the development of the internet of things (IoT) have created urgent demand for frameworks and efficient workflows for data collection, visualisation, and analysis. A digital twin, at scales from the city down to the building, allows data from different sources to be brought together to generate fundamental and illuminating insights for the management of current facilities and the lifecycle of amenities, as well as for improving the performance of current and future designs. Over the past two decades, there has been growing interest in digital twins and their applications at the city and building scales. Most such studies look at the urban environment through a homogeneous or generalist lens and lack specificity about the particular characteristics or identities that define an urban university campus. Bridging this knowledge gap, this paper offers a framework for developing a digital twin for a university campus that, with some modifications, could provide insights for any large-scale digital twin setting, such as towns and cities. It showcases how currently unused data could be purposefully combined, interpolated, and visualised to produce analysis-ready data (such as flood or energy simulations or functional and occupancy maps), highlighting the potential applications of such a framework for campus planning and policymaking. The research integrates campus-level data layers into one spatial information repository and casts light on the critical data clusters for a digital twin at the campus level. The paper also seeks to raise insightful and directive questions on how a campus digital twin can be extrapolated to a city-scale digital twin. The outcomes of the paper thus inform future projects for the development of large-scale digital twins, as well as urban and architectural researchers, on potential applications of digital twins in future design, management, and sustainable planning: to predict problems, calculate risks, decrease management costs, and improve performance.
Keywords: digital twin, smart campus, framework, data collection, point cloud
Procedia PDF Downloads 70
23892 Impact of Job Burnout on Job Satisfaction and Job Performance of Front Line Employees in Bank: Moderating Role of Hope and Self-Efficacy
Authors: Huma Khan, Faiza Akhtar
Abstract:
The present study investigates the effects of burnout on job performance and job satisfaction, with the moderating roles of hope and self-efficacy. Findings from 310 frontline employees of Pakistani commercial banks (Lahore, Karachi and Islamabad) disclosed that burnout has significant negative effects on job performance and job satisfaction. A simple random sampling technique was used to collect the data, and inferential statistics were applied to analyze them. However, the results disclosed no moderating effect of hope on the relationship of burnout with job performance or job satisfaction. The data did, however, significantly support the moderating effect of self-efficacy. The study further sheds light on the development of psychological capital, and the importance and implications of the current findings are discussed.
Keywords: burnout, hope, job performance, job satisfaction, psychological capital, self-efficacy
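Moderation of the burnout-outcome link by self-efficacy, as tested above, is conventionally checked with an interaction term in a regression. A sketch on simulated survey scores follows; the variable names, scales, and coefficients are invented, not the study's data or exact analysis.

# Sketch of a moderation test: job performance regressed on burnout,
# self-efficacy, and their interaction. Survey scores are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 310
df = pd.DataFrame({"burnout": rng.normal(3, 0.8, n), "self_eff": rng.normal(3.5, 0.7, n)})
df["performance"] = (4.5 - 0.5 * df.burnout + 0.3 * df.self_eff
                     + 0.2 * df.burnout * df.self_eff + rng.normal(0, 0.5, n))

fit = smf.ols("performance ~ burnout * self_eff", data=df).fit()
print(fit.params)                            # burnout:self_eff is the moderation term
print(fit.pvalues["burnout:self_eff"])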
Procedia PDF Downloads 142
23891 Obstacle Classification Method Based on 2D LIDAR Database
Authors: Moohyun Lee, Soojung Hur, Yongwan Park
Abstract:
This paper proposes a method that uses only a LIDAR system to classify an obstacle and determine its type, by establishing a LIDAR-based database for classifying obstacles. In recognizing obstructions for an autonomous vehicle, the existing LIDAR system has advantages in terms of accuracy and shorter recognition time. However, it was difficult to determine the type of obstacle, so accurate path planning based on obstacle type was not possible. To overcome this problem, a method of classifying obstacle type based on existing LIDAR, using the width of the obstacle material, was previously proposed; however, width measurement alone was not sufficient to improve accuracy. In this research, the width data were used for a first classification; a database of LIDAR intensity data for four major obstacle materials found on the road was created; the LIDAR intensity data of actual obstacle materials are compared against it; and the obstacle type is determined by finding the material with the highest similarity value. An experiment using an actual autonomous vehicle in a real environment shows that, although the data declined in quality compared to 3D LIDAR, it was possible to classify obstacle materials using 2D LIDAR.
Keywords: obstacle, classification, database, LIDAR, segmentation, intensity
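The database-matching step described above (compare an observed intensity distribution against stored material profiles and pick the most similar) can be sketched generically. The intensity histograms and the four material names below are invented placeholders, not measured LIDAR returns or the paper's database.

# Sketch of the similarity lookup described above: compare an observed LIDAR
# intensity histogram against stored material profiles. Values are invented.
import numpy as np

def histo(mean, spread, bins=16, lo=0, hi=255):
    centers = np.linspace(lo, hi, bins)
    h = np.exp(-0.5 * ((centers - mean) / spread) ** 2)
    return h / h.sum()

# Stored profiles for four road obstacle materials (illustrative shapes)
database = {"metal": histo(200, 20), "concrete": histo(120, 30),
            "wood": histo(90, 25), "plastic": histo(60, 20)}

observed = histo(115, 28)          # intensity histogram of the scanned obstacle

def similarity(p, q):              # histogram intersection in [0, 1]
    return np.minimum(p, q).sum()

best = max(database, key=lambda m: similarity(observed, database[m]))
print({m: round(similarity(observed, database[m]), 3) for m in database})
print("classified as:", best)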
Procedia PDF Downloads 352
23890 Body Farming in India and Asia
Authors: Yogesh Kumar, Adarsh Kumar
Abstract:
A body farm is a research facility where research is done on forensic investigation and medico-legal disciplines such as forensic entomology, forensic pathology, forensic anthropology, forensic archaeology, and related areas of forensic veterinary science. The research collects data on the rate of decomposition (animal and human) and on forensically important insects to assist in crime detection. The data collected are used by forensic pathologists, forensic experts, and other specialists for the investigation of crime cases and for further research. The research work covers the different states of a dead body – fresh, bloated, decayed, dry, and skeletonized – and data on local insects, which depend on the climatic conditions of the local areas of a country. It is therefore the need of the time to collect appropriate data under managed conditions, with a proper set-up, in every country, and it is the duty of the scientific community of every country to establish or propose such facilities for justice and social management. Body farms are also used for the training of police, military, investigative dogs, and other agencies. At present, only four countries – the U.S., Australia, Canada, and the Netherlands – have body farms and related facilities in an organised manner, and there is no body farm in Asia. In India, we have been trying to establish a body farm in the A&N Islands, which lie near Singapore, Malaysia, and some other Asian countries. In view of the above, it becomes imperative to discuss the matter with Asian countries so as to collect decomposition data in a proper manner by establishing body farms. We can also share data, knowledge, and expertise, collaborate with one another to make such facilities better, and build good scientific relations to promote science and explore ways of investigation at the world level.
Keywords: body farm, rate of decomposition, forensically important flies, time since death
Procedia PDF Downloads 88
23889 The Impact of Inflation Rate and Interest Rate on Islamic and Conventional Banking in Afghanistan
Authors: Tareq Nikzad
Abstract:
Since the first bank was established in 1933, Afghanistan's banking sector has seen a number of changes but has not been able to grow to its full potential because of the civil war. This study investigates the operation of dual banking in Afghanistan in relation to the effects of inflation and interest rates, using data from World Bank Data (WBD) over a period of nineteen years. Inflation, the general rise in the prices of goods and services over time, presents considerable difficulties for the banking sector. The objectives of this research are to analyze the effects of inflation and interest rates on conventional and Islamic banks in Afghanistan, to identify potential differences between these two banking models, and to provide insights for policymakers and practitioners. The research uses a mixed-methods approach to analyze quantitative data and to qualitatively examine the unique difficulties that banks encounter in Afghanistan's economic environment. The findings contribute to the understanding of the relationship between the interest rate, the inflation rate, and the performance of both banking systems in Afghanistan. The paper concludes with recommendations for policymakers and banking institutions to enhance the stability and growth of the banking sector in Afghanistan. From an Islamic perspective, interest is described as 'a prefixed rate for use or borrowing of money'; this 'prefixed rate', known in Islamic economics as 'riba', has been described as 'something undesirable'. Furthermore, using a time-series regression technique on the annual data from 2003 to 2021, this research examines the effect of the CPI inflation rate and the interest rate on banking in Afghanistan.
Keywords: inflation, Islamic banking, conventional banking, interest, Afghanistan, impact
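The annual time-series regression named above has a direct counterpart in standard tooling. The sketch below regresses a banking performance measure on inflation and interest rates over 2003-2021; all series and the performance variable are simulated placeholders, not the World Bank data used in the study.

# Sketch of the annual time-series regression described above. The series
# are simulated placeholders, not the World Bank data used in the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
years = np.arange(2003, 2022)
inflation = rng.normal(6, 4, len(years))          # CPI inflation, %
interest = rng.normal(10, 3, len(years))          # lending interest rate, %
bank_perf = 5 - 0.3 * inflation - 0.2 * interest + rng.normal(0, 1, len(years))

X = sm.add_constant(np.column_stack([inflation, interest]))
fit = sm.OLS(bank_perf, X).fit()
print(fit.summary(xname=["const", "inflation", "interest"]))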
Procedia PDF Downloads 72