Search results for: cloud data privacy and integrity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25493

21953 Exploring Disruptive Innovation Capacity Effects on Firm Performance: An Investigation in Industries 4.0

Authors: Selma R. Oliveira, E. W. Cazarini

Abstract:

Recently, studies have referenced innovation as a key factor affecting firm performance. Companies use their innovative capacities to achieve sustainable competitive advantage. In this perspective, the objective of this paper is to contribute to innovation planning policies in Industry 4.0. Thus, this paper examines the effect of disruptive innovation capacity on firm performance in Europe. The procedure comprised the following phases: Phase 1: determination of the conceptual model; and Phase 2: verification of the conceptual model. The research was initially conducted based on the specialized literature, from which data regarding the constructs/structure and content were extracted in order to build the model. The research involved the intervention of experts knowledgeable on the object studied, selected by technical-scientific criteria. The data were extracted using an assessment matrix. To reduce subjectivity in the results, the following methods were used complementarily and in combination: multicriteria analysis, multivariate analysis, psychometric scaling and neurofuzzy technology. The results were satisfactory, validating the modeling approach.

Keywords: disruptive innovation, capacity, performance, Industry 4.0

Procedia PDF Downloads 153
21952 Determining the Extent and Direction of Relief Transformations Caused by Ski Run Construction Using LIDAR Data

Authors: Joanna Fidelus-Orzechowska, Dominika Wronska-Walach, Jaroslaw Cebulski

Abstract:

Mountain areas are very often exposed to numerous transformations connected with the development of tourist infrastructure. In the Polish mountains ski tourism is very popular, so agricultural areas are often transformed into tourist areas. The construction of new ski runs can change the direction and rate of slope development. The main aim of this research was to determine the geomorphological and hydrological changes within slopes caused by ski run construction. The study was conducted in the Remiaszów catchment in the Inner Polish Carpathians (southern Poland). The mean elevation of the catchment is 859 m a.s.l. and the maximum is 946 m a.s.l. The surface area of the catchment is 1.16 km2, of which 16.8% is occupied by the two studied ski runs, constructed in 2014 and 2015. In order to determine the relief transformations connected with the new ski runs, high-resolution LIDAR data were analyzed. The general relief changes in the studied catchment were determined on the basis of ALS (Airborne Laser Scanning) data obtained before (2013) and after (2016) ski run construction. Based on the two sets of ALS data, a digital elevation model of differences (DoD) was created, which made it possible to quantify relief changes across the entire catchment. Additionally, cross and longitudinal profiles were calculated within slopes where the new ski runs were built. Detailed data on relief changes within selected test surfaces were obtained from TLS (Terrestrial Laser Scanning). Hydrological changes within the analyzed catchment were determined based on the convergence and divergence index. The study shows that the construction of the new ski runs caused significant geomorphological and hydrological changes in the entire catchment, with the most important changes identified within the ski slopes. After construction, the catchment surface was lowered by about 0.02 m on average. Hydrological changes mainly involved the interruption of surface runoff pathways and changes in runoff direction and geometry.
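The core DoD operation described above (subtracting a pre-construction DEM from a post-construction DEM, cell by cell, and suppressing changes below a level-of-detection threshold) can be sketched as follows. The grid values and the 0.05 m threshold are illustrative, not the study's actual LIDAR data:

```python
# Minimal DEM-of-Differences (DoD) sketch: two co-registered elevation grids
# (before/after), per-cell difference, level-of-detection thresholding.

def dem_of_difference(dem_before, dem_after, threshold=0.05):
    """Return per-cell elevation change (after - before); changes smaller
    in magnitude than the detection threshold are set to 0.0."""
    dod = []
    for row_b, row_a in zip(dem_before, dem_after):
        dod.append([
            (a - b) if abs(a - b) >= threshold else 0.0
            for b, a in zip(row_b, row_a)
        ])
    return dod

def mean_elevation_change(dod):
    cells = [v for row in dod for v in row]
    return sum(cells) / len(cells)

# Toy 3x3 grids (m a.s.l.): a ski-run cut lowers part of the slope.
dem_2013 = [[900.0, 901.0, 902.0],
            [899.0, 900.0, 901.0],
            [898.0, 899.0, 900.0]]
dem_2016 = [[900.0, 900.8, 902.0],
            [898.9, 899.7, 901.0],
            [898.0, 899.0, 900.0]]

dod = dem_of_difference(dem_2013, dem_2016)
print(round(mean_elevation_change(dod), 3))
```

In practice the same differencing is run on full-resolution ALS/TLS rasters, and the threshold reflects the combined survey uncertainty of the two point clouds.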

Keywords: hydrological changes, mountain areas, relief transformations, ski run construction

Procedia PDF Downloads 140
21951 Net Fee and Commission Income Determinants of European Cooperative Banks

Authors: Karolína Vozková, Matěj Kuc

Abstract:

Net fee and commission income is one of the key elements of a bank’s core income. In the current low-interest-rate environment, this type of income is gaining importance relative to net interest income. This paper analyses the effects of bank- and country-specific determinants of net fee and commission income on a set of cooperative banks from European countries over the 2007-2014 period. To this end, dynamic panel data methods (system Generalized Method of Moments) were employed. Subsequently, alternative panel data methods were run as robustness checks. A strong positive impact of bank concentration on the share of net fee and commission income was found, which shows that cooperative banks tend to display a higher share of fee income in less competitive markets. This is probably connected with the fact that they retain their traditional deposit-taking and loan-providing model, and fees on these services are driven down by competitors. Moreover, compared with commercial banks, cooperatives do not expand heavily into non-traditional fee-bearing services under competition, so their overall fee income share decreases with the increased competitiveness of the sector.
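The paper's main estimator is system GMM, which needs a dedicated econometrics library; as a hedged illustration of the simpler alternative panel methods used as robustness checks, here is a minimal fixed-effects (within) estimator for a single regressor in pure Python. The bank identifiers and numbers are synthetic, not the paper's data:

```python
# Within (fixed-effects) estimator sketch: demean x and y inside each bank,
# then pool and regress through the origin.  Not system GMM -- only the kind
# of static panel method a robustness check might use.

def within_estimator(panel):
    """panel: dict bank_id -> list of (x, y) observations over time."""
    sxy = sxx = 0.0
    for obs in panel.values():
        xbar = sum(x for x, _ in obs) / len(obs)
        ybar = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            sxy += (x - xbar) * (y - ybar)
            sxx += (x - xbar) ** 2
    return sxy / sxx

# Toy panel: fee-income share y rising with market concentration x.
panel = {
    "bank_a": [(0.10, 0.20), (0.12, 0.24), (0.14, 0.28)],
    "bank_b": [(0.30, 0.50), (0.32, 0.54), (0.34, 0.58)],
}
beta = within_estimator(panel)
print(round(beta, 2))
```

Demeaning within each bank removes the time-invariant bank effect, so the slope is identified from within-bank variation only.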

Keywords: cooperative banking, dynamic panel data models, net fee and commission income, system GMM

Procedia PDF Downloads 313
21950 Feature Weighting Comparison Based on Clustering Centers in the Detection of Diabetic Retinopathy

Authors: Kemal Polat

Abstract:

In this paper, three feature weighting methods have been used to improve the classification performance for diabetic retinopathy (DR). To classify diabetic retinopathy, features extracted from the output of several retinal image processing algorithms, covering image-level, lesion-specific and anatomical components, were fed into the classifier algorithms. The dataset used in this study was taken from the University of California, Irvine (UCI) machine learning repository. Three feature weighting methods, fuzzy c-means clustering based, subtractive clustering based, and Gaussian mixture clustering based, were used and compared with each other in the classification of DR. After feature weighting, five different classifier algorithms were applied: multi-layer perceptron (MLP), k-nearest neighbor (k-NN), decision tree, support vector machine (SVM), and Naïve Bayes. The hybrid method combining subtractive clustering based feature weighting with a decision tree classifier obtained a classification accuracy of 100% in the screening of DR. These results demonstrate that the proposed hybrid scheme is very promising for medical data set classification.
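A hedged sketch of the general idea of cluster-center-based feature weighting (the paper's fuzzy c-means, subtractive, and Gaussian-mixture variants differ in how centers are obtained): each feature is rescaled by the ratio of its grand mean to the mean of its cluster centers. The naive k-means initialization and the data below are illustrative, not the DR dataset:

```python
# Cluster-center-based feature weighting sketch: compute k-means centers,
# then weight feature j by grand_mean_j / mean_of_center_values_j.

def kmeans(points, k=2, iters=20):
    centers = points[:k]                      # naive init for the sketch
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [
            [sum(col) / len(g) for col in zip(*g)] if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

def feature_weights(points, centers):
    weights = []
    for j in range(len(points[0])):
        grand_mean = sum(p[j] for p in points) / len(points)
        center_mean = sum(c[j] for c in centers) / len(centers)
        weights.append(grand_mean / center_mean)
    return weights

points = [[1.0, 10.0], [1.2, 11.0], [1.4, 12.0], [5.0, 30.0]]
centers = kmeans(points)
w = feature_weights(points, centers)
weighted = [[x * wj for x, wj in zip(p, w)] for p in points]
print([round(v, 2) for v in w])
```

The rescaled (`weighted`) data are then passed to the downstream classifier; the intent of such weighting is to compress within-cluster variance before classification.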

Keywords: machine learning, data weighting, classification, data mining

Procedia PDF Downloads 318
21949 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter

Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri

Abstract:

Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in the estimation of heart rate (HR) and false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation possible. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on detection of the R-peaks of ECG artifacts in EEG, EMG, and EOG signals, using an energy-based function and a novel signal quality index (SQI) assessment technique. SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals by separate Kalman filters based upon the individual SQIs. Data fusion of the HR estimates was then performed by weighting each estimate by the Kalman filters’ SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual estimates. The method was evaluated on the MIMIC II database of PhysioNet, using bedside monitor recordings of ICU patients, and provides an accurate HR estimate even in the presence of noise and artifacts.

Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion

Procedia PDF Downloads 688
21948 The On-Board Critical Message Transmission Design for Navigation Satellite Delay/Disruption Tolerant Network

Authors: Ji-yang Yu, Dan Huang, Guo-ping Feng, Xin Li, Lu-yuan Wang

Abstract:

The navigation satellite network, especially the Beidou MEO constellation, can relay data effectively with wide coverage and is widely applied in navigation, detection, and positioning. But the constellation has not been completed, and the number of satellites in orbit is not enough to cover the earth, so data relay is disrupted or delayed during the transition process. The data-relay function needs to tolerate this delay or disruption to some extent, which makes the Beidou MEO constellation a delay/disruption-tolerant network (DTN). Traditional DTN designs mainly employ a relay table as the basis of data-path scheduling. But in practical application, especially in critical conditions such as wartime or when the constellation suffers heavy losses, some nodes may become invalid, and the traditional DTN design then becomes useless. Furthermore, when transmitting a critical message in the navigation system, the maximum-priority strategy is used, but the nodes still query the relay table to compute the path, which pushes the delay beyond minutes. Under these circumstances, a function is needed that can compute the optimum data path on board, in real time, according to the constellation state. An on-board critical message transmission design for the navigation satellite delay/disruption-tolerant network is therefore proposed, according to the characteristics of the navigation satellite network. With real-time computation of the network link parameters, the least-delay transition path is deduced to retransmit the critical message in urgent conditions. First, the DTN model for the constellation is established based on a time-varying matrix (TVM) instead of a time-varying graph (TVG); then, the least-delay transition path is deduced from the parameters of the current node; finally, the critical message transits to the next best node. Because the computation is performed on board in real time, the time delay and misjudgment of constellation states at ground stations are eliminated, and the residual information channel of each node can be used flexibly. Compared with the minutes-long delay of a traditional DTN, the proposed design transmits the critical message in seconds, which improves the retransmission efficiency. The hardware was implemented in an FPGA based on the proposed model, and tests prove its validity.
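The least-delay routing step can be illustrated with a shortest-path search over a snapshot of the delay matrix, rather than a lookup in a pre-computed relay table. The 4-node topology and delay values are illustrative; the paper's TVM model and FPGA implementation are not reproduced here:

```python
# Dijkstra over a snapshot of the inter-satellite delay matrix: find the
# least-delay relay path for a critical message when a direct link is down.
import heapq

INF = float("inf")

def least_delay_path(delay, src, dst):
    """delay[i][j] = current link delay i->j in seconds (INF if no link)."""
    n = len(delay)
    dist = [INF] * n
    prev = [None] * n
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist[u]:
            continue                      # stale heap entry
        for v in range(n):
            nd = d + delay[u][v]
            if nd < dist[v]:
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1], dist[dst]

# 4-node snapshot: direct link 0->3 is down (INF); relaying via node 2 wins.
delay = [[INF, 0.12, 0.05, INF],
         [0.12, INF, 0.07, 0.20],
         [0.05, 0.07, INF, 0.06],
         [INF, 0.20, 0.06, INF]]
path, total = least_delay_path(delay, 0, 3)
print(path, round(total, 2))
```

In an on-board setting the matrix would be refreshed from current link measurements at each decision epoch, so invalid nodes simply appear as INF rows/columns.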

Keywords: critical message, DTN, navigation satellite, on-board, real-time

Procedia PDF Downloads 335
21947 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we have taken certain important factors and health parameters of diabetes patients, especially children with diabetes by birth (pediatric congenital), and using the three metrics above we assess the importance of each attribute in the dataset, thereby determining the attribute most highly responsible for and correlated with diabetes among young patients. We use cost optimization, control chart and Spearman methodologies for the real-time application of finding data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology also used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because the higher the complexity, the higher the chance of risk occurring in the software. With the use of control charts, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we seek to optimize the variables. Hence we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in diabetes datasets.
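The Spearman rank correlation and the control-chart statistics mentioned above can be sketched in a few lines. The attribute values below are made up for the example, and the rank function omits the tie correction a full implementation would need:

```python
# Spearman rank correlation (no tie correction) plus control-chart statistics.

def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def control_chart_stats(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return mean, var, var ** 0.5

glucose = [90, 110, 130, 150, 170]
hba1c = [5.0, 5.6, 6.4, 7.1, 7.9]   # monotonically related to glucose
rho = spearman(glucose, hba1c)
mean, var, sd = control_chart_stats(glucose)
print(rho, mean)
```

Because Spearman works on ranks, any monotonic (not just linear) relationship between two attributes yields a coefficient of 1, which is why it suits attribute-ranking tasks like the one described.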

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 251
21946 Electricity Load Modeling: An Application to Italian Market

Authors: Giovanni Masala, Stefania Marica

Abstract:

Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in light of the recent privatization and deregulation of the power industry, forecasting future electricity load has turned out to be a very challenging problem. Empirical data about electricity load highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends and holidays) and on a daily basis (electricity load is clearly influenced by the hour). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is therefore to build an hourly electricity load model. The deterministic component of the model requires non-linear regression and Fourier series, while we investigate the stochastic component through econometric tools. The model parameters are calibrated using data from the Italian market over a six-year period (2007-2012). We then perform a Monte Carlo simulation in order to compare the simulated data with the real data (both in-sample and out-of-sample). The reliability of the model is confirmed by standard tests, which highlight a good fit of the simulated values.
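The Fourier-series part of the deterministic component can be sketched as a projection of hourly load onto a daily harmonic: with evenly spaced samples over whole periods the basis is orthogonal, so the coefficients reduce to simple correlations. The synthetic "load" below stands in for the Italian data, and a full model would add weekly/yearly harmonics plus the trend and stochastic terms:

```python
# First-harmonic Fourier fit of hourly load (period = 24 h) by projection.
import math

def fourier_fit(y, period):
    n = len(y)
    a0 = sum(y) / n
    a1 = 2 / n * sum(v * math.cos(2 * math.pi * t / period) for t, v in enumerate(y))
    b1 = 2 / n * sum(v * math.sin(2 * math.pi * t / period) for t, v in enumerate(y))
    return a0, a1, b1

def predict(t, period, a0, a1, b1):
    w = 2 * math.pi * t / period
    return a0 + a1 * math.cos(w) + b1 * math.sin(w)

# Two synthetic days of hourly load (GW) with a clean daily cycle around 30.
load = [30 + 5 * math.cos(2 * math.pi * t / 24) for t in range(48)]
a0, a1, b1 = fourier_fit(load, period=24)
print(round(a0, 2), round(a1, 2))
```

Real load data are not a clean sinusoid, so in practice these coefficients would be estimated jointly with the other harmonics and the trend inside the non-linear regression.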

Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression

Procedia PDF Downloads 389
21945 Pyridine-N-oxide Based AIE-active Triazoles: Synthesis, Morphology and Photophysical Properties

Authors: Luminita Marin, Dalila Belei, Carmen Dumea

Abstract:

Aggregation-induced emission (AIE) is an intriguing optical phenomenon recently evidenced by Tang and his co-workers, in which aggregation works constructively to improve light emission. The challenging AIE phenomenon is quite the opposite of the notorious aggregation-caused quenching (ACQ) of light emission in the condensed phase, and it aligns with the requirements of photonic and optoelectronic devices, which need solid-state emissive substrates. This paper reports a series of ten new AIE-active low molecular weight compounds based on triazole and pyridine-N-oxide heterocyclic units bonded by short flexible chains, obtained by a 'click' chemistry reaction. The compounds present extremely weak luminescence in solution but strong light emission in the solid state. To distinguish the influence of the degree of crystallinity on the emission efficiency, the photophysical properties were explored by UV-vis and photoluminescence spectroscopy in solution, water suspension, and amorphous and crystalline films. The morphology of the aforementioned states was monitored by dynamic light scattering, scanning electron microscopy, atomic force microscopy and polarized light microscopy. To further understand the relationship between structural design and photophysical properties, single-crystal X-ray diffraction was performed on some of the compounds under study. The UV-vis absorption spectra of the triazole water suspensions indicated behaviour typical of nanoparticle formation, while the photoluminescence spectra revealed an emission intensity enhancement of up to 921-fold for the crystalline films compared to solutions, clearly indicating AIE behaviour. The compounds tend to aggregate, forming nano- and micro-crystals in the shape of rose-like assemblies and fibres. The integrity of the crystals is maintained by strong lateral intermolecular forces, while the absence of face-to-face interactions explains the enhanced luminescence in the crystalline state, in which intramolecular rotations are restricted. The studied flexible triazoles draw attention to a new structural design in which small, biologically friendly luminophore units are linked by short flexible chains. This design enlarges the variety of AIE luminogens to flexible molecules, guiding further efforts in the development of new AIE structures for appropriate applications, especially biological ones.

Keywords: aggregation induced emission, pyridine-N-oxide, triazole

Procedia PDF Downloads 447
21944 Factors Affecting Profitability of Pharmaceutical Company During the COVID-19 Pandemic: An Indonesian Evidence

Authors: Septiany Trisnaningtyas

Abstract:

Purpose: This research aims to examine the factors affecting the profitability of pharmaceutical companies during the Covid-19 pandemic in Indonesia. A sharp decline in the number of patients coming to hospitals for treatment during the pandemic had an impact on the growth of the pharmaceutical sector and brought major changes in financial position and business performance, while pharmaceutical companies providing products related to the Covid-19 pandemic could survive and continue to grow. This study investigates the factors affecting profitability during the pandemic in association with the number of Covid-19 cases. Design/methodology/approach: This study uses panel-data regression models to evaluate the influence of the number of confirmed Covid-19 cases on the profitability of nine listed pharmaceutical companies in Indonesia. The research is based on four independent variables that were empirically examined for their relationship with profitability: liquidity (current ratio), growth rate (sales growth), firm size (total sales), and market power (the Lerner index). The number of Covid-19 cases is used as a moderating variable. Data for the nine pharmaceutical companies listed on the Indonesia Stock Exchange, covering the period 2018-2021, were extracted from the companies’ quarterly reports. Findings: During Covid-19, company growth (sales growth) and market power (Lerner index) have a positive and significant relationship with ROA and ROE. The total number of confirmed Covid-19 cases has a positive and significant relationship with ROA and is shown to moderate the relationships between company growth (sales growth) and both ROA and ROE, and between market power (Lerner index) and ROA. Research limitations/implications: Due to data availability, this study only includes data from nine pharmaceutical companies listed on the Indonesia Stock Exchange and their quarterly reports covering 2018-2021. Originality/value: This study focuses on pharmaceutical companies in Indonesia during the Covid-19 pandemic. A previous study analyzed data from pharmaceutical companies’ annual reports since 2014 and focused on the implementation of universal health coverage (national health insurance) by the Indonesian government, using pooled ordinary least squares and fixed-effects regressions. The present study instead uses fixed-effect panel-data regression models to evaluate the influence of confirmed Covid-19 cases on profitability, and additionally investigates the moderating effect of confirmed Covid-19 cases on profitability in the context of the pandemic.
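The moderation test described above amounts to adding an interaction term (growth x Covid-19 cases) to the regression and checking its coefficient. A hedged pure-Python sketch with synthetic firm-quarter data (the real model includes firm fixed effects and more controls):

```python
# OLS with an interaction (moderation) term via normal equations and
# Gaussian elimination.  Data are generated from known coefficients so the
# fit can be verified exactly.

def ols(X, y):
    k = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * v for r, v in zip(X, y)) for i in range(k)]
    for col in range(k):                              # forward elimination
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[col])]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):                    # back substitution
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta

# Synthetic ROA = 1 + 2*growth + 0.5*cases + 3*growth*cases
rows = [(g, c) for g in (0.0, 0.1, 0.2, 0.3) for c in (0.0, 1.0, 2.0)]
X = [[1.0, g, c, g * c] for g, c in rows]             # intercept, main, interaction
y = [1 + 2 * g + 0.5 * c + 3 * g * c for g, c in rows]
beta = ols(X, y)
print([round(b, 2) for b in beta])
```

A nonzero interaction coefficient (here 3) is what "Covid-19 cases moderate the growth-profitability link" means operationally: the slope of ROA on growth shifts with the case count.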

Keywords: profitability, indonesia, pharmaceutical, Covid-19

Procedia PDF Downloads 110
21943 A Systematic Review of the Methodological and Reporting Quality of Case Series in Surgery

Authors: Riaz A. Agha, Alexander J. Fowler, Seon-Young Lee, Buket Gundogan, Katharine Whitehurst, Harkiran K. Sagoo, Kyung Jin Lee Jeong, Douglas G. Altman, Dennis P. Orgill

Abstract:

Introduction: Case series are an important and common study type. Currently, no guideline exists for reporting case series, and there is evidence of key data being missed from such reports. We propose to develop a reporting guideline for case series using a methodologically robust technique. The first step in this process is a systematic review of literature relevant to the reporting deficiencies of case series. Methods: A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, EMBASE, the Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation Index, from the start of indexing until 5th November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were analyzed for five areas of deficiency: failure to use standardized definitions; missing or selective data; transparency or incomplete reporting; whether alternate study designs were considered; and other issues. Results: The database search identified 2,205 records. Through screening and eligibility assessments, 92 articles met the inclusion criteria. The frequencies of the methodological and reporting issues identified were: failure to use standardized definitions (57%), missing or selective data (66%), transparency or incomplete reporting (70%), whether alternate study designs were considered (11%) and other issues (52%). Conclusion: The methodological and reporting quality of surgical case series needs improvement. Our data show that clear evidence-based guidelines for the conduct and reporting of case series may be useful to those planning or conducting them.

Keywords: case series, reporting quality, surgery, systematic review

Procedia PDF Downloads 351
21942 Model-Free Distributed Control of Dynamical Systems

Authors: Javad Khazaei, Rick Blum

Abstract:

Distributed control is an efficient and flexible approach for the coordination of multi-agent systems. One of the main challenges in designing a distributed controller is identifying the governing dynamics of the dynamical systems. Data-driven system identification is currently undergoing a revolution: with the availability of high-fidelity measurements and historical data, model-free identification of dynamical systems can facilitate control design without tedious modeling of high-dimensional and/or nonlinear systems. This paper develops a distributed control design using consensus theory for linear and nonlinear dynamical systems, based on sparse identification of system dynamics. Compared with existing consensus designs that rely heavily on knowing the detailed system dynamics, the proposed model-free design can accurately capture the dynamics of the system from available measurement and input data and provide guaranteed performance in consensus and tracking problems. Heterogeneous damped oscillators are chosen as example dynamical systems for validation purposes.
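The consensus protocol the controller builds on can be illustrated with its simplest discrete-time form: each agent repeatedly nudges its state toward its neighbours' states, and all states converge to the network average. The sparse identification (data-driven) part of the paper is omitted here; the graph and gain are illustrative:

```python
# Discrete-time average consensus on a path graph 0-1-2-3.
# Stability of this first-order protocol requires eps < 1/max_degree.

def consensus_step(states, neighbors, eps=0.2):
    return [
        x + eps * sum(states[j] - x for j in neighbors[i])
        for i, x in enumerate(states)
    ]

neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
states = [1.0, 3.0, 5.0, 7.0]
target = sum(states) / len(states)   # 4.0; the protocol preserves the average
for _ in range(200):
    states = consensus_step(states, neighbors)
print([round(x, 3) for x in states])
```

In the paper's setting, each agent's local dynamics (here absent) would first be identified from data, and the consensus term above would enter as the coupling input in the distributed controller.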

Keywords: consensus tracking, distributed control, model-free control, sparse identification of dynamical systems

Procedia PDF Downloads 253
21941 Geospatial Assessment of Waste Disposal System in Akure, Ondo State, Nigeria

Authors: Babawale Akin Adeyemi, Esan Temitayo, Adeyemi Olabisi Omowumi

Abstract:

The paper analyzed the waste disposal system in Akure, Ondo State using GIS techniques. Specifically, the study identified the spatial distribution of collection points and the existing dumpsite, and evaluated the accessibility of waste collection points and their proximity to each other with a view to enhancing the performance of the waste disposal system. Data for the study were obtained from both primary and secondary sources. Primary data were obtained through the administration of a questionnaire. From the field survey, 35 collection points were identified in the study area; 10 questionnaires were administered around each collection point, making a total of 350 questionnaires for the study. The coordinates of each collection point were captured using a hand-held Global Positioning System (GPS) receiver and used to analyze the spatial distribution of collection points. Secondary data included an administrative map collected from the Akure South Local Government Secretariat. The data collected were analyzed using GIS analytical tools, specifically neighborhood functions. The results revealed that collection points are found in all parts of Akure, with the highest concentration around the central business district. The study also showed that 80% of the collection points enjoy efficient waste service while the remaining 20% do not, and that most collection points in the core of the city are in close proximity to each other. In conclusion, the paper demonstrates the capability of Geographic Information Systems (GIS) in the management of waste collection and disposal. The application of GIS in the evaluation of solid waste management in Akure is highly valuable for the state waste management board and could also benefit other states in developing a modern solid waste management system. Further study on solid waste management is recommended, especially the updating of both spatial and non-spatial data.
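The proximity part of such an analysis can be sketched with great-circle (haversine) distances from a location to each GPS-surveyed collection point. The coordinates below are illustrative points around Akure, not the study's surveyed data:

```python
# Haversine distance and nearest-collection-point lookup.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_point(home, points):
    return min(points, key=lambda p: haversine_km(home[0], home[1], p[1], p[2]))

collection_points = [
    ("CBD", 7.2526, 5.1931),      # illustrative coordinates
    ("Oba-Ile", 7.2833, 5.2500),
    ("FUTA", 7.3050, 5.1400),
]
home = (7.2550, 5.1950)
name, lat, lon = nearest_point(home, collection_points)
print(name)
```

In a GIS workflow the same computation is usually delegated to a neighborhood/buffer function over the point layer, but the distance logic is the same.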

Keywords: assessment, geospatial, system, waste disposal

Procedia PDF Downloads 231
21940 Students' Perspectives on Quality of Course Evaluation Practices and Feedbacks in Eritrea

Authors: Ermias Melake Tesfay

Abstract:

The importance of evaluation practice and feedback to student advancement and retention has gained prominence in the literature over the past ten years. Many issues have been raised about the quality and types of evaluation carried out in higher education and the quality and quantity of student feedback. The aim of this study was to explore students’ perspectives on the quality of course evaluation practice and feedback in the College of Education and the College of Science. The study used both quantitative and qualitative methods to collect data from third-year and fourth-year students of 13 departments in the College of Education and College of Science in Eritrea. A modified Service Performance (SERVPERF) questionnaire and focus group discussions were used to collect the data. The sample comprised 135 third-year and fourth-year students from both colleges. A questionnaire using a 5-point Likert scale was administered to all respondents, and two focus group discussions were conducted. Findings from the survey data and focus group discussions showed that the majority of students hold a positive perception of the quality of course evaluation practice but a negative perception of the methods of awarding grades and of administrators’ willingness to listen to students’ complaints about courses. Furthermore, analysis of the questionnaire showed no statistically significant difference in perceptions of the quality of course evaluation practice and feedback between third-year and fourth-year students, between the College of Education and the College of Science, or between male and female students. The study recommends that the colleges improve fairness and the quality of feedback during course assessment.

Keywords: evaluation, feedback, quality, students' perception

Procedia PDF Downloads 142
21939 Mechanical and Material Characterization on the High Nitrogen Supersaturated Tool Steels for Die-Technology

Authors: Tatsuhiko Aizawa, Hiroshi Morita

Abstract:

Tool steels such as SKD11 and SKH51 have been utilized as punch and die substrates for cold stamping, forging, and fine blanking processes. Heat-treated SKD11 punches with a hardness of 700 HV worked well in the stamping of SPCC, normal steel plates, and non-ferrous alloys such as brass sheet. However, they suffered severe damage in the fine blanking of holes smaller than 1.5 mm in diameter. Under a high aspect ratio of punch length to diameter, elastoplastic buckling of slender punches occurred on the production line, and the heat-treated punches were at risk of chipping at their edges. To be free from such damage, a blanking punch must have sufficient rigidity and strength at the same time. In the present paper, a small-hole blanking punch with a dual-toughness structure is proposed as a solution to this engineering issue in production. A low-temperature plasma nitriding process was utilized to form a nitrogen-supersaturated thick layer in the original SKD11 punch. Through plasma nitriding at 673 K for 14.4 ks, a nitrogen-supersaturated layer 50 μm thick and free of nitride precipitates was formed as a high nitrogen steel (HNS) layer surrounding the original SKD11 punch. In this two-zone-structured SKD11 punch, the surface hardness increased from 700 HV for heat-treated SKD11 to 1400 HV. The outer high nitrogen SKD11 (HN-SKD11) layer had a homogeneous nitrogen solute depth profile, with a nitrogen solute content plateau of 4 mass% down to the border between the outer HN-SKD11 layer and the original SKD11 matrix. When stamping brass sheet 1 mm thick with this dually toughened SKD11 punch, the punch life was extended from 500 K shots to 10,000 K shots, yielding a much more stable production line for brass American snaps. Furthermore, with the aid of a masking technique, the punch side surface layer, 50 μm thick, was modified by the same high nitrogen supersaturation process into a stripe structure in which un-nitrided SKD11 and HN-SKD11 layers alternate from the punch head to the punch bottom. This flexible structuring promoted the mechanical integrity of total rigidity and toughness for a punch with an extremely small diameter.

Keywords: high nitrogen supersaturation, semi-dry cold stamping, solid solution hardening, tool steel dies, low temperature nitriding, dual toughness structure, extremely small diameter punch

Procedia PDF Downloads 84
21938 Artificial Intelligence in Melanoma Prognosis: A Narrative Review

Authors: Shohreh Ghasemi

Abstract:

Introduction: Melanoma is a complex disease with various clinical and histopathological features that impact prognosis and treatment decisions. Traditional methods of melanoma prognosis involve manual examination and interpretation of clinical and histopathological data by dermatologists and pathologists. However, the subjective nature of these assessments can lead to inter-observer variability and suboptimal prognostic accuracy. AI, with its ability to analyze vast amounts of data and identify patterns, has emerged as a promising tool for improving melanoma prognosis. Methods: A comprehensive literature search was conducted to identify studies that employed AI techniques for melanoma prognosis. The search included databases such as PubMed and Google Scholar, using keywords such as "artificial intelligence," "melanoma," and "prognosis." Studies published between 2010 and 2022 were considered. The selected articles were critically reviewed, and relevant information was extracted. Results: The review identified various AI methodologies utilized in melanoma prognosis, including machine learning algorithms, deep learning techniques, and computer vision. These techniques have been applied to diverse data sources, such as clinical images, dermoscopy images, histopathological slides, and genetic data. Studies have demonstrated the potential of AI in accurately predicting melanoma prognosis, including survival outcomes, recurrence risk, and response to therapy. AI-based prognostic models have shown comparable or even superior performance compared to traditional methods.

Keywords: artificial intelligence, melanoma, accuracy, prognosis prediction, image analysis, personalized medicine

Procedia PDF Downloads 67
21937 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia

Authors: Eyosiyas Aga

Abstract:

The development of information communication technology has transformed paper-based mapping and land registration into computerized, networked systems. The computerization and networking of real property information systems play a vital role in good governance and the sustainable development of emerging countries through cost-effective, easy and accessible service delivery for the customer. An efficient, transparent and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery within organizations. In Ethiopia, real property administration is paper-based; as a result, it is confronted with problems of data management, illegal transactions, corruption, and poor service delivery. To solve these problems and facilitate the real property market, the implementation of a web-based real property updating system is crucial. A web-based real property updating system is one automation (computerization) method to facilitate data sharing and reduce the time and cost of service delivery in real property administration. In addition, it is useful for integrating data across different information systems and organizations. The system is designed by combining open source software supported by the Open Geospatial Consortium (OGC). OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are the most widely used standards for supporting and improving web-based real property updating. These services allow the integration of data from different sources, and they can be used to maintain the consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage real property data and connect it to the flex viewer and user interface.
The system is designed both for internal updating (by the municipality), which mainly covers spatial and textual information, and for external use (by the customer), which focuses on serving and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective and secure way. The system is designed by combining and customizing open source software to enhance its efficiency in a cost-effective way. The existing workflow for real property updating was analyzed to identify bottlenecks, and a new workflow was designed for the system. Requirements were identified through a questionnaire and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contribution of an effective data management system to efficient, transparent and sustainable urban development in Ethiopia.
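The WFS interaction described above can be sketched as a GetFeature request against GeoServer; the endpoint URL, layer name, and parcel attribute below are hypothetical placeholders, not values from the study.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def build_wfs_getfeature_url(base_url, type_name, cql_filter=None):
    """Build an OGC WFS 1.1.0 GetFeature request URL, as a web viewer
    would use to pull cadastral features from a GeoServer instance."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
        "outputFormat": "application/json",  # GeoServer's GeoJSON output
    }
    if cql_filter:
        params["cql_filter"] = cql_filter  # attribute filter on the layer
    return base_url + "?" + urlencode(params)

# Hypothetical GeoServer endpoint, cadastral layer, and parcel id:
url = build_wfs_getfeature_url(
    "http://localhost:8080/geoserver/wfs",
    "cadastre:parcels",
    cql_filter="parcel_id = 'P-1024'",
)
```

Fetching `url` with any HTTP client would return the matching parcel features as GeoJSON, which the viewer can render over a WMS base map.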

Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service

Procedia PDF Downloads 257
21936 The Rehabilitation of The Covered Bridge Leclerc (P-00249) Passing Over the Bouchard Stream in LaSarre, Quebec

Authors: Nairy Kechichian

Abstract:

The original Leclerc Bridge is a covered wooden bridge that is considered a Quebec heritage structure with an index of 60, making it a very important provincial bridge from a historical point of view. It was constructed in 1927 and is located in the rural area of Abitibi-Temiscamingue. It is a "town Québécois" type of structure, which is generally rare but common for covered bridges in Abitibi-Temiscamingue. This type of structure is composed of a truss on each side, formed of diagonals, internal bracing, uprights, and top and bottom chords that transmit the loads. The structure is mostly known for its solidity, light weight, and ease of construction. It is a single-span bridge with a length of 25.3 meters and allows the passage of one vehicle at a time on a 4.22-meter driving lane. The structure is composed of two trusses along the deck, two gabion foundations at both ends, uprights and top and bottom chords. WSP (Williams Sale Partnership) Canada inc. was mandated by the Transport Minister of Quebec in 2019 to increase the capacity of the bridge from 5 tons to 30.6 tons and to rehabilitate it, as it had deteriorated significantly over the years. The bridge was damaged by material deterioration over time, exposure to humidity, high load effects and insect infestation. To allow the passage of three-axle trucks while preserving the integrity of this heritage structure, the final rehabilitation design added a new deck independent of the roof structure of the bridge. Essentially, new steel beams support the deck loads and the desired vehicle loads. The roof of the bridge is linked to the steel deck for lateral support, but it is isolated from the wooden deck. The roof is preserved for aesthetic reasons and remains intact as it is a heritage piece.
Due to strict traffic management constraints, an efficient construction method was put in place: a temporary bridge was built and the existing roof moved onto it, allowing vehicles to circulate on one side of the temporary bridge while the roof was repaired on the other side. In parallel, this method allowed the demolition and reconstruction of the existing foundations, the construction of a new steel deck, and the transport of the roof back onto the new bridge. One of the main criteria for the rehabilitation of the wooden bridge was to preserve, as much as possible, its existing patrimonial architectural design. The project was completed successfully by the end of 2021.

Keywords: covered bridge, wood-steel, short span, town Québécois structure

Procedia PDF Downloads 54
21935 Needs of Omani Children in First Grade during Their Transition from Kindergarten to Primary School: An Ethnographic Study

Authors: Zainab Algharibi, Julie McAdam, Catherine Fagan

Abstract:

The purpose of this paper is to shed light on how Omani children in the first grade experience their needs during the transition to primary school. Theoretically, the paper builds on two perspectives: Dewey's concept of continuity of experience and the boundary objects associated with Vygotsky's cultural-historical activity theory (CHAT). The methodology of the study is based on the crucial role of children's agency, an important educational tool for enhancing children's participation in the learning process and developing their ability to face various issues in their lives. Thus, data were obtained from 45 first-grade children in 4 different primary schools using drawing and visual narrative activities, in addition to researcher observations during the first weeks of the academic year. As the study dealt with children, all necessary ethical guidelines were followed. This paper is original in that it addresses children's transition from kindergarten to primary school in Oman, if not in the Arab region more broadly; it is therefore expected to fill an important gap in this field and to open a door for future research. The analysis of the drawings and visual narratives was performed according to a social semiotics approach in two phases. The first reads the surface message ("denotation"); the second goes in depth via the symbolism obtained from the children as they talked and drew letters and signs, a stage known as the "signified". A video was recorded of each child talking about their drawing and expressing themself. The data were then organised and classified according to a cross-data network. For the researcher observations, the collected data were analysed following a grounded theory model.
This model is based on comparing the data collected from observations with the data previously encoded by the other methods (the children's drawings and visual narratives), in order to identify similarities and differences, clarify the meaning of the categories obtained, and identify sub-categories along with a description of possible links between them. This constitutes a form of triangulation in data collection. The study produced a set of findings, the most important being that children's greatest interest goes to their social and psychological needs, such as friends, their teacher, and playing. Their biggest fears are a new place, a new teacher, and not having friends, while they showed less concern for their need for educational knowledge and skills.

Keywords: children’s academic needs, children’s social needs, transition, primary school

Procedia PDF Downloads 100
21934 Predication Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy

Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie

Abstract:

In recent years, there has been an explosion in the use of technology that helps in discovering diseases. For example, DNA microarrays allow us, for the first time, to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnoses and to help find the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces the best results and also takes the lowest time to build its model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best, owing to its high accuracy and the lowest model-building time.
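Decision-tree methods like those compared above grow their trees by choosing, at each node, the split that most reduces class impurity. A minimal sketch of Gini-based split selection follows; the expression values and ALL/AML labels are hypothetical toy data, not the study's microarray set.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_threshold(values, labels):
    """Pick the threshold on one feature that minimises the weighted
    Gini impurity of the two resulting branches."""
    n = len(values)
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [y for x, y in zip(values, labels) if x <= t]
        right = [y for x, y in zip(values, labels) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical microarray expression levels and leukemia subtypes:
expr = [0.2, 0.4, 0.5, 0.9, 1.1, 1.3]
diag = ["ALL", "ALL", "ALL", "AML", "AML", "AML"]
threshold, impurity = best_threshold(expr, diag)  # perfect split at 0.5
```

A real tree learner repeats this search over every feature at every node; toolkits such as WEKA (home of Random Tree and NNge) automate that recursion.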

Keywords: data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data

Procedia PDF Downloads 311
21933 Applications of Drones in Infrastructures: Challenges and Opportunities

Authors: Jin Fan, M. Ala Saadeghvaziri

Abstract:

Unmanned aerial vehicles (UAVs), also referred to as drones, equipped with various kinds of advanced detection or surveying systems, are effective and low-cost for data acquisition, delivery and sharing, which can benefit the building of infrastructure. This paper gives an overview of applications of drones in the planning, design, construction and maintenance of infrastructure. The drone platform, detection and surveying systems, and post-flight data processing systems are introduced, followed by cases detailing the applications. Challenges from different aspects are addressed. Opportunities for drones in infrastructure include, but are not limited to, the following. Firstly, UAVs equipped with high-definition cameras or other detection equipment are capable of inspecting hard-to-reach infrastructure assets. Secondly, UAVs can be used as effective tools to survey and map the landscape to collect necessary information before infrastructure construction. Furthermore, a single UAV or multiple UAVs are useful in construction management. UAVs can also be used to collect road and building information by taking high-resolution photos for future infrastructure planning, and to provide reliable and dynamic traffic information, which is potentially helpful in building smart cities. The main challenges are: limited flight time, signal robustness, post-flight data analysis, multi-drone collaboration, weather conditions, and the distraction to traffic caused by drones. This paper aims to help owners, designers, engineers and architects improve the process of building infrastructure for higher efficiency and better performance.

Keywords: bridge, construction, drones, infrastructure, information

Procedia PDF Downloads 111
21932 Development of Analytical Systems for Nurses in Kenya

Authors: Peris Wanjiku

Abstract:

The objective of this paper is to describe the development and implications of a national nursing workforce analytical system in Kenya. Findings: Creating a national electronic nursing workforce analytical system provides more reliable information on nurses' national demographics, migration patterns, and workforce capacity and efficiency. Data analysis is most useful for human resources for health (HRH) planning when workforce capacity data can be linked to worksite staffing requirements. As a result of establishing this database, the Kenya Ministry of Health has improved its capability to assess its nursing workforce and document important workforce trends, such as out-migration. Current data identify the United States as the leading recipient country of Kenyan nurses. The overwhelming majority of Kenyan nurses who decide to out-migrate are among Kenya's most qualified. Conclusions: The Kenya nursing database is a first step toward facilitating evidence-based decision-making in HRH. This database is unique among developing countries in sub-Saharan Africa. Establishing an electronic workforce database requires long-term investment and sustained support by national and global stakeholders.

Keywords: analytical, information, health, migration

Procedia PDF Downloads 85
21931 Voltage Problem Location Classification Using Performance of Least Squares Support Vector Machine LS-SVM and Learning Vector Quantization LVQ

Authors: M. Khaled Abduesslam, Mohammed Ali, Basher H. Alsdai, Muhammad Nizam Inayati

Abstract:

This paper presents voltage problem location classification using the performance of the Least Squares Support Vector Machine (LS-SVM) and Learning Vector Quantization (LVQ) in an electrical power system, implemented on the IEEE 39-bus New England test system. The data were collected from time-domain simulation using the Power System Analysis Toolbox (PSAT). Simulation outputs such as voltage, phase angle, real power and reactive power were taken as inputs to estimate voltage stability at particular buses based on the Power Transfer Stability Index (PTSI). The simulation was carried out on the IEEE 39-bus test system considering increased load on the system's buses. To verify the proposed LS-SVM, its performance was compared to that of LVQ. The results show that LS-SVM is faster and more accurate than LVQ: LS-SVM achieved 0% misclassification, whereas LVQ had 7.69% misclassification.
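The misclassification percentages reported above correspond to a plain error rate over the test cases. A minimal sketch, with hypothetical stability labels chosen so that one error out of 13 buses reproduces the 7.69% figure:

```python
def misclassification_rate(predicted, actual):
    """Percentage of test cases where the predicted voltage-stability
    class differs from the actual class."""
    assert len(predicted) == len(actual)
    errors = sum(p != a for p, a in zip(predicted, actual))
    return 100.0 * errors / len(actual)

# Hypothetical stability classes for 13 test buses (not the paper's data):
actual    = ["stable"] * 10 + ["unstable"] * 3
lssvm_out = list(actual)            # LS-SVM: every bus classified correctly
lvq_out   = list(actual)
lvq_out[4] = "unstable"             # LVQ misclassifies one of the 13 buses
```

With these labels, `misclassification_rate(lssvm_out, actual)` is 0% and `misclassification_rate(lvq_out, actual)` is 1/13 of 100%, i.e. about 7.69%.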

Keywords: IEEE 39 bus, least squares support vector machine, learning vector quantization, voltage collapse

Procedia PDF Downloads 431
21930 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of the products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon the existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online-hard-negative-mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
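The triplet loss with online hard-negative mining used to train the encoder can be sketched in pure Python. The 2-D embeddings below are hypothetical toy vectors standing in for ResNet-18 outputs; the margin value is illustrative.

```python
import math

def dist(u, v):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss_hard(anchor, positive, negatives, margin=0.2):
    """Triplet loss where the negative is mined online as the hardest
    one, i.e. the negative embedding closest to the anchor:
    max(0, d(a, p) - d(a, n_hard) + margin)."""
    hardest = min(negatives, key=lambda n: dist(anchor, n))
    return max(0.0, dist(anchor, positive) - dist(anchor, hardest) + margin)

# Toy embeddings: anchor and positive are crops of the same product,
# negatives are crops of other products on the rack.
anchor = (0.0, 0.0)
positive = (0.1, 0.0)
negatives = [(5.0, 5.0), (0.15, 0.0)]   # (0.15, 0.0) is the hard negative
loss = triplet_loss_hard(anchor, positive, negatives)
```

Minimising this loss pulls same-class embeddings together and pushes the hardest differing-class embedding at least `margin` further away, which is what lets the encoder separate visually similar products.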

Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition

Procedia PDF Downloads 139
21929 A Model to Assist Military Mission Planners in Identifying and Assessing Variables Impacting Food Security

Authors: Lynndee Kemmet

Abstract:

The U.S. military plays an increasing role in supporting political stability efforts, and this includes efforts to prevent the food insecurity that can trigger political and social instability. This paper presents a model that assists military commanders in identifying variables that impact food production and distribution in their areas of operation (AO), in identifying connections between variables and in assessing the impacts of those variables on food production and distribution. Through use of the model, military units can better target their data collection efforts and can categorize and analyze data within the data categorization framework most widely used by military forces: PMESII-PT (Political, Military, Economic, Social, Information, Infrastructure, Physical Environment and Time). The model provides flexibility of analysis in that commanders can target analysis to be highly focused on a specific PMESII-PT domain or variable or conduct analysis across multiple PMESII-PT domains. The model is also designed to assist commanders in mapping food systems in their AOs and then identifying components of those systems that must be strengthened or protected.

Keywords: food security, food system model, political stability, US Military

Procedia PDF Downloads 185
21928 Unseen Classes: The Paradigm Shift in Machine Learning

Authors: Vani Singhal, Jitendra Parmar, Satyendra Singh Chouhan

Abstract:

Unseen class discovery has become an important capability for machine learning systems that must judge new classes. Unseen classes are classes on which the machine learning model has not been trained. With the advancement of technology and AI replacing humans, the amount of data has increased to the next level, so when deploying a model on real-world examples, we come across new, unseen classes. Our aim is to find the number of unseen classes by using a hierarchy-based active learning algorithm. The algorithm is based on hierarchical clustering combined with active sampling; the number of clusters obtained at the end gives the number of unseen classes, and some of the resulting clusters contain unseen classes. Instead of first discovering the unseen classes and then counting them, we calculate their number directly by applying the algorithm. The dataset used is for intent classification, where the target is the intent of the corresponding query. We conclude that when the machine learning model encounters real-world data, it can automatically find the number of unseen classes. In future work, we will label these unseen classes correctly.
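The cluster count that serves as the unseen-class estimate can be sketched with a simple single-linkage agglomerative clustering under a distance threshold. The 1-D embeddings and the threshold below are hypothetical toy values; real query embeddings would be high-dimensional.

```python
def dist(u, v):
    """Distance between two 1-D toy embeddings."""
    return abs(u - v)

def count_clusters(points, threshold):
    """Single-linkage agglomerative clustering: repeatedly merge the two
    closest clusters until no pair is closer than `threshold`; the number
    of clusters that remain estimates the number of distinct classes."""
    clusters = [[p] for p in points]
    while len(clusters) > 1:
        best = (None, None, float("inf"))
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if d < best[2]:
                    best = (i, j, d)
        i, j, d = best
        if d >= threshold:
            break  # remaining clusters are all well separated
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return len(clusters)

# Three well-separated groups of hypothetical intent embeddings:
points = [0.0, 0.1, 0.2, 5.0, 5.1, 10.0, 10.2]
n_unseen = count_clusters(points, threshold=1.0)
```

Run on embeddings of queries the model rejects as unfamiliar, the surviving cluster count is the estimate of how many unseen intent classes exist; active sampling would then pick representatives from each cluster for labelling.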

Keywords: active sampling, hierarchical clustering, open world learning, unseen class discovery

Procedia PDF Downloads 158
21927 Determining of the Performance of Data Mining Algorithm Determining the Influential Factors and Prediction of Ischemic Stroke: A Comparative Study in the Southeast of Iran

Authors: Y. Mehdipour, S. Ebrahimi, A. Jahanpour, F. Seyedzaei, B. Sabayan, A. Karimi, H. Amirifard

Abstract:

Ischemic stroke is one of the common causes of disability and mortality: the fourth leading cause of death in the world, and the third according to some sources. Only one third of patients with ischemic stroke fully recover; one third are left permanently disabled, and one third die. Thus, predictive models for stroke have a vital role in reducing the complications and costs related to this disease. The aim of this study was to identify the effective factors and predict ischemic stroke with the help of data mining (DM) methods. This was a descriptive-analytic study of 213 cases from among patients referred to Ali ibn Abi Talib (AS) Hospital in Zahedan. The data collection tool was a checklist whose validity and reliability were confirmed. The study used decision tree DM algorithms for modeling, with data analysis performed using SPSS-19 and SPSS Modeler 14.2. The comparison of algorithms showed that the CHAID algorithm, with 95.7% accuracy, has the best performance. Moreover, based on the model created, factors such as anemia, diabetes mellitus, hyperlipidemia, transient ischemic attacks, coronary artery disease, and atherosclerosis are the most effective factors in stroke. Decision tree algorithms, especially CHAID, have acceptable precision and predictive ability for determining the factors affecting ischemic stroke. Predictive models created with this algorithm can therefore play a significant role in decreasing the mortality and disability caused by ischemic stroke.
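CHAID selects splits via chi-squared tests of independence between each candidate predictor and the outcome. A minimal sketch of the Pearson chi-squared statistic for one 2x2 contingency table follows; the counts (diabetes vs. stroke) are hypothetical, not the study's data.

```python
def chi_squared(table):
    """Pearson chi-squared statistic for a contingency table:
    sum over cells of (observed - expected)^2 / expected, where
    expected = row_total * col_total / grand_total."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = diabetes yes/no, cols = stroke yes/no.
table = [[30, 10],
         [20, 40]]
stat = chi_squared(table)  # larger values make this predictor a better split
```

CHAID computes this statistic for every candidate predictor at a node (merging categories that do not differ significantly) and splits on the one with the most significant association with the outcome.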

Keywords: data mining, ischemic stroke, decision tree, Bayesian network

Procedia PDF Downloads 164
21926 The Beta-Fisher Snedecor Distribution with Applications to Cancer Remission Data

Authors: K. A. Adepoju, O. I. Shittu, A. U. Chukwu

Abstract:

In this paper, a new four-parameter generalized version of the Fisher-Snedecor distribution, called the Beta-F distribution, is introduced. A comprehensive account of the statistical properties of the new distribution is given. Formal expressions for the cumulative distribution function, moments, moment generating function and maximum likelihood estimates, as well as the Fisher information, are obtained. The flexibility and robustness of this distribution are demonstrated using cancer remission time data. The new distribution can be used in most applications where the assumptions underlying other lifetime distributions are violated.
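A construction of this kind typically follows the beta-generated family, which takes a baseline cdf F(x) (here, Fisher-Snedecor, with its two degrees-of-freedom parameters) and adds two shape parameters a, b > 0; the exact parameterisation used in the paper may differ, but the general form is:

```latex
% Beta-generated family: f and F are the baseline Fisher-Snedecor pdf and cdf,
% B(a, b) is the beta function, and a, b > 0 are the extra shape parameters.
g(x) = \frac{1}{B(a,b)} \, f(x) \, [F(x)]^{a-1} \, [1 - F(x)]^{b-1}, \qquad x > 0.
```

Setting a = b = 1 recovers the baseline F distribution, which is why the family counts four parameters in total: two from the baseline plus the two beta shapes.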

Keywords: fisher-snedecor distribution, beta-f distribution, outlier, maximum likelihood method

Procedia PDF Downloads 338
21925 Threat of Islamic State of Khorasan in Pakistan and Afghanistan Region: Impact on Regional Security

Authors: Irfan U. Din

Abstract:

The growing presence and operational capacity of the Islamic State, aka Daesh, which emerged in the Pak-Afghan region in 2015, poses a serious threat to the already fragile security situation in the region. This paper sheds light on the current state of the IS-K network in the Pak-Afghan region and explains how its presence and operational capacity in northern and central Afghanistan have increased despite intensive military operations against the group in Nangarhar province, the stronghold of IS-K. It also explores the role of the Pakistani Taliban in the emergence and expansion of IS-K in the region and unveils the security implications of the growing nexus between IS-K and transnational organized groups for the region in a post-NATO-withdrawal scenario. The study is qualitative and relies on secondary and primary data. For secondary data, the existing literature on the topic is extensively reviewed, while for primary data, in-depth interviews are conducted with subject experts, Taliban commanders, and field researchers.

Keywords: Islamic State of Khorasan (IS-K), North Atlantic Treaty Organization (NATO), Pak-Afghan Region, Transnational Organized Crime (TNOC)

Procedia PDF Downloads 284
21924 A Machine Learning Approach for Classification of Directional Valve Leakage in the Hydraulic Final Test

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Due to increasing cost pressure in global markets, artificial intelligence is becoming a technology that is decisive for competition. Predictive quality enables machinery and plant manufacturers to ensure product quality by using data-driven forecasts via machine learning models as a decision-making basis for test results. The use of cross-process Bosch production data along the value chain of hydraulic valves is a promising approach to classifying the quality characteristics of workpieces.

Keywords: predictive quality, hydraulics, machine learning, classification, supervised learning

Procedia PDF Downloads 221