Search results for: data acquisition
23153 Emotion Mining and Attribute Selection for Actionable Recommendations to Improve Customer Satisfaction
Authors: Jaishree Ranganathan, Poonam Rajurkar, Angelina A. Tzacheva, Zbigniew W. Ras
Abstract:
In today’s world, business often depends on customer feedback and reviews. Sentiment analysis helps identify and extract information about the sentiment or emotion of a topic or document. Attribute selection is a challenging problem, especially with large datasets in actionable pattern mining algorithms. Action rule mining is one method for discovering actionable patterns in data. Action rules describe specific actions, in the form of conditions, that help achieve a desired outcome: they help change an undesirable or negative state to a more desirable or positive one. In this paper, we present a lexicon-based weighted-scheme approach to identify emotions from customer feedback data in the manufacturing business. We also use rough sets and explore attribute selection for large-scale datasets. We then apply actionable pattern mining to extract possible emotion-change recommendations. Such recommendations help business analysts improve their customer service, which leads to customer satisfaction and increased sales revenue.
Keywords: actionable pattern discovery, attribute selection, business data, data mining, emotion
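As an illustration of the lexicon-based weighted scheme, the sketch below scores emotions by summing per-word weights from a lexicon, with simple negation flipping; the mini-lexicon, weights and negation rule are assumptions for demonstration, not the authors' actual scheme.

```python
# Hedged sketch of a lexicon-based weighted emotion scorer.
# The mini-lexicon and weighting below are illustrative assumptions.
from collections import Counter

EMOTION_LEXICON = {
    "happy": ("joy", 1.0), "great": ("joy", 0.8),
    "angry": ("anger", 1.0), "terrible": ("anger", 0.7),
    "worried": ("fear", 0.9), "delay": ("sadness", 0.5),
}
NEGATIONS = {"not", "never", "no"}

def score_emotions(text: str) -> Counter:
    """Accumulate weighted emotion scores; a preceding negation flips the weight."""
    scores = Counter()
    tokens = text.lower().split()
    for i, tok in enumerate(tokens):
        if tok in EMOTION_LEXICON:
            emotion, weight = EMOTION_LEXICON[tok]
            if i > 0 and tokens[i - 1] in NEGATIONS:
                weight = -weight  # negated mention counts against the emotion
            scores[emotion] += weight
    return scores

print(score_emotions("The support team was great but the delay made me angry"))
# Counter({'anger': 1.0, 'joy': 0.8, 'sadness': 0.5})
```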
Procedia PDF Downloads 199
23152 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis
Authors: Shriya Shukla, Lachin Fernando
Abstract:
Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.
Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning
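A minimal transfer-learning sketch of the kind of lightweight MobileNetV2 classifier described, in tf.keras; the input size, frozen-backbone setup, learning rate and the `cxr_train/` directory are assumptions, and the VAE-GAN synthetic images would simply be mixed into that training folder.

```python
# Hedged sketch: MobileNetV2 backbone with a small binary head (tf.keras).
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze backbone; train only the classification head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs. pneumonia
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])

# train_ds would mix real CXR images with VAE-GAN synthetic normal cases
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cxr_train/", image_size=(224, 224), batch_size=32, label_mode="binary")
model.fit(train_ds, epochs=5)
```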
Procedia PDF Downloads 125
23151 An Improved Image Steganography Technique Based on Least Significant Bit Insertion
Authors: Olaiya Folorunsho, Comfort Y. Daramola, Joel N. Ugwu, Lawrence B. Adewole, Olufisayo S. Ekundayo
Abstract:
In today's world, there is a tremendous rise in the usage of the internet, since almost all communication and information sharing is done over the web. At the same time, there is continuous growth in unauthorized access to confidential data. This poses a challenge to information security experts, whose major goal is to curtail the menace. One approach to securing the delivery of data to the rightful destination without any modification is steganography: the art of hiding information inside other, cover information. This research paper aimed at designing a secure image steganographic algorithm that uses the Least Significant Bit (LSB) method to embed data into a bitmap image (bmp) in order to enhance security and reliability. In the LSB approach, the basic idea is to replace the LSBs of the pixels of the cover image with the bits of the message to be hidden, without significantly destroying the properties of the cover image. The system was implemented in the C# programming language on the Microsoft .NET framework. The performance of the proposed system was evaluated by benchmarking parameters such as Mean Squared Error (MSE) and Peak Signal to Noise Ratio (PSNR). The results showed that image steganography performed well in securing data hiding and information transmission over networks.
Keywords: steganography, image steganography, least significant bits, bit map image
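A minimal sketch of LSB embedding, extraction and PSNR evaluation; the paper's system was implemented in C#, so this NumPy version is an illustrative port rather than the authors' code.

```python
# Hedged sketch of LSB steganography with PSNR evaluation.
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # replace LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

def psnr(cover, stego):
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in for a bmp
stego = embed_lsb(cover, b"secret")
assert extract_lsb(stego, 6) == b"secret"
print(f"PSNR: {psnr(cover, stego):.2f} dB")  # high PSNR means little distortion
```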
Procedia PDF Downloads 266
23150 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change
Authors: Ali Razmi, Saeed Golian
Abstract:
Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. Long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data of a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of individual events. The trend analysis showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
Keywords: climate change, climate variables, copula, joint probability
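A sketch of the (non-partial) Mann-Kendall test on a synthetic series; the partial variant used in the study additionally conditions on a covariate, which is not reproduced here.

```python
# Hedged sketch of the Mann-Kendall trend test (no-ties variance).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))             # two-sided p-value
    return z, p

rng = np.random.default_rng(0)
annual_max_level = np.arange(50) * 0.02 + rng.normal(0, 0.5, 50)  # synthetic upward trend
z, p = mann_kendall(annual_max_level)
print(f"Z = {z:.2f}, p = {p:.4f}")  # small p indicates a significant monotonic trend
```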
Procedia PDF Downloads 360
23149 Estimation of Source Parameters Using Source Parameters Imaging Method From Digitised High Resolution Airborne Magnetic Data of a Basement Complex
Authors: O. T. Oluriz, O. D. Akinyemi, J. A.Olowofela, O. A. Idowu, S. A. Ganiyu
Abstract:
This study was carried out using aeromagnetic data, which record variations in the magnitude of the Earth's magnetic field in order to detect local changes in the properties of the underlying geology. The aeromagnetic data (Sheet No. 261), acquired in 2009, were obtained from the archives of the Nigeria Geological Survey Agency. The study presents the estimation of source parameters within an area of about 3,025 square kilometers covering Ibadan and its environs in Oyo State, southwestern Nigeria. The area under study belongs to part of the basement complex of southwestern Nigeria. Estimation of the source parameters of the aeromagnetic data was achieved through the application of the source parameters imaging (SPI) technique, which provides the delineation, depth, dip, contact, susceptibility contrast and mineral potential of magnetic signatures within the region. The depth to the magnetic sources in the area ranges from 0.675 km to 4.48 km. The estimated depth limit to shallow sources is 0.695 km and the depth to deep sources is 4.48 km. The apparent susceptibility values obtained for the entire study area range from 0.005 to 0.01 [SI]. This study has shown that the magnetic susceptibility within the study area is controlled mainly by superparamagnetic minerals.
Keywords: aeromagnetic, basement complex, meta-sediment, precambrian
Procedia PDF Downloads 430
23148 FRATSAN: A New Software for Fractal Analysis of Signals
Authors: Hamidreza Namazi
Abstract:
Fractal analysis assesses the fractal characteristics of data. It comprises several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: fractal dimension, Jeffrey's measure and the Hurst exponent. After computing these measures, the software plots a graph for each of them. Besides computing the three measures, the software can also classify whether or not a signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures: a sliding window is selected with a size equal to 10% of the total number of data entries, and this window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of it. In order to test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey's measure
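A sketch of the sliding-window idea with a rescaled-range (R/S) Hurst estimator; the window is 10% of the data and moves one entry at a time, as described, while the estimator details and the synthetic signal are assumptions.

```python
# Hedged sketch of a sliding-window Hurst exponent via R/S analysis.
import numpy as np

def hurst_rs(x):
    """Estimate H from the slope of log(R/S) versus log(lag)."""
    x = np.asarray(x, dtype=float)
    lags = np.unique(np.logspace(0.5, np.log10(len(x) // 2), 10).astype(int))
    rs = []
    for lag in lags:
        vals = []
        for i in range(0, len(x) - lag + 1, lag):
            c = x[i:i + lag]
            dev = np.cumsum(c - c.mean())       # cumulative deviation from mean
            s = c.std()
            if s > 0:
                vals.append((dev.max() - dev.min()) / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(lags), np.log(rs), 1)
    return slope

signal = np.random.default_rng(1).normal(size=2000)  # white noise, H ~ 0.5
window = max(20, len(signal) // 10)                  # 10% of the data entries
hurst_track = [hurst_rs(signal[i:i + window])
               for i in range(0, len(signal) - window)]  # moved one entry at a time
print(f"mean windowed H = {np.mean(hurst_track):.2f}")
```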
Procedia PDF Downloads 467
23147 An Investigation of Differential Item and Test Functioning of Scholastic Aptitude Test 2011 (SWUSAT 2011)
Authors: Ruangdech Sirikit
Abstract:
The purposes of this study were to analyze the differential item functioning and differential test functioning of the SWUSAT aptitude test, classified by the sex variable. The data used in this research are secondary data from the Srinakharinwirot University Scholastic Aptitude Test 2011 (SWUSAT 2011). The SWUSAT test consists of four subtests: verbal ability, number ability, reasoning ability and spatial ability. The data analysis was carried out in two steps. The first step was descriptive statistical analysis. In the second step, differential item functioning (DIF) and differential test functioning (DTF) were analyzed using the DIFAS program. The research results were as follows. Across all 10 tests administered in 2011, sex was the characteristic for which DIF was found in all 10 tests. The percentage of items exhibiting DIF ranged from 10% to 46.67%. In 4 tests, most items favored the female group; in 3 tests, most items favored the male group; and in 3 tests, the number of items favoring the female group equaled the number favoring the male group. For differential test functioning (DTF), 8 tests showed small DIF effect variance.
Keywords: differential item functioning, differential test functioning, SWUSAT, aptitude test
Procedia PDF Downloads 611
23146 Privacy Preservation Concerns and Information Disclosure on Social Networks: An Ongoing Research
Authors: Aria Teimourzadeh, Marc Favier, Samaneh Kakavand
Abstract:
The emergence of social networks has revolutionized the exchange of information. Every behavior on these platforms contributes to the generation of data known as social network data, which are processed, stored and published by social network service providers. It is therefore vital to investigate the role these platforms play in handling user data and the privacy measures they apply, especially as a growing number of individuals and organizations engage with virtual platforms without being aware that data related to their positioning, connections and behavior are uncovered and used by third parties. Performing analytics on social network datasets may result in the disclosure of confidential information about the individuals or organizations that are members of these virtual environments. Analyzing separate datasets can reveal private information about relationships, interests and more, especially when the datasets are analyzed jointly; intentional breaches of privacy can be the result of such analysis. Addressing these privacy concerns requires an understanding of the nature of the data being accumulated and the relevant data privacy regulations, as well as the motivations for disclosure of personal information on social network platforms. This paper highlights how users' online information is controlled by the influence of social factors and to what extent users are concerned about the future use of their personal information by organizations. Firstly, this research presents a short literature review on the structure of a network and the concept of privacy in online social networks. Secondly, the factors of user behavior related to privacy protection and self-disclosure in these virtual communities are presented; in other words, we seek to demonstrate the impact of the identified variables on user information disclosure, which could be taken into account to explain the privacy preservation of individuals on social networking platforms. Thirdly, a few research directions are discussed to help new researchers address this topic.
Keywords: information disclosure, privacy measures, privacy preservation, social network analysis, user experience
Procedia PDF Downloads 281
23145 The Current Status of Middle Class Internet Use in China: An Analysis Based on the Chinese General Social Survey 2015 Data and Semi-Structured Investigation
Authors: Abigail Qian Zhou
Abstract:
In today's China, the well-educated middle class, with stable jobs and above-average income, is the driving force behind its Internet society. Through the analysis of data from the 2015 Chinese General Social Survey and 50 interviewees, this study investigates this group's specific internet usage. The findings demonstrate that daily life among the members of this socioeconomic group is closely tied to the Internet. For the Chinese middle class, the Internet is used to socialize and to entertain themselves and others. It is also used to search for and share information, as well as to build their identities. The empirical results of this study will provide a reference, supported by factual data, for enterprises seeking to target the Chinese middle class through online marketing efforts.
Keywords: middle class, Internet use, network behaviour, online marketing, China
Procedia PDF Downloads 121
23144 Nowcasting Indonesian Economy
Authors: Ferry Kurniawan
Abstract:
In this paper, we nowcast quarterly output growth in Indonesia by exploiting higher-frequency monthly indicators with a mixed-frequency factor model that uses both quarterly and monthly data. Nowcasting quarterly GDP is particularly relevant for the central bank of Indonesia, which sets the policy rate at its monthly Board of Governors Meeting; one important step in that process is the assessment of the current state of the economy. Thus, having an accurate and up-to-date quarterly GDP nowcast every time new monthly information becomes available would clearly be of interest to the central bank of Indonesia, since the initial assessment of the current state of the economy, including the nowcast, is used as input for longer-term forecasts. We consider a small-scale mixed-frequency factor model to produce nowcasts. In particular, we specify variables as year-on-year growth rates, so the relation between quarterly and monthly data is expressed in year-on-year growth rates. To assess the performance of the model, we compare the nowcasts with two other approaches: an autoregressive model, which is often a difficult benchmark to beat when forecasting output growth, and Mixed Data Sampling (MIDAS) regression. Both the mixed-frequency factor model and the MIDAS nowcasts are produced by exploiting the same set of monthly indicators, so we compare the nowcast performance of the two approaches directly. To preview the results, we find that by exploiting monthly indicators using the mixed-frequency factor model and MIDAS regression, we improve nowcast accuracy over a benchmark simple autoregressive model that uses only quarterly data. However, it is not clear whether MIDAS or the mixed-frequency factor model is better: neither set of nowcasts encompasses the other, suggesting that both are valuable in nowcasting GDP but neither is sufficient. By combining the two individual nowcasts, we find that the nowcast combination not only increases accuracy relative to the individual nowcasts but also lowers the risk of the worst performance of the individual nowcasts.
Keywords: nowcasting, mixed-frequency data, factor model, nowcasts combination
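A toy illustration of the closing point, combining two nowcasts by inverse-MSE weights on synthetic data; the weighting rule is a common convention and an assumption here, not necessarily the combination scheme used in the paper.

```python
# Hedged sketch: inverse-MSE nowcast combination on synthetic data.
import numpy as np

rng = np.random.default_rng(2)
actual = rng.normal(5.0, 1.0, 40)              # y/y GDP growth, %
factor_nc = actual + rng.normal(0, 0.6, 40)    # mixed-frequency factor model nowcast
midas_nc = actual + rng.normal(0, 0.7, 40)     # MIDAS regression nowcast

def mse(pred, y):
    return np.mean((pred - y) ** 2)

w_factor = 1 / mse(factor_nc, actual)
w_midas = 1 / mse(midas_nc, actual)
combo = (w_factor * factor_nc + w_midas * midas_nc) / (w_factor + w_midas)

for name, pred in [("factor", factor_nc), ("MIDAS", midas_nc), ("combined", combo)]:
    print(f"{name:>8} RMSE: {np.sqrt(mse(pred, actual)):.3f}")
# with independent errors, the combination typically beats both components
```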
Procedia PDF Downloads 331
23143 Real-Time Image Encryption Using a 3D Discrete Dual Chaotic Cipher
Authors: M. F. Haroun, T. A. Gulliver
Abstract:
In this paper, an encryption algorithm is proposed for real-time image encryption. The scheme employs a dual chaotic generator based on a three dimensional (3D) discrete Lorenz attractor. Encryption is achieved using non-autonomous modulation where the data is injected into the dynamics of the master chaotic generator. The second generator is used to permute the dynamics of the master generator using the same approach. Since the data stream can be regarded as a random source, the resulting permutations of the generator dynamics greatly increase the security of the transmitted signal. In addition, a technique is proposed to mitigate the error propagation due to the finite precision arithmetic of digital hardware. In particular, truncation and rounding errors are eliminated by employing an integer representation of the data which can easily be implemented. The simple hardware architecture of the algorithm makes it suitable for secure real-time applications.
Keywords: chaotic systems, image encryption, non-autonomous modulation, FPGA
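An illustrative stand-in, not the authors' cipher: an Euler-discretized Lorenz map run in 16.16 fixed-point integer arithmetic and used as a plain XOR keystream. The paper instead injects the data into the master generator's dynamics (non-autonomous modulation) and adds a second permuting generator; the point the sketch does capture is that integer arithmetic makes the dynamics bit-exact reproducible on any hardware, avoiding truncation and rounding drift.

```python
# Hedged sketch: fixed-point discrete Lorenz map as an XOR keystream.
S = 1 << 16  # 16.16 fixed-point scale

def lorenz_step(x, y, z, dt=S // 100):  # dt = 0.01 in fixed point
    dx = (10 * (y - x)) * dt // S
    dy = (((28 * S - z) * x) // S - y) * dt // S
    dz = ((x * y) // S - 8 * z // 3) * dt // S
    return x + dx, y + dy, z + dz

def keystream(n, seed=(S, S, S)):
    x, y, z = seed
    for _ in range(200):                  # discard the transient
        x, y, z = lorenz_step(x, y, z)
    out = bytearray()
    for _ in range(n):
        x, y, z = lorenz_step(x, y, z)
        out.append(((x ^ y ^ z) >> 4) & 0xFF)  # mix state into one byte
    return bytes(out)

plain = b"real-time image block"
ks = keystream(len(plain))
cipher = bytes(p ^ k for p, k in zip(plain, ks))
assert bytes(c ^ k for c, k in zip(cipher, ks)) == plain  # decryption is symmetric
```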
Procedia PDF Downloads 506
23142 A Secure System for Handling Information from Heterogeneous Sources
Authors: Shoohira Aftab, Hammad Afzal
Abstract:
Information integration is a well-known procedure for providing a consolidated view of sets of heterogeneous information sources. It not only enables better statistical analysis of information but also allows users to query without any knowledge of the underlying heterogeneous information sources. The problem of providing a consolidated view of information can be handled using semantic data (information stored in such a way that it is understandable by machines and can be integrated without manual human intervention). However, integrating information using semantic web technology without enforcing any access management will result in increased privacy and confidentiality concerns. In this research, we have designed and developed a framework that allows information from heterogeneous formats to be consolidated, thus resolving the issue of interoperability. We have also devised an access control system for defining explicit privacy constraints. We designed and applied our framework on both semantic and non-semantic data from heterogeneous resources. Our approach is validated using scenario-based testing.
Keywords: information integration, semantic data, interoperability, security, access control system
Procedia PDF Downloads 357
23141 Refractive Index, Excess Molar Volume and Viscometric Study of Binary Liquid Mixture of Morpholine with Cumene at 298.15 K, 303.15 K, and 308.15 K
Authors: B. K. Gill, Himani Sharma, V. K. Rattan
Abstract:
Experimental data on the refractive index, excess molar volume and viscosity of the binary mixture of morpholine with cumene over the whole composition range at 298.15 K, 303.15 K and 308.15 K and normal atmospheric pressure have been measured. The experimental data were used to compute the density, deviation in molar refraction, deviation in viscosity and excess Gibbs free energy of activation as functions of composition. The experimental viscosity data have been correlated with empirical equations such as Grunberg-Nissan, the Heric correlation and the three-body McAllister equation. The excess thermodynamic properties were fitted to the Redlich-Kister polynomial equation. The variation of these properties with the composition and temperature of the binary mixtures is discussed in terms of intermolecular interactions.
Keywords: cumene, excess Gibbs free energy, excess molar volume, morpholine
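A sketch of fitting the Redlich-Kister polynomial V^E = x1*x2*sum_k A_k*(x1 - x2)^k to excess-property data by linear least squares; the data points are synthetic, not the measured morpholine + cumene values, and the polynomial order is an assumption.

```python
# Hedged sketch of a Redlich-Kister fit by linear least squares.
import numpy as np

x1 = np.linspace(0.05, 0.95, 10)             # mole fraction of component 1
x2 = 1 - x1
vE = x1 * x2 * (-1.2 + 0.3 * (x1 - x2))      # synthetic "measurements"
vE += np.random.default_rng(3).normal(0, 0.002, x1.size)

order = 3
# design matrix: columns x1*x2*(x1 - x2)^k for k = 0..order
X = np.column_stack([x1 * x2 * (x1 - x2) ** k for k in range(order + 1)])
A, *_ = np.linalg.lstsq(X, vE, rcond=None)
print("Redlich-Kister coefficients:", np.round(A, 3))
print("std. dev. of fit:", np.round(np.sqrt(np.mean((X @ A - vE) ** 2)), 4))
```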
Procedia PDF Downloads 328
23140 Anthropometric Data Variation within Gari-Frying Population
Authors: T. M. Samuel, O. O. Aremu, I. O. Ismaila, L. I. Onu, B. O. Adetifa, S. E. Adegbite, O. O. Olokoshe
Abstract:
The imperative of anthropometry in designing to fit cannot be overemphasized. Of essence is the variability of measurements within the population for which data are collected. In this paper, anthropometric data were collected for the design of a gari-frying facility, such that the work system would be designed to fit the gari-frying population in the southwestern states of Nigeria, comprising Lagos, Ogun, Oyo, Osun, Ondo and Ekiti. Twenty-seven body dimensions were measured on 120 gari-frying processors. Statistical analysis was performed using the SPSS package to determine the mean, standard deviation, minimum value, maximum value and percentiles (2nd, 5th, 25th, 50th, 75th, 95th and 98th) of the different anthropometric parameters. A one-sample t-test was conducted to determine the variation within the population. The 50th percentiles of some of the anthropometric parameters were compared with those of other populations in the literature. The correlation between a worker's age and body anthropometry was also investigated. The mean weight, height, shoulder height (sitting), eye height (standing) and eye height (sitting) are 63.37 kg, 1.57 m, 0.55 m, 1.45 m and 0.67 m respectively. Results also show a high correlation with other populations and a statistically significant difference in the variability of data within the population for all the body dimensions measured. With a mean age of 42.36 years, the results show that age would be a poor indicator for estimating the anthropometry of this population.
Keywords: anthropometry, cassava processing, design to fit, gari-frying, workstation design
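A sketch of the percentile summary that underpins design-to-fit work; the stature values are synthetic stand-ins for the survey measurements.

```python
# Hedged sketch of an anthropometric percentile summary.
import numpy as np

rng = np.random.default_rng(4)
stature_m = rng.normal(1.57, 0.06, 120)  # 120 processors, mean height 1.57 m (synthetic)

pcts = [2, 5, 25, 50, 75, 95, 98]
summary = {f"P{p}": round(float(np.percentile(stature_m, p)), 3) for p in pcts}
print(f"mean={stature_m.mean():.3f} m, sd={stature_m.std(ddof=1):.3f} m")
print(summary)  # e.g. a workstation sized to the P5-P95 range covers 90% of users
```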
Procedia PDF Downloads 253
23139 Discovering Event Outliers for Drug as Commercial Products
Authors: Arunas Burinskas, Aurelija Burinskiene
Abstract:
On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period that is often too long. A critical issue, therefore, is that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest predicting average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The research sample is 100 drugs over a one-month time frame. The authors found that products with high demand variability had outliers. Among the analyzed drugs (commercial products): i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with medium demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, the real sales data must be reconstructed. Even though some outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use the Grubbs test, a real-life data cleaning method, for outlier adjustment. In this paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, the Grubbs test has been used to detect outliers for cancer drugs, with positive results reported. The Grubbs test detects outliers that exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test statistic measures the difference between the sample mean and the most extreme data point relative to the standard deviation, detecting one outlier at a time from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs test method. The suggested framework is applicable during shortage events and recovery periods, has practical value, and could be used to minimize the recovery period required after a shortage event occurs.
Keywords: drugs, Grubbs' test, outlier, shortage event
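A sketch of an iterative two-sided Grubbs test at the 99% confidence level on a synthetic daily-sales series; the critical-value formula is the standard one, while the data and the iterate-until-clean loop are illustrative assumptions.

```python
# Hedged sketch of the Grubbs test for sales-data cleaning (alpha = 0.01).
import numpy as np
from scipy import stats

def grubbs_outliers(x, alpha=0.01):
    """Iteratively remove the single most extreme point while significant."""
    x = list(map(float, x))
    removed = []
    while len(x) > 2:
        arr = np.asarray(x)
        mean, sd = arr.mean(), arr.std(ddof=1)
        idx = int(np.argmax(np.abs(arr - mean)))
        G = abs(arr[idx] - mean) / sd                 # Grubbs statistic
        n = len(arr)
        t = stats.t.ppf(1 - alpha / (2 * n), n - 2)   # two-sided critical t
        G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
        if G > G_crit:
            removed.append(x.pop(idx))
        else:
            break
    return x, removed

daily_sales = [12, 14, 11, 13, 15, 12, 14, 95, 13, 12]  # 95: shortage-recovery spike
clean, outliers = grubbs_outliers(daily_sales)
print("outliers:", outliers)  # -> [95.0]
```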
Procedia PDF Downloads 134
23138 The Development of Research Based Model to Enhance Critical Thinking, Cognitive Skills and Culture and Local Wisdom Knowledge of Undergraduate Students
Authors: Nithipattara Balsiri
Abstract:
The purpose of this research was to develop an instructional model using research-based learning to enhance the critical thinking, cognitive skills, and culture and local wisdom knowledge of undergraduate students. The sample consisted of 307 undergraduate students. Critical thinking and cognitive skills tests were employed for data collection. Second-order confirmatory factor analysis, t-tests, and one-way analysis of variance were employed for data analysis using the SPSS and LISREL programs. The major research results were as follows: 1) the instructional model using research-based learning to enhance critical thinking, cognitive skills, and culture and local wisdom knowledge should consist of 6 sequential steps, namely (1) setting the research problem, (2) setting the research hypothesis, (3) data collection, (4) data analysis, (5) drawing research conclusions, and (6) applying the results to problem solving; and 2) after the treatment, undergraduate students had higher scores in critical thinking and cognitive skills than before the treatment at the 0.05 level of significance.
Keywords: critical thinking, cognitive skills, culture and local wisdom knowledge
Procedia PDF Downloads 366
23137 A Case Study of Control of Blast-Induced Ground Vibration on Adjacent Structures
Authors: H. Mahdavinezhad, M. Labbaf, H. R. Tavakoli
Abstract:
In recent decades, the study and control of the destructive effects of blast-induced vibration in construction projects have received more attention, and several empirical equations for vibration prediction, as well as allowable vibration limits for various structures, have been presented. Researchers have developed a number of empirical equations to estimate the peak particle velocity (PPV), in which the empirical constants must be obtained at the site of the explosion by fitting data from experimental blasts. In this study, the most important of these equations were evaluated for the strong massive conglomerates around Dez Dam by collecting blast data, including 30 particle velocities, 27 displacements, 27 vibration frequencies and 27 ground vibration accelerations at different distances, recorded for two types of detonation systems, NONEL and electric. Analysis showed that the data from the blasts correlated best with the cube root of the explosive charge (R² = 0.8636), although overall the correlation coefficients do not differ much. To estimate vibration for this project, the data were also regressed in other forms, which resulted in a new equation with a correlation coefficient of R² = 0.904. Finally, given the importance of the studied structures and in order to ensure no damage to adjacent structures, a range of application was defined for each equation: for distances of 0 to 70 meters from the blast site, the exponent n = 0.33 was suggested, and for distances of more than 70 m, n = 0.66.
Keywords: blasting, blast-induced vibration, empirical equations, PPV, tunnel
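A sketch of fitting the cube-root scaled-distance attenuation law PPV = K*(R/Q^(1/3))^(-n) by regression in log-log space; the blast records are synthetic and the functional form is the conventional one, not necessarily the exact equation adopted for Dez Dam.

```python
# Hedged sketch of a scaled-distance PPV attenuation fit.
import numpy as np

rng = np.random.default_rng(5)
R = rng.uniform(20, 300, 30)           # distance to blast, m
Q = rng.uniform(50, 500, 30)           # charge per delay, kg
sd = R / np.cbrt(Q)                    # cube-root scaled distance
ppv = 400 * sd**-1.4 * rng.lognormal(0, 0.15, 30)  # synthetic PPV, mm/s

slope, intercept = np.polyfit(np.log(sd), np.log(ppv), 1)  # linear in log-log space
K, n = np.exp(intercept), -slope
pred = K * sd**-n
r2 = 1 - np.sum((np.log(ppv) - np.log(pred))**2) / np.sum(
    (np.log(ppv) - np.log(ppv).mean())**2)
print(f"K = {K:.1f}, n = {n:.2f}, R^2 = {r2:.3f}")
```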
Procedia PDF Downloads 131
23136 Development of a System for Fitting Clothes and Accessories Using Augmented Reality
Authors: Dinmukhamed T., Vassiliy S.
Abstract:
This article proposes a system for fitting clothes and accessories based on augmented reality. A logical data model has been developed, taking into account a decision-making module (colors, style, type, material, popularity, etc.) based on personal data (age, gender, weight, height, leg size, hoist length, geolocation, photogrammetry, number of purchases of certain types of clothing, etc.) and statistical data from the purchase history (number of items, price, size, color, style, etc.). To provide information to the user, it is also planned to develop an augmented reality system using a QR code. This augmented-reality-based system for selecting and fitting clothing and accessories will be used in stores to reduce the time a buyer needs to decide on a choice of clothes.
Keywords: augmented reality, online store, decision-making module, QR code, clothing store, queue
Procedia PDF Downloads 157
23135 Problem Solving: Process or Product? A Mathematics Approach to Problem Solving in Knowledge Management
Authors: A. Giannakopoulos, S. B. Buckley
Abstract:
Problem solving in any field is recognised as a prerequisite for any advancement in knowledge. In South Africa, for example, it is one of the seven critical outcomes of education, together with critical thinking. Since a systematic approach to problem solving was initiated in mathematics by the great mathematician George Polya (the father of problem solving), more detailed and comprehensive approaches to problem solving have been developed. This paper is based on the authors' findings and subsequent recommendations for further research in problem solving and critical thinking. Although the study was done in mathematics, there is no doubt by now in almost anyone's mind that mathematics is involved to a greater or lesser extent in all fields, from symbols, to variables, to equations, to logic, to critical thinking. Therefore it stands to reason that mathematical principles and learning cannot be divorced from any field. In knowledge management situations, the types of problems are similar to mathematics problems, varying from simple to analogical to complex, and from well-structured to ill-structured problems. While simple problems can be solved by employees adhering to prescribed sequential steps (the process), analogical and complex problems cannot be proceduralised, and that diminishes the organisation's capacity for knowledge creation and innovation. The low efficiency in some organisations and the low pass rates in mathematics prompted the authors to view problem solving as a product. The authors argue that using mathematical approaches to problem solving in knowledge management, and treating problem solving as a product, will empower employees through further training to tackle analogical and complex problems. The question the authors asked was: if it is true that problem solving and critical thinking are indeed basic skills necessary for the advancement of knowledge, why is there so little knowledge management (KM) literature about them, how they are connected, and how they advance KM? This paper concludes with a conceptual model based on generally accepted principles of knowledge acquisition (developing a learning organisation) and of knowledge creation, sharing, dissemination and storage, the five pillars of knowledge management (KM). The model also expands on Gray's framework on KM practices and problem solving, and opens the door to a new approach to training employees in general and domain-specific problem areas, which can be adapted in any type of organisation.
Keywords: critical thinking, knowledge management, mathematics, problem solving
Procedia PDF Downloads 596
23134 Development of Multi-Leaf Collimator-Based Isocenter Verification Tool Using Electronic Portal Imaging Device for Stereotactic Radiosurgery
Authors: Panatda Intanin, Sangutid Thongsawad, Chirapha Tannanonta, Todsaporn Fuangrod
Abstract:
Stereotactic radiosurgery (SRS) is a high-precision delivery technique that requires comprehensive quality assurance (QA) tests prior to treatment delivery. The isocenter of the delivery beam plays a critical role in treatment accuracy. Isocenter uncertainty is traditionally assessed using circular cone equipment, a Winston-Lutz (WL) phantom and film, a technique that is considered time-consuming and highly dependent on the observer. In this work, the development of a multileaf collimator (MLC)-based isocenter verification tool using an electronic portal imaging device (EPID) was proposed and evaluated. In the conventional WL test, a ball bearing of 5 mm diameter was aligned to the mechanical isocenter and a circular cone of 10 mm diameter fixed to the gantry head defined the radiation field. This conventional setup was compared to the proposed setup, which uses the MLC (10 x 10 mm) to define the radiation field instead of the cone, representing a more realistic delivery field than circular cone equipment. Image acquisition with the EPID and with radiographic film was performed in both experiments. The gantry angles were set to 0°, 90°, 180° and 270°. A software tool was developed in-house using MATLAB/Simulink to determine the centroid of the radiation field and the shadow of the WL phantom automatically, which offers higher accuracy than manual measurement. The deviations between the centroids of the cone-based and MLC-based WL tests were quantified. Comparing film with EPID images, the deviation over all gantry angles was 0.26 ± 0.19 mm for the cone-based and 0.43 ± 0.30 mm for the MLC-based WL test. The absolute deviation between the cone-based and MLC-based WL tests was 0.59 ± 0.28 mm on EPID images and 0.14 ± 0.13 mm on film images. Therefore, MLC-based isocenter verification using the EPID presents a highly sensitive tool for SRS QA.
Keywords: isocenter verification, quality assurance, EPID, SRS
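A sketch of the automatic centroid step: an intensity-weighted centre of mass over a thresholded EPID field image; the synthetic super-Gaussian field, the threshold and the assumed isocenter projection are illustrative, not the authors' MATLAB/Simulink implementation.

```python
# Hedged sketch of radiation-field centroid detection on an EPID-like image.
import numpy as np

def field_centroid(img, threshold=0.5):
    """Centroid (row, col) of pixels above threshold * max, intensity-weighted."""
    mask = img > threshold * img.max()
    weights = np.where(mask, img, 0.0)
    total = weights.sum()
    rows, cols = np.indices(img.shape)
    return (rows * weights).sum() / total, (cols * weights).sum() / total

# synthetic flat-topped 10 x 10 mm field, centred slightly off the image centre
y, x = np.mgrid[0:512, 0:512]
img = np.exp(-(((x - 259.0) / 20) ** 8 + ((y - 256.5) / 20) ** 8))

cy, cx = field_centroid(img)
iso_y, iso_x = 256.0, 256.0  # planned isocenter projection (assumed)
dev_px = np.hypot(cy - iso_y, cx - iso_x)
print(f"centroid deviation: {dev_px:.2f} px")  # multiply by pixel pitch for mm
```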
Procedia PDF Downloads 152
23133 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI
Authors: Genady Grabarnik, Serge Yaskolko
Abstract:
Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate some areas that were very difficult to automate before. This paper describes the introduction of generative AI into introductory computer and data science courses and analyzes the effect of that introduction. Generative AI is incorporated into the educational process in two ways. For instructors, we create prompt templates for generating tasks and for grading students' work, including feedback on submitted assignments. For students, we introduce basic prompt engineering, which in turn is used for generating test cases from problem descriptions, generating code snippets for single-block programming problems, and partitioning average-complexity programming problems into such blocks. The classes are run using Large Language Models, and feedback from instructors and students as well as course outcomes are collected. The analysis shows a statistically significant positive effect and a preference for the approach among both groups of stakeholders.
Keywords: introductory computer and data science education, generative AI, large language models, application of LLMS to computer and data science education
Procedia PDF Downloads 58
23132 Re-identification Risk and Mitigation in Federated Learning: Human Activity Recognition Use Case
Authors: Besma Khalfoun
Abstract:
In many current Human Activity Recognition (HAR) applications, users' data is frequently shared and centrally stored by third parties, posing a significant privacy risk. This practice makes these entities attractive targets for extracting sensitive information about users, including their identity, health status, and location, thereby directly violating users' privacy. To tackle the issue of centralized data storage, a relatively recent paradigm known as federated learning has emerged. In this approach, users' raw data remains on their smartphones, where they train the HAR model locally. However, users still share updates of their local models originating from raw data. These updates are vulnerable to several attacks designed to extract sensitive information, such as determining whether a data sample is used in the training process, recovering the training data with inversion attacks, or inferring a specific attribute or property from the training data. In this paper, we first introduce PUR-Attack, a parameter-based user re-identification attack developed for HAR applications within a federated learning setting. It involves associating anonymous model updates (i.e., local models' weights or parameters) with the originating user's identity using background knowledge. PUR-Attack relies on a simple yet effective machine learning classifier and produces promising results. Specifically, we have found that by considering the weights of a given layer in a HAR model, we can uniquely re-identify users with an attack success rate of almost 100%. This result holds when considering a small attack training set and various data splitting strategies in the HAR model training. Thus, it is crucial to investigate protection methods to mitigate this privacy threat. Along this path, we propose SAFER, a privacy-preserving mechanism based on adaptive local differential privacy. Before sharing the model updates with the FL server, SAFER adds the optimal noise based on the re-identification risk assessment. Our approach can achieve a promising tradeoff between privacy, in terms of reducing re-identification risk, and utility, in terms of maintaining acceptable accuracy for the HAR model.
Keywords: federated learning, privacy risk assessment, re-identification risk, privacy preserving mechanisms, local differential privacy, human activity recognition
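A sketch of the general protection idea, clipping a model update and adding Laplace noise before sharing; SAFER's adaptive, risk-driven noise calibration is not reproduced, and the clipping bound and epsilon are assumptions.

```python
# Hedged sketch of Laplace-noised model updates (local-DP style).
import numpy as np

def privatize_update(weights, clip=1.0, epsilon=1.0, rng=None):
    """Clip the update to L1 norm `clip`, then add Laplace(clip/epsilon) noise."""
    rng = rng or np.random.default_rng()
    w = np.asarray(weights, dtype=float)
    norm = np.abs(w).sum()
    if norm > clip:
        w = w * (clip / norm)          # bound each user's contribution
    noise = rng.laplace(0.0, clip / epsilon, size=w.shape)
    return w + noise

layer_update = np.random.default_rng(6).normal(0, 0.05, 64)  # one HAR layer's weights
shared = privatize_update(layer_update, clip=1.0, epsilon=2.0)
print(f"L1 before clip: {np.abs(layer_update).sum():.2f}, "
      f"noise scale: {1.0 / 2.0:.2f}")  # smaller epsilon => more noise, more privacy
```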
Procedia PDF Downloads 11
23131 Blockchain for Transport: Performance Simulations of Blockchain Network for Emission Monitoring Scenario
Authors: Dermot O'Brien, Vasileios Christaras, Georgios Fontaras, Igor Nai Fovino, Ioannis Kounelis
Abstract:
With the rise of the Internet of Things (IoT), 5G, and blockchain (BC) technologies, vehicles are becoming ever more connected and are already transmitting substantial amounts of data to the original equipment manufacturers' (OEMs) servers. This data could be used to help detect mileage fraud and enable more accurate vehicle emissions monitoring. This would not only help regulators but could also enable applications such as permitting efficient drivers to pay less tax, geofencing for air quality improvement, and pollution tolling and trading platforms for transport-related businesses and EU citizens. Other applications could include traffic management and shared mobility systems. BC enables the transmission of data with additional security and removes single points of failure, while maintaining data provenance and identity ownership and offering the possibility to retain varying levels of privacy depending on the requirements of the applied use case. This research performs simulations of vehicles interacting with European member state authorities and European Commission BC nodes running Hyperledger Fabric, and explores whether the technology is currently feasible for transport applications such as the emission monitoring use case.
Keywords: future transportation systems, technological innovations, policy approaches for transportation future, economic and regulatory trends, blockchain
Procedia PDF Downloads 176
23130 DURAFILE: A Collaborative Tool for Preserving Digital Media Files
Authors: Santiago Macho, Miquel Montaner, Raivo Ruusalepp, Ferran Candela, Xavier Tarres, Rando Rostok
Abstract:
During our lives, we generate a lot of personal information, such as photos, music, text documents and videos, that links us with our past. This data, which used to be tangible, is now digital information stored on our computers, which implies a dependence on software to keep it accessible in the future. Technology, however, constantly evolves and goes through regular shifts, quickly rendering various file formats obsolete. The need to access data in the future affects not only personal users but also organizations. In a digital environment, a reliable preservation plan and the ability to adapt to fast-changing technology are essential for maintaining data collections in the long term. We present in this paper the European FP7 project DURAFILE, which provides the technology to preserve media files for personal users and organizations while maintaining their quality.
Keywords: artificial intelligence, digital preservation, social search, digital preservation plans
Procedia PDF Downloads 445
23129 Constructing a Semi-Supervised Model for Network Intrusion Detection
Authors: Tigabu Dagne Akal
Abstract:
While advances in computer and communications technology have made networks ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases (KDD) process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data were then pre-processed: the major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating data containing both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation such as discretization. A total of 21,533 intrusion records were used for training the models. To validate the performance of the selected model, a separate set of 3,397 records was used for testing. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and default parameter values showed the best classification accuracy: it has a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset in classifying new instances into the normal, DOS, U2R, R2L and probe classes. The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are suggested toward an applicable system in the area of study.
Keywords: intrusion detection, data mining, computer science
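A sketch of the modelling step with scikit-learn's CART tree standing in for Weka's J48 (C4.5) alongside naive Bayes, under 10-fold cross-validation; the feature matrix and labels are synthetic, not the MIT Lincoln Laboratory records.

```python
# Hedged sketch: tree vs. naive Bayes under 10-fold cross-validation.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(2000, 15))  # stand-in for preprocessed connection features
y = rng.choice(["normal", "DOS", "U2R", "R2L", "probe"], size=2000,
               p=[0.6, 0.25, 0.02, 0.05, 0.08])

for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("naive Bayes", GaussianNB())]:
    acc = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: {acc.mean():.3f} +/- {acc.std():.3f} (10-fold CV)")
```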
Procedia PDF Downloads 296
23128 Academic Leadership Succession Planning Practice in Nigeria Higher Education Institutions: A Case Study of Colleges of Education
Authors: Adie, Julius Undiukeye
Abstract:
This research investigated the practice of academic leadership succession planning in Nigerian higher education institutions, drawing on the lived experiences of the academic staff of the case study institutions. It is a multi-case study that adopts a qualitative research method, with ten participants (mainly academic staff) as the study sample. The study was guided by four research questions; semi-structured interviews and archival information from official documents formed the sources of data. The data collected were analyzed using the Constant Comparative Technique (CCT) to generate empirical insights and facts on the subject of this paper. The following findings emerged from the data analysis: firstly, there was no formalized leadership succession plan in place in the institutions sampled for this study; secondly, despite the absence of a formal succession plan, the data indicate that academics believe succession planning is very significant for institutional survival; thirdly, existing succession planning practices in the sampled institutions take the forms of job seniority ranking, political processes and executive fiat, ad-hoc arrangements, and external hiring; and finally, the data revealed some barriers to the practice of succession planning, such as the characteristics of traditional higher education institutions (e.g. external talent searches, shared governance, and diversity and equality in leadership appointments) and a lack of interest in leadership positions. Based on the research findings, some far-reaching recommendations were made, including the urgent need for the formalization of leadership succession planning by the higher education institutions concerned, through the design of an official policy framework.
Keywords: academic leadership, succession, planning, higher education
Procedia PDF Downloads 143
23127 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: ’Reddit’
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native language identification (NLI) is one of the growing subfields of natural language processing (NLP). The task is mainly concerned with predicting the native language of an author writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (social media data from Reddit). In this NLI task, the models are trained on one corpus (TOEFL) and then evaluated on different data from an external corpus (Reddit). Three classifiers are used: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones when tested both within the corpus and across corpora.
Keywords: NLI, NLP, content-based features, content independent features, social media corpus, ML
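A sketch of the cross-corpus protocol, train on one corpus and predict on another, with character n-gram TF-IDF and a linear SVM; the toy sentences stand in for TOEFL and Reddit text, and the feature choice is an assumption.

```python
# Hedged sketch of a cross-corpus NLI pipeline with scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# toy "TOEFL" training side
train_texts = ["I am agree with this opinion", "He explained me the problem",
               "This is a very interesting topic", "We discussed about the plan"]
train_langs = ["es", "fr", "en", "it"]          # authors' native languages (toy labels)
# toy "Reddit" evaluation side
test_texts = ["She suggested me another approach"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # content-independent-ish
    LinearSVC())
model.fit(train_texts, train_langs)
print(model.predict(test_texts))
```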
Procedia PDF Downloads 137
23126 Integration of Internet-Accessible Resources in the Field of Mobile Robots
Authors: B. Madhevan, R. Sakkaravarthi, R. Diya
Abstract:
The number and variety of mobile robot applications are increasing day by day, both in industry and in our daily lives. First developed as tools, mobile robots can nowadays be integrated as entities in Internet-accessible resources (IaR). The present work is organized around four such resources: cloud computing, the Internet of Things, big data analysis and co-simulation. The focus lies on integrating, analyzing and discussing the need to integrate Internet-accessible resources, the challenges deriving from such integration, and how these issues have been tackled. Hence, the research work investigates the concepts of Internet-accessible resources from the perspective of autonomous mobile robots, with an overview of the performance of currently available database systems. IaR, a worldwide network of interconnected objects, can be considered an evolutionary step for mobile robots. IaR constitutes an integral part of the future Internet with data analysis, consisting of both physical and virtual things.
Keywords: internet-accessible resources, cloud computing, big data analysis, internet of things, mobile robot
Procedia PDF Downloads 389
23125 The Application of Lean-Kaizen in Course Plan and Delivery in Malaysian Higher Education Sector
Authors: Nur Aishah Binti Awi, Zulfiqar Khan
Abstract:
Lean-kaizen has long been applied in the manufacturing sector. What about the education sector? This paper discusses how lean-kaizen can also be applied in education, specifically in the academic area of Malaysia's higher education sector. The purpose of this paper is to describe the application of lean-kaizen in course planning and delivery. Lean-kaizen techniques were used to identify waste in course planning and delivery. A field study was conducted to obtain the data, using both quantitative and qualitative sources: the researcher interviewed the chosen lecturers regarding the problems in course planning and delivery that they encountered, and secondary data from students' feedback at the end of the semester were also used to improve course planning and delivery. The results show empirically that lean-kaizen helps to improve course planning and delivery by reducing waste. Thus, this study demonstrates that lean-kaizen can also help the education sector to improve its services, as it has done for the manufacturing sector.
Keywords: course delivery, education, Kaizen, lean
Procedia PDF Downloads 368
23124 An ANN Approach for Detection and Localization of Fatigue Damage in Aircraft Structures
Authors: Reza Rezaeipour Honarmandzad
Abstract:
In this paper, we propose an ANN for the detection and localization of fatigue damage in aircraft structures. We used a network of piezoelectric transducers for Lamb-wave measurements in order to calculate damage indices. Data gathered by the sensors were given to a neural network classifier. A set of neural network electors of different architectures cooperates to achieve consensus concerning the state of each monitored path. Sensed signal variations in the region of interest, detected by the networks at each path, were used to assess the state of the structure, as well as to localize detected damage and to filter out ambient changes. The classifier has been extensively tested on large data sets acquired in tests of specimens with artificially introduced notches, as well as on the results of numerous fatigue experiments. The effect of the classifier structure and of the training data on the results was evaluated.
Keywords: ANN, fatigue damage, aircraft structures, piezoelectric transducers, lamb-wave measurements
Procedia PDF Downloads 417