Search results for: support vector machines application
14486 Fraud Detection in Credit Cards with Machine Learning
Authors: Anjali Chouksey, Riya Nimje, Jahanvi Saraf
Abstract:
Online transactions have increased dramatically in this new ‘social-distancing’ era, and fraud in online payments has grown with them. Fraud is a significant problem in industries such as insurance and banking, and it often involves the leakage of sensitive credit card information that can easily be misused. With governments also pushing online transactions, e-commerce is booming, but rising payment fraud is costing e-commerce companies the trust of their customers, and credit card fraud has become a major problem for them. As more people adopt online payment options, they become easy targets for credit card fraud. In this research paper, we discuss machine learning algorithms for fraud detection. We apply a decision tree, XGBoost, k-nearest neighbour, logistic regression, random forest, and SVM to a dataset of online credit card transactions. We evaluate all of these algorithms using the confusion matrix, the F1 score, and the accuracy score of each model in order to identify which algorithm is best suited to detecting fraud.
Keywords: machine learning, fraud detection, artificial intelligence, decision tree, k nearest neighbour, random forest, XGBoost, logistic regression, support vector machine
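The evaluation loop this abstract describes can be sketched as follows. The snippet is not from the paper: it substitutes a synthetic, class-imbalanced dataset for the credit card transactions, assumes the xgboost package for XGBoost, and uses default hyperparameters throughout.

```python
# Hedged sketch of the model-comparison workflow described in the abstract.
# Synthetic, imbalanced data stand in for the credit card transactions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, f1_score, accuracy_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from xgboost import XGBClassifier  # assumes the xgboost package is installed

X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.97, 0.03], random_state=42)  # 1 = fraud
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "decision tree": DecisionTreeClassifier(random_state=42),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "SVM": SVC(kernel="rbf"),
    "XGBoost": XGBClassifier(eval_metric="logloss"),
}

for name, model in models.items():
    y_pred = model.fit(X_train, y_train).predict(X_test)
    print(name)
    print(confusion_matrix(y_test, y_pred))
    print("F1:", round(f1_score(y_test, y_pred), 3),
          "accuracy:", round(accuracy_score(y_test, y_pred), 3))
```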
Procedia PDF Downloads 146
14485 Security Risks Assessment: A Conceptualization and Extension of NFC Touch-And-Go Application
Authors: Ku Aina Afiqah Ku Adzman, Manmeet Mahinderjit Singh, Zarul Fitri Zaaba
Abstract:
NFC operates at the 13.56 MHz frequency over a short range of about 4 cm to 10 cm, and its applications can be categorized as touch and go, touch and confirm, touch and connect, and touch and explore. NFC applications are vulnerable to various security and privacy attacks because of the technology's physical nature, unprotected data stored in the NFC tag, and insecure communication between applications. This paper aims to determine the likelihood of security risks occurring in NFC technology and its applications. We present an NFC technology taxonomy covering NFC standards, types of application, and the various security and privacy attacks. The risk assessment, based on observations and a survey of the touch-and-go application, identifies two high-risk security attacks, namely data corruption and denial-of-service (DoS) attacks. Once the risks are determined, countermeasures are prioritized using the Analytical Hierarchy Process (AHP). The resulting guidelines and solutions to these two high-risk attacks are then applied to a secure NFC-enabled smartphone attendance system.
Keywords: Near Field Communication (NFC), risk assessment, multi-criteria decision making, Analytical Hierarchy Process (AHP)
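The AHP step used here to rank countermeasures can be illustrated with a short sketch; the pairwise comparison matrix below is invented for illustration and is not data from the study.

```python
# Illustrative AHP calculation: priority weights from a Saaty-scale pairwise
# comparison matrix, plus the consistency ratio used to validate judgements.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # hypothetical comparison of three
              [1/3., 1.0, 3.0],    # countermeasures for the two high-risk
              [1/5., 1/3., 1.0]])  # attacks (data corruption, DoS)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # priority vector

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)     # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
CR = CI / RI                             # consistency ratio; < 0.1 is acceptable

print("priorities:", weights.round(3), " CR:", round(CR, 3))
```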
Procedia PDF Downloads 301
14484 Challenges of the Implementation of Real Time Online Learning in a South African Context
Authors: Thifhuriwi Emmanuel Madzunye, Patricia Harpur, Ephias Ruhode
Abstract:
A review of the pertinent literature identified a gap concerning the hindrances and opportunities accompanying the implementation of real-time online learning systems (RTOLs) in rural areas. Whilst RTOLs present a possible solution to teaching and learning issues in rural areas, little is known about the implementation of digital strategies among schools in isolated communities. This study explores associated guidelines that have the potential to inform decision-making where Internet-based education could improve educational opportunities. A systematic literature review, which can consolidate and focus disparate literature, served to collect interlinked data from specific sources in a structured manner. During qualitative data analysis (QDA) of selected publications via the application of a QDA tool, ATLAS.ti, the following overarching themes emerged: digital divide, educational strategy, human factors, and support. Furthermore, findings from the data collection and literature review suggest that significant limiting factors include a lack of digital knowledge, infrastructure shortcomings such as a shortage of computers, poor internet connectivity, and restricted real-time online access, all of which may limit students’ progress. The study recommends that timeous consideration should be given to the influence of the digital divide. Additionally, the evolution of an educational strategy that adopts digital approaches, a focus on training role-players and stakeholders with respect to human factors, and the pursuit of governmental funding and support are essential to the implementation and success of RTOLs.
Keywords: communication, digital divide, digital skills, distance, educational strategy, government, ICT, infrastructures, learners, Limpopo, lukalo, network, online learning systems, political-unrest, real-time, real-time online learning, real-time online learning system, pass-rate, resources, rural area, school, support, teachers, teaching and learning and training
Procedia PDF Downloads 333
14483 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual efforts, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and faster delivery without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual efforts. Implementing scalable CI/CD for development using cloud services like ECS (Container Management Service), AWS Fargate, ECR (to store Docker images with all dependencies), Serverless Computing (serverless virtual machines), Cloud Log (for monitoring errors and logs), Security Groups (for inside/outside access to the application), Docker Containerization (Docker-based images and container techniques), Jenkins (CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments and are capable of accommodating dynamic workloads, increasing efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure as it scales based on the need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, Serverless Computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions, performance issues, and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs by only paying for the resources as they are used and increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
Procedia PDF Downloads 42
14482 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring the download of videos at peak hours. The system consists of a web server belonging to a provider of video content, a provider of internet communications, and a software application running on a client’s computer. The client, using the application software, communicates to the video provider a list of the client’s future video demands. The video provider calculates which videos are going to be most in demand for download in the immediate future and requests from the internet provider the optimal hours for downloading them. The download times are sent to the application software, which uses the pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos are saved in a special protected section of the user’s hard disk, which is only accessed by the application software on the client’s computer. When the client is ready to watch a video, the application searches the list of videos already stored in that area of the hard disk; if the video exists, it is played directly without the need for internet access. We found that the best way to optimize video download traffic is through negotiation between the internet communication provider and the video content provider.
Keywords: internet optimization, video download, future demands, secure storage
Procedia PDF Downloads 135
14481 Comparison of Web Development Using Framework over Library
Authors: Syamsul Syafiq, Maslina Daud, Hafizah Hasan, Ahmad Zairi, Shazil Imri, Ezaini Akmar, Norbazilah Rahim
Abstract:
Over recent years, web development has changed significantly. Driven largely by the rise of trends such as mobile, the world of development is evolving rapidly. The rise of the Internet makes web applications crucial nowadays. The web application has become an interface for a company and one of the ways it presents its portfolio to clients. On the other hand, the web has become part of the file management system, taking over the role of paper. Due to the high demand for web applications, developers are required to build applications that are cost-effective, secure, and well coded. Frameworks have been proposed for developing applications as an alternative to library-style development; a framework helps the developer by creating the structure of a web application automatically. This paper compares the advantages and disadvantages of web development using a framework against library-style development. The comparison is based on a previous research paper and focuses on two main indicators: the impact on management and the impact on the developer.
Keywords: framework, library style development, web application development, traditional web, static web, dynamic web
Procedia PDF Downloads 222
14480 Application of Granular Computing Paradigm in Knowledge Induction
Authors: Iftikhar U. Sikder
Abstract:
This paper illustrates an application of granular computing approach, namely rough set theory in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinning of rough set theory, which has been widely used by the data mining and the machine learning community. A real-world application is illustrated, and the classification performance is compared with other contending machine learning algorithms. The predictive performance of the rough set rule induction model shows comparative success with respect to other contending algorithms.Keywords: concept approximation, granular computing, reducts, rough set theory, rule induction
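As a rough illustration of the rough-set concept approximation mentioned above, the toy decision table below (not data from the paper) shows how a target concept is bracketed by its lower and upper approximations.

```python
# Minimal sketch of rough-set concept approximation: objects are grouped into
# indiscernibility classes by their condition attributes, and a target concept
# is bracketed by lower/upper approximations. The toy table is illustrative.
from collections import defaultdict

# (object id, condition attributes, decision)
table = [
    (1, ("high", "yes"), "flu"),
    (2, ("high", "yes"), "flu"),
    (3, ("high", "no"),  "flu"),
    (4, ("low",  "no"),  "none"),
    (5, ("high", "no"),  "none"),   # conflicts with object 3 -> boundary region
]

classes = defaultdict(set)                 # indiscernibility classes
for oid, cond, _ in table:
    classes[cond].add(oid)

concept = {oid for oid, _, dec in table if dec == "flu"}

lower = set().union(*[c for c in classes.values() if c <= concept])
upper = set().union(*[c for c in classes.values() if c & concept])

print("lower approximation:", lower)       # certainly in the concept
print("upper approximation:", upper)       # possibly in the concept
print("boundary region:", upper - lower)   # undecidable from these attributes
```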
Procedia PDF Downloads 529
14479 APP-Based Language Teaching Using Mobile Response System in the Classroom
Authors: Martha Wilson
Abstract:
With the peak of Computer-Assisted Language Learning slowly coming to pass and Mobile-Assisted Language Learning, at times, a bit lacking in the communicative department, we are now faced with a challenging question: How can we engage the interest of our digital native students and, most importantly, sustain it? As previously mentioned, our classrooms are now experiencing an influx of “digital natives” – people who have grown up using and having unlimited access to technology. While modernizing our curriculum and digitalizing our classrooms are necessary in order to accommodate this new learning style, it is a huge financial burden and a massive undertaking for language institutes. Instead, opting for a more compact, simple, yet multidimensional pedagogical tool may be the solution to the issue at hand. This paper aims to give a brief overview into an existing device referred to as Student Response Systems (SRS) and to expand on this notion to include a new prototype of response system that will be designed as a mobile application to eliminate the need for costly hardware and software. Additionally, an analysis into recent attempts by other institutes to develop the Mobile Response System (MRS) and customer reviews of the existing MRSs will be provided, as well as the lessons learned from those projects. Finally, while the new model of MRS is still in its infancy stage, this paper will discuss the implications of incorporating such an application as a tool to support and to enrich traditional techniques and also offer practical classroom applications with the existing response systems that are immediately available on the market.Keywords: app, clickers, mobile app, mobile response system, student response system
Procedia PDF Downloads 370
14478 Algorithm for Modelling Land Surface Temperature and Land Cover Classification and Their Interaction
Authors: Jigg Pelayo, Ricardo Villar, Einstine Opiso
Abstract:
The rampant and unintended spread of urban areas has increased the artificial component of land cover in the countryside, bringing forth the urban heat island (UHI). This has opened the way to a wide range of negative influences on human health and the environment, commonly related to air pollution, drought, higher energy demand, and water shortage. Land cover type also plays a relevant role in understanding the interaction between ground surfaces and local temperature. At the moment, the depiction of land surface temperature (LST) at the city/municipality scale, particularly in certain areas of Misamis Oriental, Philippines, is inadequate to support efficient mitigation of, and adaptation to, the surface urban heat island (SUHI). Thus, this study applies Landsat 8 satellite data and low-density Light Detection and Ranging (LiDAR) products to map an automated LST model and a crop-level land cover classification at the local scale, through a theoretical and algorithm-based approach built on data analysis of a multi-dimensional image object model. The paper also aims to explore the relationship between the derived LST and the land cover classification. The results of the presented model showed that comprehensive data analysis and GIS functionalities, integrated with an object-based image analysis (OBIA) approach, can automate complex map production processes with considerable efficiency and high accuracy. The findings may potentially lead to an expanded investigation of the temporal dynamics of the land surface UHI. It is worthwhile to note that the environmental significance of these interactions, studied through the combined application of remote sensing, geographic information tools, mathematical morphology, and data analysis, can provide microclimate perception, awareness, and improved decision-making for land use planning and characterization at the local and neighborhood scale. As a result, it can facilitate problem identification and support mitigation and adaptation more efficiently.
Keywords: LiDAR, OBIA, remote sensing, local scale
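One common way to derive LST from Landsat 8 thermal data, broadly in the spirit of the automated model described here, is the single-channel workflow sketched below. The radiometric constants are the usual band-10 values and would normally be read from the scene metadata; the small arrays and the NDVI-threshold emissivity are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative single-band LST workflow for Landsat 8 (TIRS band 10 plus
# red/NIR bands), assuming the bands are already loaded as NumPy arrays.
import numpy as np

def landsat8_lst(b4_red, b5_nir, b10_dn,
                 ML=3.342e-4, AL=0.1,          # band-10 radiance rescaling
                 K1=774.8853, K2=1321.0789,    # band-10 thermal constants
                 wavelength_um=10.895, rho_um_k=14380.0):
    radiance = ML * b10_dn + AL                        # TOA spectral radiance
    bt = K2 / np.log(K1 / radiance + 1.0) - 273.15     # brightness temp (deg C)

    ndvi = (b5_nir - b4_red) / (b5_nir + b4_red + 1e-9)
    pv = ((ndvi - ndvi.min()) / (ndvi.max() - ndvi.min() + 1e-9)) ** 2
    emissivity = 0.004 * pv + 0.986                    # NDVI-threshold emissivity

    # common single-channel emissivity correction (approximation)
    return bt / (1.0 + (wavelength_um * bt / rho_um_k) * np.log(emissivity))

# toy 2x2 arrays stand in for real band rasters
red = np.array([[0.12, 0.15], [0.20, 0.25]])
nir = np.array([[0.30, 0.35], [0.45, 0.55]])
b10 = np.array([[26000.0, 26500.0], [27000.0, 27500.0]])
print(landsat8_lst(red, nir, b10).round(2))
```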
Procedia PDF Downloads 281
14477 The Artificial Intelligence Technologies Used in PhotoMath Application
Authors: Tala Toonsi, Marah Alagha, Lina Alnowaiser, Hala Rajab
Abstract:
This report is about the Photomath app, an AI application that uses image recognition technology, specifically optical character recognition (OCR) algorithms. The OCR algorithm translates the image into a mathematical equation, and the app automatically provides a step-by-step solution. The application supports decimals, basic arithmetic, fractions, linear equations, and several functions such as logarithms. Testing was conducted to examine the usage of this app, and results were collected by surveying ten participants and subsequently analysed. This paper seeks to answer the question of how accurate the app's artificial intelligence features are and how fast it processes input. It is hoped that this study will inform users about the efficiency of the AI in Photomath.
Keywords: Photomath, image recognition, app, OCR, artificial intelligence, mathematical equations
Procedia PDF Downloads 170
14476 Digital Development of Cultural Heritage: Construction of Traditional Chinese Pattern Database
Authors: Shaojian Li
Abstract:
The traditional Chinese patterns, as an integral part of Chinese culture, possess unique values in history, culture, and art. However, with the passage of time and societal changes, many of these traditional patterns are at risk of being lost, damaged, or forgotten. To undertake the digital preservation and protection of these traditional patterns, this paper will collect and organize images of traditional Chinese patterns. It will provide exhaustive and comprehensive semantic annotations, creating a resource library of traditional Chinese pattern images. This will support the digital preservation and application of traditional Chinese patterns.Keywords: digitization of cultural heritage, traditional Chinese patterns, digital humanities, database construction
Procedia PDF Downloads 58
14475 Comparison of Catalyst Support for High Pressure Reductive Amination
Authors: Tz-Bang Du, Cheng-Han Hsieh, Li-Ping Ju, Hung-Jie Liou
Abstract:
Polyether amines synthesized from secondary hydroxyl polyether diols play an important role in epoxy hardeners. The low-molecular-weight products are used in low-viscosity, highly transparent polyamine products for logos and ground coverings, and especially for wind turbine blades, while the high-molecular-weight products are used in advanced applications such as high-speed railways. A high-pressure reductive amination process is required to produce these amines: at pressures above 150 atm and temperatures above 200 degrees Celsius, supercritical ammonia is used both as a reactant and as a solvent. Selecting a catalyst support for such a high-temperature alkaline environment is a great challenge. In this study, we have established a six-autoclave-type (SAT) high-pressure reactor for amination catalyst screening, in which six experimental conditions with different temperatures and pressures can be examined at the same time. We synthesized copper-nickel catalysts on alumina supports of different shapes and evaluated their activity for the high-pressure reductive amination of polypropylene glycol (PPG) in the SAT reactor. Ball-type gamma alumina, ball-type activated alumina, and pellet-type gamma alumina catalyst supports were evaluated. The gamma alumina supports showed better activity in PPG reductive amination than the activated alumina support. In addition, the catalysts were evaluated in a fixed-bed reactor. The diamine product was successfully synthesized over this catalyst, and the strength of the catalysts was measured. The crush strength of the blank supports is about 13.5 lb for both gamma alumina and activated alumina, and it increases to 20.3 lb after the copper-nickel catalyst is synthesized on them. After 100 hours of testing in the fixed-bed high-pressure reductive amination process, the crush strength of the used catalyst is 3.7 lb for the activated alumina support and 12.0 lb for the gamma alumina support. Gamma alumina is therefore better than activated alumina as a catalyst support for the high-pressure reductive amination process.
Keywords: high pressure reductive amination, copper nickel catalyst, polyether amine, alumina
Procedia PDF Downloads 228
14474 Forecasting Regional Data Using Spatial Vars
Authors: Taisiia Gorshkova
Abstract:
Since the 1980s, spatial correlation models have been used more often to model regional indicators. An increasingly popular method for studying regional indicators is modeling that takes into account spatial relationships between objects belonging to the same economic zone. In the 2000s, a new class of models, spatial vector autoregressions, was developed. The main difference between standard and spatial vector autoregressions is that in the spatial VAR (SpVAR), the values of indicators at time t may depend on the values of explanatory variables at the same time t in neighboring regions and on the values of explanatory variables at time t-k in neighboring regions. Thus, the VAR is a special case of the SpVAR in the absence of spatial lags, and the spatial panel data model is a special case of the spatial VAR in the absence of time lags. Two specifications of the SpVAR were applied to Russian regional data for 2000-2017. The values of GRP and the regional CPI are used as endogenous variables. The lags of GRP, CPI, and the unemployment rate were used as explanatory variables. For comparison purposes, a standard VAR without spatial correlation was used as the “naïve” model. In the first specification of the SpVAR, the unemployment rate and the values of the dependent variables, GRP and CPI, in neighboring regions at the same moment of time t were included in the equations for GRP and CPI, respectively. To account for the values of indicators in neighboring regions, an adjacency weight matrix is used, in which regions with a common sea or land border are assigned a value of 1 and the rest a value of 0. In the second specification, the values of the dependent variables in neighboring regions at time t were replaced by their values at the previous time moment t-1. According to the results obtained, when the inflation and GRP of neighbors are added to the model, both inflation and GRP are significantly affected by their previous values, and inflation is also positively affected by an increase in unemployment in the previous period and negatively affected by an increase in GRP in the previous period, which corresponds to economic theory. GRP is not affected by either the inflation lag or the unemployment lag. When the model takes into account lagged values of GRP and inflation in neighboring regions, the results of the inflation modeling are practically unchanged: all indicators except the unemployment lag are significant at the 5% significance level. For GRP, in turn, the GRP lags in neighboring regions also become significant at the 5% significance level. For both the spatial and “naïve” VARs, the RMSE was calculated, and the minimum RMSE is obtained with the SpVAR that uses lagged explanatory variables. Thus, according to the results of the study, it can be concluded that SpVARs can accurately model both the actual values of macro indicators (particularly CPI and GRP) and the general situation in the regions.
Keywords: forecasting, regional data, spatial econometrics, vector autoregression
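A toy sketch of the second SpVAR specification (lagged own values plus lagged, adjacency-weighted neighbour values) is given below; the data, region count, and adjacency matrix are simulated and do not come from the Russian dataset used in the study.

```python
# Toy SpVAR equation for GRP: own lags, the lagged unemployment rate, and the
# first lag of the row-standardised spatial lags of GRP and CPI, estimated by OLS.
import numpy as np

rng = np.random.default_rng(0)
T, R = 18, 5                                 # years x regions (toy sizes)
grp, cpi, unemp = rng.normal(size=(3, T, R))

W = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)            # row-standardised adjacency matrix

def spatial_lag(x):                          # average of neighbours, per year
    return x @ W.T

def ols(y, X):
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# stack region-year observations for the GRP equation
y = grp[1:].ravel()
X = np.column_stack([
    grp[:-1].ravel(),                        # own GRP lag
    cpi[:-1].ravel(),                        # own CPI lag
    unemp[:-1].ravel(),                      # unemployment lag
    spatial_lag(grp)[:-1].ravel(),           # neighbours' GRP, lagged
    spatial_lag(cpi)[:-1].ravel(),           # neighbours' CPI, lagged
])
print("GRP equation coefficients:", ols(y, X).round(3))
```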
Procedia PDF Downloads 141
14473 Softening Finishing: Teaching and Learning Materials
Authors: C.W. Kan
Abstract:
Softening is applied to textile products for several reasons. First, synthetic detergents remove natural oils and waxes, so the fabric loses its softness. Second, softening compensates for the harsh handle produced by resin finishing. Softening is also applied to imitate natural fibres and to improve the comfort of the fabric. There are different types of softeners for the softening finishing of textiles: nonionic, anionic, cationic, and silicone softeners. The aim of this study is to illustrate the proper application of the different softeners and their final softening effect on textiles. The results could also provide guidance notes for students learning this topic. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.
Keywords: learning materials, softening, textiles, effect
Procedia PDF Downloads 216
14472 Determinants of Economic Growth in Pakistan: A Structural Vector Auto Regression Approach
Authors: Muhammad Ajmair
Abstract:
This empirical study followed the structural vector autoregression (SVAR) approach of the so-called AB-model of Amisano and Giannini (1997) to examine the impact of relevant macroeconomic determinants on economic growth in Pakistan. Before that, the autoregressive distributed lag (ARDL) bounds testing technique and a time-varying parametric approach, along with a general-to-specific approach, were employed to identify the relevant significant determinants of economic growth. To the best of our knowledge, no previous study in the empirical literature has combined ARDL bounds testing and a time-varying parametric approach with a general-to-specific approach; the current study bridges this gap. Annual data were taken from the World Development Indicators (2014) for the period 1976-2014. The widely used Schwarz information criterion and Akaike information criterion were considered for the lag length in each estimated equation. The main finding of the study is that remittances received, gross national expenditures, and inflation are the most relevant positive and significant determinants of economic growth. Based on these empirical findings, we conclude that the government should focus on these overall growth-augmenting factors when formulating any policy relevant to the sectors concerned.
Keywords: economic growth, gross national expenditures, inflation, remittances
Procedia PDF Downloads 197
14471 Spatiotemporal Analysis of Visual Evoked Responses Using Dense EEG
Authors: Rima Hleiss, Elie Bitar, Mahmoud Hassan, Mohamad Khalil
Abstract:
A comprehensive study of object recognition in the human brain requires combining both spatial and temporal analysis of brain activity. Here, we are mainly interested in three issues: the time perception of visual objects, the ability of discrimination between two particular categories (objects vs. animals), and the possibility to identify a particular spatial representation of visual objects. Our experiment consisted of acquiring dense electroencephalographic (EEG) signals during a picture-naming task comprising a set of objects and animals’ images. These EEG responses were recorded from nine participants. In order to determine the time perception of the presented visual stimulus, we analyzed the Event Related Potentials (ERPs) derived from the recorded EEG signals. The analysis of these signals showed that the brain perceives animals and objects with different time instants. Concerning the discrimination of the two categories, the support vector machine (SVM) was applied on the instantaneous EEG (excellent temporal resolution: on the order of millisecond) to categorize the visual stimuli into two different classes. The spatial differences between the evoked responses of the two categories were also investigated. The results showed a variation of the neural activity with the properties of the visual input. Results showed also the existence of a spatial pattern of electrodes over particular regions of the scalp in correspondence to their responses to the visual inputs.Keywords: brain activity, categorization, dense EEG, evoked responses, spatio-temporal analysis, SVM, time perception
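The time-resolved SVM decoding of the two stimulus categories can be sketched as follows; the simulated trials, channel count, and injected category effect are assumptions standing in for the dense-EEG recordings.

```python
# Sketch of per-timepoint decoding: a linear SVM is trained at every sample of
# the instantaneous EEG to separate the two categories (objects vs. animals).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_channels, n_times = 120, 64, 100
X = rng.normal(size=(n_trials, n_channels, n_times))
y = np.repeat([0, 1], n_trials // 2)          # 0 = object, 1 = animal
X[y == 1, :, 40:60] += 0.4                    # inject a post-stimulus category effect

accuracy = np.empty(n_times)
for t in range(n_times):
    clf = SVC(kernel="linear")
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

print("peak decoding accuracy %.2f at sample %d" % (accuracy.max(), accuracy.argmax()))
```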
Procedia PDF Downloads 421
14470 Foundation Phase Teachers' Experiences of School Based Support Teams: A Case of Selected Schools in Johannesburg
Authors: Ambeck Celyne Tebid, Harry S. Rampa
Abstract:
The South African Education system recognises the need for all learners including those experiencing learning difficulties, to have access to a single unified system of education. For teachers to be pedagogically responsive to an increasingly diverse learner population without appropriate support has been proven to be unrealistic. As such, this has considerably hampered interest amongst teachers, especially those at the foundation phase to work within an Inclusive Education (IE) and training system. This qualitative study aimed at investigating foundation phase teachers’ experiences of school-based support teams (SBSTs) in two Full-Service (inclusive schools) and one Mainstream public primary school in the Gauteng province of South Africa; with particular emphasis on finding ways to supporting them, since teachers claimed they were not empowered in their initial training to teach learners experiencing learning difficulties. Hence, SBSTs were created at school levels to fill this gap thereby, supporting teaching and learning by identifying and addressing learners’, teachers’ and schools’ needs. With the notion that IE may be failing because of systemic reasons, this study uses Bronfenbrenner’s (1979) ecosystemic as well as Piaget’s (1980) maturational theory to examine the nature of support and experiences amongst teachers taking individual and systemic factors into consideration. Data was collected using in-depth, face-to-face interviews, document analysis and observation with 6 foundation phase teachers drawn from 3 different schools, 3 SBST coordinators, and 3 school principals. Data was analysed using the phenomenological data analysis method. Amongst the findings of the study is that South African full- service and mainstream schools have functional SBSTs which render formal and informal support to the teachers; this support varies in quality depending on the socio-economic status of the relevant community where the schools are situated. This paper, however, argues that what foundation phase teachers settled for as ‘support’ is flawed; as well as how they perceive the SBST and its role is problematic. The paper conclude by recommending that, the SBST should consider other approaches at foundation phase teacher support such as, empowering teachers with continuous practical experiences on how to deal with real classroom scenarios, as well as ensuring that all support, be it on academic or non-academic issues should be provided within a learning community framework where the teacher, family, SBST and where necessary, community organisations should harness their skills towards a common goal.Keywords: foundation phase, full- service schools, inclusive education, learning difficulties, school-based support teams, teacher support
Procedia PDF Downloads 234
14469 Cable De-Commissioning of Legacy Accelerators at CERN
Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson
Abstract:
CERN is an international organisation funded by 23 countries that provides the particle physics community with excellence in particle accelerators and other related facilities. Founded in 1954, CERN operates a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high levels of energy and make them collide with each other or with fixed targets, creating specific conditions that are of high interest to physicists. A chain of accelerators is used to ramp up the energy of the particles and eventually inject them into the largest and most recent one: the Large Hadron Collider (LHC). Among this chain of machines is, for instance, the Proton Synchrotron, which was started in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started at CERN in 2015 in the framework of the so-called "injectors de-cabling project phase 1". Its goal was to replace aging cables and remove unused ones, freeing space for new cables necessary for upgrades and consolidation campaigns. To proceed with the de-cabling, a project coordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation; the identification of cables stacked over half a century proved to be arduous. Phase 1 of the injectors de-cabling was implemented over three years and completed successfully after some difficulties were overcome. Phase 2, started three years later, focused on improving safety and structure with the introduction of a quality assurance procedure. This paper discusses the implementation of this quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Hundreds of kilometres of cable were removed from the injector complex at CERN between 2015 and 2023.
Keywords: CERN, de-cabling, injectors, quality assurance procedure
Procedia PDF Downloads 91
14468 Flexural Performance of the Sandwich Structures Having Aluminum Foam Core with Different Thicknesses
Authors: Emre Kara, Ahmet Fatih Geylan, Kadir Koç, Şura Karakuzu, Metehan Demir, Halil Aykul
Abstract:
The structures obtained with the use of sandwich technologies combine low weight with high energy absorbing capacity and load carrying capacity. Hence, there is a growing and markedly interest in the use of sandwiches with aluminium foam core because of very good properties such as flexural rigidity and energy absorption capability. The static (bending and penetration) and dynamic (dynamic bending and low velocity impact) tests were already performed on the aluminum foam cored sandwiches with different types of outer skins by some of the authors. In the current investigation, the static three-point bending tests were carried out on the sandwiches with aluminum foam core and glass fiber reinforced polymer (GFRP) skins at different values of support span distances (L= 55, 70, 80, 125 mm) aiming the analyses of their flexural performance. The influence of the core thickness and the GFRP skin type was reported in terms of peak load, energy absorption capacity and energy efficiency. For this purpose, the skins with two different types of fabrics ([0°/90°] cross ply E-Glass Woven and [0°/90°] cross ply S-Glass Woven which have same thickness value of 1.5 mm) and the aluminum foam core with two different thicknesses (h=10 and 15 mm) were bonded with a commercial polyurethane based flexible adhesive in order to combine the composite sandwich panels. The GFRP skins fabricated via Vacuum Assisted Resin Transfer Molding (VARTM) technique used in the study can be easily bonded to the aluminum foam core and it is possible to configure the base materials (skin, adhesive and core), fiber angle orientation and number of layers for a specific application. The main results of the bending tests are: force-displacement curves, peak force values, absorbed energy, energy efficiency, collapse mechanisms and the effect of the support span length and core thickness. The results of the experimental study showed that the sandwich with the skins made of S-Glass Woven fabrics and with the thicker foam core presented higher mechanical values such as load carrying and energy absorption capacities. The increment of the support span distance generated the decrease of the mechanical values for each type of panels, as expected, because of the inverse proportion between the force and span length. The most common failure types of the sandwiches are debonding of the upper or lower skin and the core shear. The obtained results have particular importance for applications that require lightweight structures with a high capacity of energy dissipation, such as the transport industry (automotive, aerospace, shipbuilding and marine industry), where the problems of collision and crash have increased in the last years.Keywords: aluminum foam, composite panel, flexure, transport application
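The quantities reported from the three-point bending curves (peak load, absorbed energy, energy efficiency) can be computed from a force-displacement record as in the sketch below; the synthetic curve and the efficiency definition (absorbed energy relative to an ideal absorber holding the peak force over the same stroke) are assumptions rather than the paper's exact procedure.

```python
# Sketch of extracting peak load, absorbed energy, and energy efficiency from
# a measured three-point bending force-displacement curve (synthetic data).
import numpy as np

displacement = np.linspace(0.0, 12.0, 200)                  # mm
force = 1.8 * displacement * np.exp(-displacement / 5.0)    # kN, toy curve

peak_force = force.max()
# trapezoidal integral of F dx; 1 kN*mm = 1 J
absorbed_energy = np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(displacement))
energy_efficiency = absorbed_energy / (peak_force * displacement[-1])

print("peak load %.2f kN, energy %.1f J, efficiency %.2f"
      % (peak_force, absorbed_energy, energy_efficiency))
```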
Procedia PDF Downloads 337
14467 Empirical and Indian Automotive Equity Portfolio Decision Support
Authors: P. Sankar, P. James Daniel Paul, Siddhant Sahu
Abstract:
A brief review of the empirical studies on stock market decision support methodology indicates that the field is at the threshold of validating the accuracy of traditional models against fuzzy, artificial neural network, and decision tree models. Many researchers have been attempting to compare these models using various data sets worldwide; however, the research community has yet to reach conclusive confidence in the models that have emerged. This paper uses automotive-sector stock prices from the National Stock Exchange (NSE), India, and analyses them for intra-sectoral support of stock market decisions. The study identifies, using OLS analysis and decision tree classifiers, the significant variables and their lags that affect the price of the stocks.
Keywords: Indian automotive sector, stock market decisions, equity portfolio analysis, decision tree classifiers, statistical data analysis
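A minimal sketch of the lag-based analysis described above is given below, with synthetic prices standing in for NSE automotive-sector data; the lag choices, label construction, and model settings are assumptions.

```python
# Build lagged price features, check their significance with OLS, and fit a
# shallow decision tree for the next-day up/down move (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
price = pd.Series(100 + np.cumsum(rng.normal(0, 1, 500)), name="close")

df = pd.DataFrame({"close": price})
for lag in (1, 2, 3):
    df[f"lag{lag}"] = df["close"].shift(lag)
df["up_next"] = (df["close"].shift(-1) > df["close"]).astype(int)
df = df.iloc[3:-1]                      # drop rows without full lags or a label

X, y = df[["lag1", "lag2", "lag3"]], df["up_next"]

ols = sm.OLS(df["close"], sm.add_constant(X)).fit()   # which lags matter?
print(ols.pvalues.round(3))

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print("in-sample accuracy:", round(tree.score(X, y), 3))
```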
Procedia PDF Downloads 484
14466 Diversity Indices as a Tool for Evaluating Quality of Water Ways
Authors: Khadra Ahmed, Khaled Kheireldin
Abstract:
In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of local phase information with texture features. Since the phase of the signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless nature of phase congruency and the robustness of the CSLBP operator on flat images, as well as under blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The proposed descriptor is formed by extracting the phase congruency and the CSLBP values of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for the local regions in the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and the low-resolution DaimlerChrysler datasets to evaluate the detection performance of the pedestrian detection system based on the FST descriptor. A linear Support Vector Machine (SVM) is used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
Keywords: planktons, diversity indices, water quality index, water ways
Procedia PDF Downloads 516
14465 About the Case Portfolio Management Algorithms and Their Applications
Authors: M. Chumburidze, N. Salia, T. Namchevadze
Abstract:
This work deals with case processing problems in business. The task of strategic credit requirements management for a portfolio of cases is discussed. An information model of credit requirements in a binary tree diagram is considered. Algorithms to solve the problem of prioritizing clusters of cases in business have been investigated. An implementation of priority queues to support case management operations is presented, and the corresponding pseudocode for the programming application has been constructed. The tools applied in this development are based on binary tree ordering algorithms, optimization theory, and business management methods.
Keywords: credit network, case portfolio, binary tree, priority queue, stack
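A minimal sketch of a priority queue supporting case-portfolio operations like those discussed above is given below; the class, field names, and priority scores are hypothetical.

```python
# Heap-backed priority queue: cases are pushed with a priority score and
# always popped in order of decreasing priority.
import heapq
import itertools

class CasePortfolio:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()     # tie-breaker for equal priorities

    def push(self, case_id, priority):
        # heapq is a min-heap, so negate priority to pop the most urgent first
        heapq.heappush(self._heap, (-priority, next(self._counter), case_id))

    def pop(self):
        _, _, case_id = heapq.heappop(self._heap)
        return case_id

portfolio = CasePortfolio()
portfolio.push("case-17", priority=0.9)   # e.g. high credit exposure
portfolio.push("case-03", priority=0.4)
portfolio.push("case-42", priority=0.7)
print(portfolio.pop(), portfolio.pop())   # case-17, then case-42
```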
Procedia PDF Downloads 148
14464 Intelligent Recognition of Diabetes Disease via FCM Based Attribute Weighting
Authors: Kemal Polat
Abstract:
In this paper, an attribute weighting method called fuzzy C-means clustering based attribute weighting (FCMAW) for classification of Diabetes disease dataset has been used. The aims of this study are to reduce the variance within attributes of diabetes dataset and to improve the classification accuracy of classifier algorithm transforming from non-linear separable datasets to linearly separable datasets. Pima Indians Diabetes dataset has two classes including normal subjects (500 instances) and diabetes subjects (268 instances). Fuzzy C-means clustering is an improved version of K-means clustering method and is one of most used clustering methods in data mining and machine learning applications. In this study, as the first stage, fuzzy C-means clustering process has been used for finding the centers of attributes in Pima Indians diabetes dataset and then weighted the dataset according to the ratios of the means of attributes to centers of theirs. Secondly, after weighting process, the classifier algorithms including support vector machine (SVM) and k-NN (k- nearest neighbor) classifiers have been used for classifying weighted Pima Indians diabetes dataset. Experimental results show that the proposed attribute weighting method (FCMAW) has obtained very promising results in the classification of Pima Indians diabetes dataset.Keywords: fuzzy C-means clustering, fuzzy C-means clustering based attribute weighting, Pima Indians diabetes, SVM
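Under one possible reading of the FCMAW method (cluster each attribute with fuzzy c-means and rescale it by the ratio of its mean to the mean of its cluster centres), the weighting and subsequent SVM/k-NN classification can be sketched as below; synthetic data replace the Pima Indians Diabetes dataset, and this interpretation of the weighting rule is an assumption.

```python
# Fuzzy c-means based attribute weighting followed by SVM and k-NN, as a sketch.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def fcm_centers_1d(x, c=2, m=2.0, iters=100, seed=0):
    """Cluster a single attribute with fuzzy c-means and return the centres."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)             # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)) * (1.0 / d ** (2 / (m - 1))).sum(axis=1, keepdims=True))
    return centers

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X = X - X.min(axis=0) + 1.0                        # shift to an all-positive range

weights = np.array([X[:, j].mean() / fcm_centers_1d(X[:, j]).mean()
                    for j in range(X.shape[1])])
Xw = X * weights                                   # FCM-weighted attributes

for clf in (SVC(kernel="rbf"), KNeighborsClassifier(n_neighbors=5)):
    print(type(clf).__name__, round(cross_val_score(clf, Xw, y, cv=5).mean(), 3))
```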
Procedia PDF Downloads 412
14463 AI-Enhanced Self-Regulated Learning: Proposing a Comprehensive Model with 'Studium' to Meet a Student-Centric Perspective
Authors: Smita Singh
Abstract:
Objective: The Faculty of Chemistry Education at Humboldt University has developed ‘Studium’, a web application designed to enhance long-term self-regulated learning (SRL) and academic achievement. Leveraging advanced generative AI, ‘Studium’ offers a dynamic and adaptive educational experience tailored to individual learning preferences and languages. The application includes evolving tools for personalized notetaking from preferred sources, customizable presentation capabilities, and AI-assisted guidance from academic documents or textbooks. It also features workflow automation and seamless integration with collaborative platforms like Miro, powered by AI. This study aims to propose a model that combines generative AI with traditional features and customization options, empowering students to create personalized learning environments that effectively address the challenges of SRL. Method: The study included graduate and undergraduate students from diverse subject streams, with 15 participants each from Germany and India, ensuring a diverse educational background. An exploratory design was employed using a speed-dating method with enactment, in which different scenario sessions allowed participants to experience various features of ‘Studium’. Each session lasted 50 minutes, providing an in-depth exploration of the platform's capabilities. Participants interacted with Studium’s features via Zoom conferencing and then took part in semi-structured interviews lasting 10-15 minutes to provide deeper insights into the effectiveness of ‘Studium’. Additionally, online questionnaire surveys were conducted before and after the session to gather feedback and evaluate satisfaction with self-regulated learning (SRL) after using ‘Studium’; the response rate of this survey was 100%. Results: The findings indicate that students widely acknowledged the positive impact of ‘Studium’ on their learning experience, particularly its adaptability and intuitive design, and expressed a desire for more tools like ‘Studium’ to support self-regulated learning in the future. The application significantly fostered students' independence in organizing information and planning study workflows, which in turn enhanced their confidence in mastering complex concepts. Additionally, ‘Studium’ promoted strategic decision-making and helped students overcome various learning challenges, reinforcing their self-regulation, organization, and motivation skills. Conclusion: This proposed model emphasizes the need for effective integration of personalized AI tools into active learning and SRL environments. By addressing key research questions, our framework aims to demonstrate how AI-assisted platforms like ‘Studium’ can facilitate deeper understanding, maintain student motivation, and support the achievement of academic goals. Thus, our ideal model for AI-assisted educational platforms provides a strategic approach to enhancing students' learning experiences and promoting their development as self-regulated learners.
Keywords: self-regulated learning (SRL), generative AI, AI-assisted educational platforms
Procedia PDF Downloads 28
14462 Application of Ground Penetrating Radar and Light Falling Weight Deflectometer in Ballast Quality Assessment
Authors: S. Cafiso, B. Capace, A. Di Graziano, C. D’Agostino
Abstract:
Systematic monitoring of the trackbed is necessary to assure safety and quality of service in the railway system. Moreover, for effective management of maintenance treatments, the assessment of the bearing capacity of the railway trackbed must include the ballast, sub-ballast, and subgrade layers at different depths. Consequently, there is increasing interest in obtaining a consistent measure of ballast bearing capacity with non-destructive tests (NDTs) that can work within the physical and time restrictions of railway tracks in operation. Moreover, in the case of local narrow-gauge railways, the use of traditional high-speed track monitoring systems is not feasible. In this framework, this paper presents results from an on-site investigation carried out on ballast and sleepers with Ground Penetrating Radar (GPR) and the Light Falling Weight Deflectometer (LWD). Both instruments are currently used in road pavement maintenance, where they have shown their reliability and effectiveness. The application of such non-destructive tests in railway maintenance is promising but still at an early stage of investigation. More specifically, the LWD was used to estimate the stiffness of the ballast and of the sleeper support. Despite the limited load (6 kN in the trial test) applied directly to the sleeper, the LWD was able to detect defects in the bearing capacity at the sleeper/ballast interface. A dual-frequency GPR was used to detect layer discontinuities at different depths caused by fouling, which is the main cause of changes in the dielectric properties within the ballast thickness. The 2000 MHz antenna provided high-resolution data to a depth of approximately 0.4 m, while the 600 MHz antenna showed greater penetration, up to 1.5 m. In the paper, the literature review and the on-site trial experience are used to identify the Strengths, Weaknesses, Opportunities, and Threats (SWOT analysis) of applying GPR and LWD to the assessment of the bearing capacity of the railway trackbed.
Keywords: bearing capacity, GPR, LWD, non-destructive test, railway track
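Two textbook relations behind the equipment described above can illustrate how the raw measurements become engineering quantities; the numbers, the flexible-plate factor of 2, and the Poisson ratio are assumptions, not values from the trial.

```python
# GPR: layer depth from two-way travel time and relative permittivity.
# LWD: surface modulus from peak plate stress and centre deflection
# (Boussinesq-type relation, flexible-plate factor assumed to be 2).
C = 299_792_458.0                       # speed of light in vacuum, m/s

def gpr_depth(two_way_time_ns, eps_r):
    v = C / eps_r ** 0.5                # wave velocity in the layer
    return v * (two_way_time_ns * 1e-9) / 2.0

def lwd_modulus(peak_stress_kpa, plate_radius_m, deflection_mm, poisson=0.35):
    sigma = peak_stress_kpa * 1e3       # Pa
    d = deflection_mm * 1e-3            # m
    return 2.0 * (1 - poisson ** 2) * sigma * plate_radius_m / d / 1e6  # MPa

print("layer depth  ~ %.2f m" % gpr_depth(two_way_time_ns=8.0, eps_r=5.0))
print("LWD modulus  ~ %.0f MPa" % lwd_modulus(peak_stress_kpa=120,
                                              plate_radius_m=0.15,
                                              deflection_mm=0.4))
```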
Procedia PDF Downloads 126
14461 The Logistics Equation and Fractal Dimension in Escalators Operations
Authors: Ali Albadri
Abstract:
The logistics equation has never been used or studied in scientific fields outside ecology. It has never been used to understand the behavior of a dynamic system of mechanical machines, like an escalator. We have studied the compatibility of the logistic map against real measurements from an escalator, and this study has proven that there is good compatibility between the logistics equation and the experimental measurements. It has also revealed a potential relationship between the fractal dimension and the non-linearity parameter, R, in the logistics equation: the fractal dimension increases as the R parameter (non-linear parameter) increases. This implies that the fractal dimension increases as the machine's life span moves from the steady/stable phase through the period-doubling phase to a chaotic phase. The fractal dimension and the parameter R can therefore be used as tools to verify and check the health of machines. We propose that the behavior during the life span of a machine can be classified into three stages: a steady/stable stage, a period-doubling stage, and a chaotic stage. The level of attention the machine requires differs depending on the stage it is in, and the rate of faults increases as the machine moves through these three stages. During the period-doubling and chaotic stages, the number of faults starts to increase and becomes less predictable. Predictability improves as our monitoring of the changes in the fractal dimension and the parameter R improves. The principles and foundations of our theory have, and will continue to have, a profound impact on the design of systems, on the way systems are operated, and on their maintenance schedules. The systems can be mechanical, electrical, or electronic. The methodology discussed in this paper gives businesses the chance to be more careful at the design stage and in maintenance planning in order to control costs. The findings in this paper can be applied to correlate the three stages of a mechanical system with more in-depth mechanical parameters such as wear and fatigue life.
Keywords: logistic map, bifurcation map, fractal dimension, logistics equation
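The behaviour the paper builds on, the logistic map moving from a fixed point through period doubling to chaos as the non-linearity parameter grows, can be reproduced with a few lines; the parameter values below are illustrative.

```python
# Iterate x_{n+1} = r * x_n * (1 - x_n) and inspect the long-run attractor:
# a single value (steady), a short cycle (periodic), or many values (chaotic).
def logistic_attractor(r, x0=0.4, transient=500, keep=64):
    x = x0
    for _ in range(transient):          # discard the transient
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return sorted(set(orbit))

for r in (2.8, 3.2, 3.5, 3.9):
    attractor = logistic_attractor(r)
    label = ("steady" if len(attractor) == 1
             else "periodic" if len(attractor) <= 8
             else "chaotic")
    print(f"r = {r}: {label}, {len(attractor)} distinct long-run values")
```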
Procedia PDF Downloads 106
14460 Evaluation of Clinical Decision Support System in Electronic Medical Record System: A Case of Malawi National Art Electronic Medical Record System
Authors: Pachawo Bisani, Goodall Nyirenda
Abstract:
The Malawi National Antiretroviral Therapy (NART) Electronic Medical Record (EMR) system was designed and developed with guidance from the Ministry of Health through the Department of HIV and AIDS (DHA) with the aim of supporting the management of HIV patient data and reporting in high prevalence ART clinics. As of 2021, the system has been scaled up to over 206 facilities across the country. The system is integrated with the clinical decision support system (CDSS) to assist healthcare providers in making a decision about an individual patient at a particular point in time. Despite NART EMR undergoing several evaluations and assessments, little has been done to evaluate the clinical decision support system in the NART EMR system. Hence, the study aimed to evaluate the use of CDSS in the NART EMR system in Malawi. The study adopted a mixed-method approach, and data was collected through interviews, observations, and questionnaires. The study has revealed that the CDSS tools were integrated into the ART clinic workflow, making it easy for the user to use it. The study has also revealed challenges in system reliability and information accuracy. Despite the challenges, the study further revealed that the system is effective and efficient, and overall, users are satisfied with the system. The study recommends that the implementers focus more on the logic behind the clinical decision-support intervention in order to address some of the concerns and enhance the accuracy of the information supplied. The study further suggests consulting the system's actual users throughout implementation.Keywords: clinical decision support system, electronic medical record system, usability, antiretroviral therapy
Procedia PDF Downloads 98
14459 Major Depressive Disorder: Diagnosis based on Electroencephalogram Analysis
Authors: Wajid Mumtaz, Aamir Saeed Malik, Syed Saad Azhar Ali, Mohd Azhar Mohd Yasin
Abstract:
In this paper, a technique based on electroencephalogram (EEG) analysis is presented, aimed at diagnosing major depressive disorder (MDD) in a population of MDD patients and healthy controls. EEG is a recognized clinical modality in applications such as seizure diagnosis, anaesthesia monitoring, and the detection of brain death or stroke. However, its usability for psychiatric illnesses such as MDD is less studied. Therefore, in this study, two groups of participants were recruited: 1) MDD patients and 2) healthy controls. EEG data acquired from both groups were analysed in terms of inter-hemispheric asymmetry and the composite permutation entropy index (CPEI). To automate the process, quantities derived from the EEG were used as inputs to classifiers such as logistic regression (LR) and the support vector machine (SVM). These classification models were evaluated on a test dataset, and their performance is reported as the accuracy of classifying MDD patients versus controls, together with the corresponding sensitivities and specificities (LR = 81.7% and SVM = 81.5%). Based on the results, it is concluded that the derived measures are useful indicators for distinguishing MDD patients from normal controls. In addition, the results motivate the exploration of further measures for the same purpose.
Keywords: major depressive disorder, diagnosis based on EEG, EEG derived features, CPEI, inter-hemispheric asymmetry
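The permutation entropy underlying the composite permutation entropy index (CPEI) can be sketched as below; the simulated signals, embedding order, and delay are assumptions, and the CPEI itself combines such entropies in a way not reproduced here.

```python
# Bandt-Pompe permutation entropy: count ordinal patterns of length `order`
# along the signal and return their normalised Shannon entropy.
import numpy as np
from math import factorial, log

def permutation_entropy(signal, order=3, delay=1):
    n = len(signal) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = signal[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))          # ordinal pattern
        patterns[pattern] = patterns.get(pattern, 0) + 1
    p = np.array(list(patterns.values()), dtype=float) / n
    return -(p * np.log(p)).sum() / log(factorial(order))  # normalised to [0, 1]

rng = np.random.default_rng(3)
noisy = rng.normal(size=2000)                        # broadband, high complexity
regular = np.sin(np.linspace(0, 40 * np.pi, 2000))   # rhythmic, low complexity
print("PE noise: %.2f  PE sine: %.2f"
      % (permutation_entropy(noisy), permutation_entropy(regular)))
```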
Procedia PDF Downloads 544
14458 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments
Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein
Abstract:
Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impact on human engagement, ‘immersion’, and related emotional or ‘affective’ states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR, and heart rate of 30 male and female gamers, each exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows with 28 different lengths (e.g., 2, 3, 5 seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined in a three-dimensional valence, arousal, and dominance space) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process based on affective clusters or emotion labels.
Keywords: virtual reality, affective computing, affective VR, emotion-based affective physiological database
Procedia PDF Downloads 231
14457 Design and Construction of Temperature and Humidity Control Channel for a Bacteriological Incubator
Authors: Carlos R. Duharte Rodríguez, Ibrain Ceballo Acosta, Carmen B. Busoch Morlán, Angel Regueiro Gómez, Annet Martinez Hernández
Abstract:
This work presents the design and characterization of a prototype laboratory incubator to support research in microbiology, in particular studies of bacterial growth in biological samples, using optical methods (turbidimetry) and electrometric measurements of bioimpedance. It presents simulation and experimental results for the proposed measurement channels for the temperature and humidity variables, which achieve high linearity through an appropriate selection of sensors and analogue components in each channel, controlled by an ATMEL AT89C51 microcontroller that offers adequate performance for this type of application.
Keywords: microbiology, bacterial growth, incubation station, microorganisms
Procedia PDF Downloads 398