Search results for: enhanced data encryption
26460 Electrochemotherapy of Portal Vein Tumor Thrombus as Downstaging to Liver Transplantation
Authors: Luciano Tarantino, Emanuele Balzano, Paolo Tarantino, Riccardo Aurelio Nasto, Aurelio Nasto
Abstract:
Liver transplantation (OLT) is contraindicated in portal vein tumor thrombosis (PVTT) from hepatocellular carcinoma at the hepatic hilum (pH-HCC). Surgery, thermal ablation, and chemotherapy show poorer outcomes. Electrochemotherapy (ECT) has been successfully used in patients with pH-HCC with PVTT. We report the results of ECT as downstaging aimed at definitive cure by OLT. F.P., 53 years old, HBV-related cirrhosis, Child-Pugh class B7; EGDS: F2 esophageal varices; diabetes. April 2016: enhanced computed tomography (CT) detected HCC (3 nodules in segments VII-VIII-VI; diameter range = 25 cm) and PVTT of the right portal vein. The patient was considered ineligible for OLT. May 2016: first ablation session with percutaneous radiofrequency ablation (RFA) of the 3 HCC nodules. August 2016: second ablation session with ECT of the PVTT. CT October 2016: disappearance of the PVTT and a patent right portal vein; no intraparenchymal recurrence. CT March 2017: no recurrence in the portal vein or in the left lobe; local recurrence in segments VII-VIII. May 2017: transarterial chemoembolization (TACE) of the right lobe recurrences. CT October 2017: patent right portal vein, no recurrence. The patient was reconsidered for OLT and underwent OLT in April 2018. At 36 months of follow-up, no intrahepatic recurrence of HCC had occurred. March 2021: enhanced CT and PET/CT detected a single small nodule (1.5 cm) uptaking tracer in the left upper pulmonary lobe, with no hepatic recurrence. CT-guided FNB showed metastasis from HCC. June 2021: left upper lung lobectomy. At the current time, the patient is alive and recurrence-free at 64 months of follow-up. ECT could be an effective technique for pre-OLT downstaging in HCC with PVTT. Keywords: liver tumor ablation, interventional ultrasound, electrochemotherapy, liver transplantation
Procedia PDF Downloads 118
26459 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information in relation to their dynamic interests. Current research works consider noise as any data that does not form part of the main web page and propose noise web data reduction tools which mainly focus on eliminating noise in relation to the content and layout of web data. This paper argues that not all data that form part of the main web page are of user interest, and not all noise data are actually noise to a given user. Therefore, learning of noise web data allocated to the user requests ensures not only a reduction of the noisiness level in a web user profile, but also a decrease in the loss of useful information, hence improving the quality of a web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise. Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
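A minimal sketch (not the NWDL algorithm itself; the scoring rule and threshold are simplifications) of the idea of judging "noise" relative to a dynamic user interest: blocks of a page are kept or dropped based on their similarity to an interest profile that is updated after every request.

```python
from collections import Counter

def tokens(text):
    return Counter(text.lower().split())

def cosine(c1, c2):
    """Cosine similarity between two bag-of-words vectors."""
    common = set(c1) & set(c2)
    num = sum(c1[t] * c2[t] for t in common)
    den = (sum(v * v for v in c1.values()) ** 0.5) * (sum(v * v for v in c2.values()) ** 0.5)
    return num / den if den else 0.0

def filter_noise(page_blocks, interest_profile, threshold=0.1):
    """Keep only blocks relevant to the current interest profile and return the
    kept blocks together with an updated (dynamic) profile."""
    kept = [b for b in page_blocks if cosine(tokens(b), interest_profile) >= threshold]
    updated = interest_profile.copy()
    for block in kept:
        updated.update(tokens(block))
    return kept, updated

profile = tokens("machine learning user profiles web mining")
blocks = ["learning user interest models from web logs",
          "advertisement buy now limited offer",
          "noise reduction in web data mining"]
kept, profile = filter_noise(blocks, profile)
print(kept)
```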
Procedia PDF Downloads 265
26458 Towards an Enhanced Quality of IPTV Media Server Architecture over Software Defined Networking
Authors: Esmeralda Hysenbelliu
Abstract:
The aim of this paper is to present the QoE (Quality of Experience) IPTV SDN-based media streaming server enhanced architecture for configuring, controlling, managing, and provisioning the improved delivery of the IPTV service application with low cost, low bandwidth, and high security. Furthermore, a virtual QoE IPTV SDN-based topology is given to provide an improved IPTV service based on QoE control and management of multimedia service functionalities. Inside the OpenFlow SDN controller, two service load-balancing systems are enabled with high flexibility and efficiency: one based on the Load-Balance module and one based on the GeoIP service. These two load-balancing systems greatly improve IPTV end-users' Quality of Experience (QoE) through optimal management of resources. Through the key functionalities of the OpenFlow SDN controller, this approach produced several important features and opportunities for overcoming the critical QoE metrics for the IPTV service, such as achieving a very fast zapping time (channel switching time) of < 0.1 seconds. This approach enabled an easy and powerful transcoding system via the FFMPEG encoder. It has the ability to customize streaming dimensions, bitrates, latency management, and maximum transfer rates, ensuring delivery of IPTV streaming services (audio and video) with high flexibility, low bandwidth, and the required performance. Unlike other architectures, this QoE IPTV SDN-based media streaming architecture provides the possibility of channel exchange between several IPTV service providers all over the world. This new functionality brings many benefits, such as increasing the number of TV channels received by end-users with low cost, decreasing stream failure time (channel failure time < 0.1 seconds), and improving the quality of streaming services. Keywords: improved quality of experience (QoE), OpenFlow SDN controller, IPTV service application, softwarization
Procedia PDF Downloads 147
26457 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study
Authors: Zeba Mahmood
Abstract:
Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way of conducting business today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for their decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketers and customer relationship departments of firms use data mining techniques to make relevant decisions. This paper emphasizes the identification of different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues of executing these techniques are also discussed and critically analyzed in this paper. Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining
Procedia PDF Downloads 538
26456 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of data sets so large and complex that it becomes difficult to process using database management tools. Operations like search, analysis, and visualization on big data are performed using data mining, which is the process of extracting patterns or knowledge from large data sets. In recent years, as data evolves, the results of data mining applications become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results. It utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining. Keywords: big data, map reduce, incremental processing, iterative computation
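A minimal sketch (not the i2MapReduce system itself; a heavily simplified word-count workload) of the key-value-pair-level incremental update idea described above: previously reduced counts are preserved, and only the deltas from added or removed inputs are re-reduced instead of recomputing the whole job from scratch.

```python
from collections import Counter

def map_phase(docs):
    """Map each document to (word, count) pairs."""
    kv = Counter()
    for doc in docs:
        kv.update(doc.split())
    return kv

def incremental_reduce(saved_state, added_docs, removed_docs):
    """Update preserved reduce results using only the changed inputs."""
    state = Counter(saved_state)
    state.update(map_phase(added_docs))        # apply positive deltas
    state.subtract(map_phase(removed_docs))    # retract stale contributions
    return +state                              # drop zero/negative counts

# First run over the full data, then an incremental refresh.
saved = map_phase(["big data mining", "map reduce mining"])
refreshed = incremental_reduce(saved, added_docs=["incremental mining"],
                               removed_docs=["big data mining"])
print(refreshed)
```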
Procedia PDF Downloads 350
26455 AIR SAFE: an Internet of Things System for Air Quality Management Leveraging Artificial Intelligence Algorithms
Authors: Mariangela Viviani, Daniele Germano, Simone Colace, Agostino Forestiero, Giuseppe Papuzzo, Sara Laurita
Abstract:
Nowadays, people spend most of their time in closed environments, in offices, or at home. Therefore, secure and highly livable environmental conditions are needed to reduce the probability of aerial viruses spreading. Also, to lower the human impact on the planet, it is important to reduce energy consumption. Heating, Ventilation, and Air Conditioning (HVAC) systems account for the major part of energy consumption in buildings [1]. Devising systems to control and regulate the airflow is, therefore, essential for energy efficiency. Moreover, an optimal setting for thermal comfort and air quality is essential for people’s well-being, at home or in offices, and increases productivity. Thanks to the features of Artificial Intelligence (AI) tools and techniques, it is possible to design innovative systems with: (i) Improved monitoring and prediction accuracy; (ii) Enhanced decision-making and mitigation strategies; (iii) Real-time air quality information; (iv) Increased efficiency in data analysis and processing; (v) Advanced early warning systems for air pollution events; (vi) Automated and cost-effective monitoring network; and (vii) A better understanding of air quality patterns and trends. We propose AIR SAFE, an IoT-based infrastructure designed to optimize air quality and thermal comfort in indoor environments leveraging AI tools. AIR SAFE employs a network of smart sensors collecting indoor and outdoor data to be analyzed in order to take any corrective measures to ensure the occupants’ wellness. The data are analyzed through AI algorithms able to predict the future levels of temperature, relative humidity, and CO₂ concentration [2]. Based on these predictions, AIR SAFE takes actions, such as opening/closing the window or the air conditioner, to guarantee a high level of thermal comfort and air quality in the environment. In this contribution, we present the results from the AI algorithm we have implemented on the first set of data collected in a real environment. The results were compared with other models from the literature to validate our approach. Keywords: air quality, internet of things, artificial intelligence, smart home
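A minimal sketch (not the AIR SAFE algorithm; the forecasting model and the 1000 ppm threshold are illustrative assumptions) of the predict-then-act loop described above: a short autoregressive forecast of the CO₂ concentration drives a simple ventilation decision.

```python
import numpy as np

def forecast_next(series, lags=3):
    """Fit a small least-squares autoregressive model on past readings
    and predict the next value."""
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = np.array(series[lags:])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.dot(series[-lags:], coeffs))

def ventilation_action(co2_history, threshold_ppm=1000):
    """Open the window / increase airflow if the predicted CO2 level
    exceeds a comfort threshold (threshold is illustrative)."""
    predicted = forecast_next(co2_history)
    return ("open_window" if predicted > threshold_ppm else "keep_closed"), predicted

history = [620, 650, 700, 760, 830, 910, 990]   # toy CO2 readings in ppm
action, pred = ventilation_action(history)
print(action, round(pred))
```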
Procedia PDF Downloads 93
26454 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
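A minimal sketch (not from the paper; a simple exponential rate parameter stands in for the full parametric frailty model) of the divide-and-conquer idea: each subset yields its own maximum likelihood estimate, and the estimates are combined with weights proportional to subset size.

```python
import numpy as np

rng = np.random.default_rng(0)
gap_times = rng.exponential(scale=2.0, size=100_000)   # simulated recurrent-event gap times

def mle_exponential_rate(x):
    """MLE of the rate of an exponential distribution is 1 / sample mean."""
    return 1.0 / np.mean(x)

def divide_and_conquer_estimate(data, n_subsets=10):
    """Split the data, estimate on each subset, and combine the estimates
    with weights proportional to subset size."""
    subsets = np.array_split(rng.permutation(data), n_subsets)
    estimates = np.array([mle_exponential_rate(s) for s in subsets])
    weights = np.array([len(s) for s in subsets], dtype=float)
    return np.average(estimates, weights=weights)

print("full-data MLE     :", mle_exponential_rate(gap_times))
print("divide-and-conquer:", divide_and_conquer_estimate(gap_times))
```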
Procedia PDF Downloads 166
26453 Reducing the Incidence of Hyperphosphatemia in Patients Receiving Dialysis
Authors: Tsai Su Hui
Abstract:
Background: Hyperphosphatemia in patients receiving dialysis can cause hyperparathyroidism, which can lead to renal osteodystrophy, cardiovascular disease, and mortality. Data showed that 26% of patients receiving dialysis at this unit had blood phosphate levels of >6.0 mg/dl from January to March 2017, higher than the Taiwan Society of Nephrology evaluation criterion of <20%. After analysis, possible reasons included: 1. Incomprehensive education for nurses and lack of relevant training. 2. Insufficient assistive aids for nursing health education instruction. 3. Patients were unsure which foods are high or low in phosphate. 4. Patients did not have the habit of carrying their medication with them or know how to correctly administer it. Purpose: To reduce the percentage of patients receiving dialysis with blood phosphate levels of >6.0 mg/dl to less than 20% at this unit. Method: (1) Improve understanding of hyperphosphatemia and food for patients receiving dialysis and their families, (2) Acquire more nursing instruction assistive aids and improve knowledge of hyperphosphatemia among nurses. Results: After implementing the project, the percentage of patients receiving dialysis with blood phosphate levels of >6.0 mg/dl decreased from 26.0% to 18.8% at this unit. By implementing the project, the professional skills of nurses improved, blood phosphate levels of patients receiving dialysis were reduced, and the quality of care for patients receiving dialysis at this unit was enhanced. Keywords: hemodialysis, hyperphosphatemia, incidence, reducing
Procedia PDF Downloads 126
26452 Chemical Speciation and Bioavailability of Some Essential Metal Ions In Different Fish Organs at Lake Chamo, Ethiopia
Authors: Adane Gebresilassie Hailemariam, Belete Yilma Hirpaye
Abstract:
The enhanced concentrations of heavy metals, especially in sediments, may indicate human-induced perturbations rather than natural enrichment through geological weathering. Heavy metals are non-biodegradable, persist in the environment, and are concentrated up the food chain, leading to enhanced levels in the liver and muscle tissues of fishes, aquatic bryophytes, and aquatic biota. Marine organisms in general, and fish in particular, accumulate metals to concentrations many times higher than those present in water or sediment, as they can take up metals in their organs and concentrate them at different levels. Thus, metals acquired through the food chain due to pollution are potential chemical hazards, threatening consumers. Nile tilapia (Oreochromis niloticus), catfish (Clarias gariepinus), and water samples were collected from five sampling sites, namely, inlet-1, inlet-2, center, outlet-1, and outlet-2 of Lake Chamo. The concentrations of the major and trace metals Na, K, Mg, Ca, Cr, Co, Ni, Mn, and Cu in the two fishes' muscle, gill, and liver were determined using an atomic absorption spectrometer (AAS) and a flame photometer (FP). Metal concentrations in the water have also been evaluated within two consecutive seasons, winter (dry) and spring (wet). The results revealed that the concentrations of these metals in tilapia's (O. niloticus) muscle, gill, and liver were Na 44.5, 35.1, 28, Mg 2.8, 8.41, 4.61, K 43, 32, 30, Ca 1.5, 6.0, 5.5, Cr 0.91, 1.2, 3.5, Co 3.0, 2.89, 2.62, Ni 0.94, 1.99, 2.2, Mn 1.23, 1.51, 1.6 and Cu 1.1, 1.99, 3.5 mg kg-1, respectively, and in catfish's muscle, gill, and liver Na 25, 39, 41.5, Mg 4.8, 2.87, 6, K 29, 38, 40, Ca 2.5, 8.10, 3.0, Cr 0.65, 3.5, 5.0, Co 2.62, 1.86, 1.73, Ni 1.10, 2.3, 3.1, Mn 1.54, 1.57, 1.59 and Cu 1.01, 1.10, 3.70 mg kg-1, respectively. The highest accumulations of Na and K were observed for tilapia muscle and catfish gill, Mg and Ca were higher in tilapia gill and catfish liver, while Co was higher in the muscle of the two fishes. The Cr, Ni, Mn, and Cu levels were higher in the livers of the two fish species. In conclusion, metal toxicity through the food chain is currently a dangerous issue for humans and other animals. This needs deep focus to promote the health of living animals. The details of the work are going to be discussed at the conference. Keywords: bioaccumulation, catfish, essential metals, nile tilapia
Procedia PDF Downloads 78
26451 Effect of Agricultural Extension Services on Technical Efficiency of Smallholder Cassava Farmers in Ghana: A Stochastic Meta-Frontier Analysis
Authors: Arnold Missiame
Abstract:
In Ghana, rural dwellers who depend primarily on agriculture for their livelihood constitute about 60% of the country’s population. This shows the critical role and potential of the agricultural sector in helping to achieve Ghana’s Vision 2030. With the current threat of climate change and advancements in technology, agricultural extension is not just about technology transfer and improvements in productivity; it is also about improving the managerial and technical skills of farmers. In Ghana, the government as well as other players in the sector, such as non-governmental organizations (NGOs) and local and international funding agencies, have for decades made capacity-building investments in smallholder farmers by way of extension service delivery. This study sought to compare the technical efficiency of farmers who have access to agricultural extension and farmers who do not in Ghana. The study employed the stochastic meta-frontier model to analyze household survey data comprising 300 smallholder cassava farmers from the Fanteakwa district of Ghana. The farmers were selected through a two-stage sampling technique where 5 communities were purposively selected in the first stage and then 60 smallholder cassava farmers were randomly selected from each of the 5 communities. Semi-structured questionnaires were used to collect data on farmers’ socioeconomic and farm-level characteristics. The results showed that farmers who have access to agricultural extension services have higher technical efficiencies (TE) and produce much closer to their meta-production frontiers (higher technology gap ratios (TGR)) than farmers who do not have access to such extension services. Furthermore, experience in cassava cultivation and formal education significantly improve the technical efficiencies of farmers. The study recommends that the mode and scope of agricultural extension service delivery in the country should be enhanced to ensure that smallholder farmers have easy access to extension agents. Keywords: agricultural extension, Ghana, smallholder farmers, stochastic meta-frontier model, technical efficiency
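For reference, the relationship between the TE and TGR measures compared above follows the standard stochastic meta-frontier decomposition (a textbook formulation stated here for the reader, not reproduced from the paper), where group k is the extension-access or no-access group:

```latex
TE_i = \frac{y_i}{f^{k}(x_i)\, e^{v_i}}                     % efficiency w.r.t. the group-k frontier
TGR_i = \frac{f^{k}(x_i)}{f^{\mathrm{meta}}(x_i)} \le 1      % technology gap ratio
TE_i^{\mathrm{meta}} = TGR_i \times TE_i                     % efficiency w.r.t. the meta-frontier
```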
Procedia PDF Downloads 108
26450 Adoption of Big Data by Global Chemical Industries
Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta
Abstract:
The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the importance of professional competencies and data science that influence BD in chemical industries, to help them move towards intelligent manufacturing quickly and reliably. This article utilizes a literature review and identifies potential applications in the chemical industry to move from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges. Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science
Procedia PDF Downloads 85
26449 Multi-Functional Metal Oxides as Gas Sensors, Photo-Catalysts and Bactericides
Authors: Koyar Rane
Abstract:
Nano- to submicron-size particles with a narrow particle size distribution of semi-conducting TiO₂, ZnO, NiO, CuO, and Fe₂O₃ have been synthesized by a novel hydrazine method and tested for their gas sensing, photocatalytic, and bactericidal activities, and the behavior was found to be enhanced when the oxides were in thin film form, obtained in a specially built spray pyrolysis reactor. The hydrazine method is novel in the sense that, for example, the UV absorption edge of the white pigment grade, wide band gap (~3.2 eV) TiO₂ and ZnO shifted to the visible region, turning the particles yellowish and indicating modification of the band structure. The absorption in the visible region makes these oxides visible-light-sensitive photocatalysts for degrading pollutants, especially organic dyes, which otherwise increase the chemical oxygen demand of drinking water, making the process feasible without the harsh, energetic UV radiation regime. The electromagnetic radiations on irradiation produce electron-hole pairs: Semiconductor + hν → e⁻ + h⁺. The electron-hole pairs thus produced form reactive oxygen species (ROS) on the surface of the semiconductors: O₂ (adsorbed) + e⁻ → O₂•⁻ (superoxide ion); OH⁻ (surface) + h⁺ → •OH (hydroxyl radical). The ROS attack the organic material and micro-organisms. Our antibacterial studies indicate that the metal oxides control the Biological Oxygen Demand (BOD) of drinking water, which was beyond the safe level normally found in the municipal supply. Metal oxides in thin film form show overall enhanced properties, and the films are reusable. The results of the photodegradation and antibacterial studies are discussed. Gas sensing studies have also been done to establish the versatility of the multifunctional metal oxides. Keywords: hydrazine method, visible light sensitive, photo-degradation of dyes, water/airborne pollutant
Procedia PDF Downloads 163
26448 Secure Multiparty Computations for Privacy Preserving Classifiers
Authors: M. Sumana, K. S. Hareesha
Abstract:
Secure computations are essential while performing privacy preserving data mining. Distributed privacy preserving data mining involves two or more sites that cannot pool their data with a third party due to laws protecting the privacy of individuals. Hence, in order to model the private data without compromising privacy and without information loss, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value. Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data
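A minimal, illustrative Paillier-style sketch (not from the paper; toy key sizes far too small for real use) of the additive homomorphic property that such secure sums and dot products rely on: the product of two ciphertexts decrypts to the sum of the plaintexts, so a party can aggregate values it cannot read.

```python
from math import gcd

# Toy Paillier keys (illustrative only; real deployments use ~2048-bit moduli).
p, q = 11, 13
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)                            # modular inverse of lambda mod n

def encrypt(m, r):
    """Paillier encryption c = g^m * r^n mod n^2 (r must be coprime to n)."""
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    u = pow(c, lam, n2)
    return ((u - 1) // n * mu) % n

c1 = encrypt(41, r=7)    # party A's private value
c2 = encrypt(29, r=10)   # party B's private value

# Additive homomorphism: multiplying ciphertexts adds plaintexts (mod n).
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))    # 70, without either party revealing its input
```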
Procedia PDF Downloads 412
26447 Geometry, the language of Manifestation of Tabriz School’s Mystical Thoughts in Architecture (Case Study: Dome of Soltanieh)
Authors: Lida Balilan, Dariush Sattarzadeh, Rana Koorepaz
Abstract:
In the Ilkhanid era, the mystical school of Tabriz manifested itself as an art school in various domains, including miniatures, architecture, and urban planning and design, simultaneously with the expansion of the many sciences of its time. In this era, mysticism reached its peak both in form, in poetry and prose, and in works of art. Mysticism, as an inner belief and way of thought, brought its audience to an artistic and aesthetic view through allegorical and symbolic expression of religion and had a direct impact on the formation of the intellectual and cultural layers of society. At the same time, the mystic school of Tabriz was able to create a symbolic and allegorical language for magnificent works of architecture, drawing on the expansion of science in various fields and using sciences such as mathematics, geometry, the science of numbers, and Abjad letters. In this era, geometry is the middle link between mysticism and architecture; based on its function, it is divided into two categories, intellectual and sensory geometry. The Soltaniyeh dome, a shrine, is one of the prominent buildings of the Tabriz school. In this article, information is collected using a historical-interpretive method, and the results are analyzed using an analytical-comparative method. The results of the study suggest that the designers and builders of the Soltaniyeh dome used shapes, colors, numbers, letters, and words in the form of motifs, geometric patterns, lines, and writings, in levels and layers ranging from plans to decorations and arrays, for architectural symbolization and encryption to express and transmit mystical ideas. Keywords: geometry, Tabriz school, mystical thoughts, dome of Soltaniyeh
Procedia PDF Downloads 86
26446 Incorporating Spatial Selection Criteria with Decision-Maker Preferences of A Precast Manufacturing Plant
Authors: M. N. A. Azman, M. S. S. Ahamad
Abstract:
The Construction Industry Development Board of Malaysia has been actively promoting the use of precast manufacturing in the local construction industry over the last decade. In an era of rapid technological change, precast manufacturing significantly contributes to improving construction activities and ensuring sustainable economic growth. Current studies on the location decision of precast manufacturing plants aimed at enhancing local economic development are scarce. To address this gap, the present research establishes a new set of spatial criteria, such as attribute maps and preference weights, derived from a survey of local industry decision makers. These data represent the input parameters for the MCE-GIS site selection model, for which the weighted linear combination method is used. Verification tests on the model were conducted to determine the potential precast manufacturing sites in the state of Penang, Malaysia. The tests yield a predicted area of 12.87 acres located within a designated industrial zone. Although the model is developed specifically for precast manufacturing plants, it can nevertheless be employed for other types of industries by following the methodology and guidelines proposed in the present research. Keywords: geographical information system, multi criteria evaluation, industrialised building system, civil engineering
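A minimal sketch (not from the study; the criteria names, weights, and raster values are made up) of the weighted linear combination (WLC) step used in such MCE-GIS models: each normalized criterion layer is multiplied by its preference weight and summed into a suitability score per cell.

```python
import numpy as np

# Toy 4x4 raster layers, already normalized to [0, 1] (higher = more suitable).
criteria = {
    "road_access":      np.array([[0.9, 0.8, 0.4, 0.2],
                                   [0.8, 0.7, 0.5, 0.3],
                                   [0.6, 0.6, 0.5, 0.4],
                                   [0.3, 0.4, 0.6, 0.7]]),
    "land_cost":        np.random.default_rng(1).random((4, 4)),
    "distance_to_port": np.random.default_rng(2).random((4, 4)),
}
# Preference weights elicited from decision makers (must sum to 1).
weights = {"road_access": 0.5, "land_cost": 0.3, "distance_to_port": 0.2}

suitability = sum(weights[name] * layer for name, layer in criteria.items())
best_cell = np.unravel_index(np.argmax(suitability), suitability.shape)
print("most suitable cell:", best_cell, "score:", round(float(suitability[best_cell]), 3))
```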
Procedia PDF Downloads 287
26445 Effect of the Deposition Time of Hydrogenated Nanocrystalline Si Grown on Porous Alumina Film on Glass Substrate by Plasma Processing Chemical Vapor Deposition
Authors: F. Laatar, S. Ktifa, H. Ezzaouia
Abstract:
The Plasma Enhanced Chemical Vapor Deposition (PECVD) method is used to deposit hydrogenated nanocrystalline silicon films (nc-Si:H) on porous anodic alumina films (PAF) on glass substrates at different deposition durations. The influence of the deposition time (DT) on the physical properties of nc-Si:H grown on PAF was investigated through an extensive correlation between the micro-structural and optical properties of these films. In this paper, we present an extensive study of the morphological, structural, and optical properties of these films by Atomic Force Microscopy (AFM), X-Ray Diffraction (XRD) techniques, and a UV-Vis-NIR spectrometer. It was found that changes in DT can modify the film thickness and the surface roughness and eventually improve the optical properties of the composite. Optical properties (optical thicknesses, refractive indexes (n), absorption coefficients (α), extinction coefficients (k), and the values of the optical transitions EG) of these samples were obtained using the data of the transmittance T and reflectance R spectra recorded by the UV-Vis-NIR spectrometer. We used the Cauchy and Wemple-DiDomenico models for the analysis of the dispersion of the refractive index and the determination of the optical properties of these films. Keywords: hydrogenated nanocrystalline silicon, plasma processing chemical vapor deposition, X-ray diffraction, optical properties
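For reference, the two dispersion models named above have these standard textbook forms (stated here for the reader, not taken from the paper), where n is the refractive index, λ the wavelength, and E the photon energy:

```latex
% Cauchy dispersion relation (A, B, C are fit constants)
n(\lambda) = A + \frac{B}{\lambda^{2}} + \frac{C}{\lambda^{4}}
% Wemple--DiDomenico single-oscillator model (E_0: oscillator energy, E_d: dispersion energy)
n^{2}(E) - 1 = \frac{E_{d}\,E_{0}}{E_{0}^{2} - E^{2}}
```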
Procedia PDF Downloads 377
26444 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created by using the source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts which are required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on the design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to within-company data learning. Keywords: software metrics, fault prediction, cross project, within project.
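A minimal sketch (not from the paper; the metric generator is a synthetic stand-in for design metrics such as fan-in, fan-out, or complexity) of cross-project fault prediction with Naïve Bayes: a model is trained on design-phase metrics from other projects and evaluated on a target project that has no fault history of its own.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

def synthetic_project(n):
    """Stand-in for module-level design metrics and fault labels."""
    X = rng.normal(size=(n, 3))
    y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 0.8).astype(int)
    return X, y

# Training data pooled from other projects; the target project is used only for testing.
X_src, y_src = synthetic_project(600)
X_tgt, y_tgt = synthetic_project(200)

model = GaussianNB().fit(X_src, y_src)
print(classification_report(y_tgt, model.predict(X_tgt)))
```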
Procedia PDF Downloads 344
26443 Understanding the Impact of Resilience Training on Cognitive Performance in Military Personnel
Authors: Haji Mohammad Zulfan Farhi Bin Haji Sulaini, Mohammad Azeezudde’en Bin Mohd Ismaon
Abstract:
The demands placed on military athletes extend beyond physical prowess to encompass cognitive resilience in high-stress environments. This study investigates the effects of resilience training on the cognitive performance of military athletes, shedding light on the potential benefits and implications for optimizing their overall readiness. In a rapidly evolving global landscape, armed forces worldwide are recognizing the importance of cognitive resilience alongside physical fitness. The study employs a mixed-methods approach, incorporating quantitative cognitive assessments and qualitative data from military athletes undergoing resilience training programs. Cognitive performance is evaluated through a battery of tests, including measures of memory, attention, decision-making, and reaction time. The participants, drawn from various branches of the military, are divided into experimental and control groups. The experimental group undergoes a comprehensive resilience training program, while the control group receives traditional physical training without a specific focus on resilience. The initial findings indicate a substantial improvement in cognitive performance among military athletes who have undergone resilience training. These improvements are particularly evident in domains such as attention and decision-making. The experimental group demonstrated enhanced situational awareness, quicker problem-solving abilities, and increased adaptability in high-stress scenarios. These results suggest that resilience training not only bolsters mental toughness but also positively impacts cognitive skills critical to military operations. In addition to quantitative assessments, qualitative data is collected through interviews and surveys to gain insights into the subjective experiences of military athletes. Preliminary analysis of these narratives reveals that participants in the resilience training program report higher levels of self-confidence, emotional regulation, and an improved ability to manage stress. These psychological attributes contribute to their enhanced cognitive performance and overall readiness. Moreover, this study explores the potential long-term benefits of resilience training. By tracking participants over an extended period, we aim to assess the durability of cognitive improvements and their effects on overall mission success. Early results suggest that resilience training may serve as a protective factor against the detrimental effects of prolonged exposure to stressors, potentially reducing the risk of burnout and psychological trauma among military athletes. This research has significant implications for military organizations seeking to optimize the performance and well-being of their personnel. The findings suggest that integrating resilience training into the training regimen of military athletes can lead to a more resilient and cognitively capable force. This, in turn, may enhance mission success, reduce the risk of injuries, and improve the overall effectiveness of military operations. In conclusion, this study provides compelling evidence that resilience training positively impacts the cognitive performance of military athletes. The preliminary results indicate improvements in attention, decision-making, and adaptability, as well as increased psychological resilience. 
As the study progresses and incorporates long-term follow-ups, it is expected to provide valuable insights into the enduring effects of resilience training on the cognitive readiness of military athletes, contributing to the ongoing efforts to optimize military personnel's physical and mental capabilities in the face of ever-evolving challenges.Keywords: military athletes, cognitive performance, resilience training, cognitive enhancement program
Procedia PDF Downloads 80
26442 The Fusion of Blockchain and AI in Supply Chain Finance: Scalability in Distributed Systems
Authors: Wu You, Burra Venkata Durga Kumar
Abstract:
This study examines the promising potential of integrating Blockchain and Artificial Intelligence (AI) technologies to improve scalability in Distributed Systems within the field of supply chain finance. The finance industry is continually confronted with scalability challenges in its Distributed Systems, particularly within the supply chain finance sector, impacting efficiency and security. Blockchain, with its inherent attributes of high scalability and a secure distributed ledger system, coupled with AI's strengths in optimizing data processing and decision-making, holds the key to innovating the industry's approach to these issues. This study elucidates the synergistic interplay between Blockchain and AI, detailing how their fusion can drive a significant transformation in the supply chain finance sector's Distributed Systems. It offers specific use-cases within this field to illustrate the practical implications and potential benefits of this technological convergence. The study also discusses future possibilities and current challenges in implementing this groundbreaking approach within the context of supply chain finance. It concludes that the intersection of Blockchain and AI could ignite a new epoch of enhanced efficiency, security, and transparency in the Distributed Systems of supply chain finance within the financial industry. Keywords: blockchain, artificial intelligence (AI), scaled distributed systems, supply chain finance, efficiency and security
Procedia PDF Downloads 93
26441 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features
Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova
Abstract:
The problem of emotion recognition is a challenging one. It is still an open problem from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used for building an emotion recognition system. Support Vector Machine classifiers are built by using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison between the classifiers built from facial data only, voice data only, and from the combination of both types of data is made here. The need for a better combination of the information from facial expression and voice data is argued. Keywords: emotion recognition, facial recognition, signal processing, machine learning
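A minimal sketch (not from the paper; the features are synthetic stand-ins for the extracted voice and facial descriptors) of the three-way comparison described above: one SVM per modality plus one trained on the concatenated voice and facial feature vectors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
labels = rng.integers(0, 4, size=n)                          # four emotion classes
voice  = rng.normal(size=(n, 20)) + labels[:, None] * 0.3    # synthetic voice features
facial = rng.normal(size=(n, 30)) + labels[:, None] * 0.2    # synthetic facial features
fused  = np.hstack([voice, facial])                          # early (feature-level) fusion

for name, X in [("voice only", voice), ("facial only", facial), ("fused", fused)]:
    score = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
    print(f"{name:12s} accuracy ≈ {score:.2f}")
```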
Procedia PDF Downloads 316
26440 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels
Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur
Abstract:
With upcoming threats in a digital world, we need to work continuously on security in all aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data by various entities lead to cybercrime and more attacks on the privacy and security of individuals. Cryptography has always been employed to prevent access to important data, using many different processes. Symmetric key and asymmetric key cryptography have been used for keeping data secret at rest as well as in transit. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with much emphasis given to elliptic curve cryptography, which involves algebraic mathematics. Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography
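A minimal, self-contained sketch (not from the paper; the curve parameters are toy values, far too small for real use) of the elliptic-curve Diffie-Hellman idea behind such asymmetric cryptosystems: both parties derive the same shared point from their own private scalar and the other party's public point.

```python
# Toy elliptic curve y^2 = x^3 + 2x + 2 over F_17 with base point G = (5, 1).
P_MOD, A = 17, 2
G = (5, 1)

def point_add(P, Q):
    """Add two curve points (None represents the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                                  # P + (-P) = infinity
    if P == Q:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD       # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD              # chord slope
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Double-and-add scalar multiplication k * P."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, P)
        P = point_add(P, P)
        k >>= 1
    return result

alice_priv, bob_priv = 3, 7                       # toy private scalars
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)

# Both sides compute the same shared secret point.
assert scalar_mult(alice_priv, bob_pub) == scalar_mult(bob_priv, alice_pub)
print("shared point:", scalar_mult(alice_priv, bob_pub))
```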
Procedia PDF Downloads 124
26439 Cooperative Agents to Prevent and Mitigate Distributed Denial of Service Attacks of Internet of Things Devices in Transportation Systems
Authors: Borhan Marzougui
Abstract:
The Road and Transport Authority (RTA) is moving ahead with the implementation of the leader’s vision by exploring all avenues that may bring better security and safety services to the community. Smart transport means using smart technologies such as IoT (Internet of Things). This technology continues to affirm its important role in the context of information and transportation systems. In fact, IoT is a network of Internet-connected objects able to collect and exchange different data using embedded sensors. With the growth of IoT, Distributed Denial of Service (DDoS) attacks are also growing exponentially. DDoS attacks are a major and real threat to various transportation services. Currently, the defense mechanisms are mainly passive in nature, and there is a need to develop a smart technique to handle them. In fact, new IoT devices are being accumulated into botnets for attackers' DDoS purposes. The aim of this paper is to provide a relevant understanding of the dangerous types of DDoS attack related to IoT and to provide valuable guidance for future IoT security methods. Our methodology is based on the development of a distributed algorithm. This algorithm employs dedicated intelligent and cooperative agents to prevent and to mitigate DDoS attacks. The proposed technique ensures preventive action when malicious packets start to be distributed through the connected nodes (the network of IoT devices). In addition, devices such as cameras and radio frequency identification (RFID) tags are connected within the secured network, and the data generated by them are analyzed in real time by intelligent and cooperative agents. The proposed security system is based on a multi-agent system. The obtained results have shown a significant reduction in the number of infected devices and enhanced capabilities of the different security mechanisms. Keywords: IoT, DDoS, attacks, botnet, security, agents
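A minimal sketch (not the paper's algorithm; the thresholds, quorum rule, and topology are made up) of how cooperative agents might flag and share suspicion about a flooding source: each agent tracks per-source packet rates, and a source reported by enough agents is blocked network-wide.

```python
from collections import defaultdict

class Agent:
    """Monitors one IoT gateway and reports sources whose packet rate looks like flooding."""
    def __init__(self, name, rate_threshold=100):
        self.name = name
        self.rate_threshold = rate_threshold
        self.packet_counts = defaultdict(int)

    def observe(self, source, packets):
        self.packet_counts[source] += packets

    def suspicious_sources(self):
        return {s for s, c in self.packet_counts.items() if c > self.rate_threshold}

def cooperative_blacklist(agents, quorum=2):
    """A source is blocked only when at least `quorum` agents agree it is suspicious."""
    votes = defaultdict(int)
    for agent in agents:
        for source in agent.suspicious_sources():
            votes[source] += 1
    return {s for s, v in votes.items() if v >= quorum}

agents = [Agent("camera-gw"), Agent("rfid-gw"), Agent("traffic-gw")]
agents[0].observe("10.0.0.66", 450)   # flooding traffic seen by two gateways
agents[1].observe("10.0.0.66", 300)
agents[2].observe("10.0.0.8", 20)     # normal traffic
print("blocked:", cooperative_blacklist(agents))
```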
Procedia PDF Downloads 143
26438 Using Immersive Study Abroad Experiences to Strengthen Preservice Teachers’ Critical Reflection Skills on Future Classroom Practices
Authors: Meredith Jones, Susan Catapano, Carol McNulty
Abstract:
Study abroad experiences create unique learning opportunities for preservice teachers to strengthen their reflective thinking practices through applied learning experiences. Not only do study abroad experiences provide opportunities for students to expand their cultural sensitivity, but incorporating applied learning experiences in study abroad trips creates unique opportunities for preservice teachers to engage in critical reflection on their teaching skills. Applied learning experiences are designed to nurture learning and growth through a reflective, experiential process outside the traditional classroom setting. As students participate in applied learning experiences, they engage in critical reflection independently, with their peers, and with university faculty. Critical reflection within applied learning contexts generates, deepens, and documents learning but must be intentionally designed to be effective. Grounded in Dewey’s model of reflection, this qualitative study examines longitudinal data from various study abroad cohorts from a particular university. Reflective data was collected during the study abroad trip, and follow up data on critical reflection of teaching practices were collected six months and a year after the trip. Dewey’s model of reflection requires preservice teachers to make sense of their experiences by reflecting on theoretical knowledge, experiences, and pedagogical knowledge. Guided reflection provides preservice teachers with a framework to respond to questions and ideas critical to the applied learning outcomes. Prompts are used to engage preservice teachers in reflecting on situations they have experienced and how they can be transferred to their teaching. Findings from this study noted that students with previous field experiences, or work in the field, engaged in more critical reflection on pedagogical knowledge throughout their applied learning experience. Preservice teachers with limited experiences in the field benefited from engaging in critical reflection prompted by university faculty during the applied learning experience. However, they were able to independently engage in critical reflection once they began work in the field through university field placements, internships, or student teaching. Finally, students who participated in study abroad applied learning experiences reported their critical reflection on their teaching practices, and cultural sensitivity enhanced their teaching and relationships with children once they formally entered the teaching profession.Keywords: applied learning experiences, critical reflection, cultural sensitivity, preservice teachers, teacher education
Procedia PDF Downloads 138
26437 Mechanistic Structural Insights into the UV Induced Apoptosis via Bcl-2 proteins
Authors: Akash Bera, Suraj Singh, Jacinta Dsouza, Ramakrishna V. Hosur, Pushpa Mishra
Abstract:
Ultraviolet C (UVC) radiation induces apoptosis in mammalian cells, and it is suggested that the mechanism by which this occurs is the mitochondrial pathway of apoptosis, through the release of cytochrome c from the mitochondria into the cytosol. The pro- and anti-apoptotic Bcl-2 family of proteins are the regulators of the mitochondrial pathway of apoptosis. Upon UVC irradiation, the proliferation of apoptosis is enhanced through the downregulation of the anti-apoptotic protein Bcl-xL and the up-regulation of Bax. Although the participation of the Bcl-2 family of proteins in apoptosis appears responsive to UVC radiation, to the authors' best knowledge, it is unknown how the structure and, effectively, the function of these proteins are directly impacted by UVC exposure. Against this background, we present here a structural rationale for the effect of UVC irradiation in restoring apoptosis using two of the relevant proteins, namely, Bid-FL and Bcl-xLΔC, whose solution structures have been reported previously. Using a variety of biophysical tools such as circular dichroism, fluorescence, and NMR spectroscopy, we show that following UVC irradiation, the structures of Bcl-xLΔC and Bid-FL are irreversibly altered. Bcl-xLΔC is found to be more sensitive to UV exposure than Bid-FL. From the NMR data, dramatic structural perturbations (α-helix to β-sheet) are seen to occur in the BH3 binding region, a crucial segment of Bcl-xLΔC which impacts the efficacy of its interactions with pro-apoptotic tBid. These results explain the regulation of apoptosis by UVC irradiation. Our results on the irradiation dosage dependence of the structural changes have therapeutic potential for the treatment of cancer. Keywords: Bid, Bcl-xl, UVC, apoptosis
Procedia PDF Downloads 127
26436 Data Recording for Remote Monitoring of Autonomous Vehicles
Authors: Rong-Terng Juang
Abstract:
Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars might not be going to happen in the near future. To speed up the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, for controller area networks (CAN), the new data are mapped onto a time-data two-dimensional space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data. Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar
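A minimal sketch (not the paper's pipeline; a toy integer time series stands in for one CAN signal) of the last two steps named above: differential (delta) sampling followed by Huffman coding of the resulting symbols.

```python
import heapq
from collections import Counter

def delta_encode(samples):
    """Differential sampling: store the first value plus successive differences."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def huffman_code(symbols):
    """Build a Huffman code book {symbol: bitstring} from symbol frequencies."""
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, i2, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, [f1 + f2, i2, merged])
    return heap[0][2]

# Toy CAN signal: a slowly varying value yields many small, repetitive deltas.
signal = [100, 101, 101, 102, 104, 104, 105, 105, 106, 106]
deltas = delta_encode(signal)
book = huffman_code(deltas)
bits = "".join(book[d] for d in deltas)
print(deltas)
print(f"{len(bits)} bits vs {len(signal) * 8} bits uncompressed (8-bit raw samples)")
```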
Procedia PDF Downloads 163
26435 Multi-Label Approach to Facilitate Test Automation Based on Historical Data
Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally
Abstract:
The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation given the time constraints provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer solely has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge by means of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is ‘Subset Accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (Multi-Class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still at 60%. This is better than the current state-of-the-art results. The prediction quality is expected to increase for larger systems with respective historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is thereby independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce. Keywords: machine learning, multi-class, multi-label, supervised learning, test automation
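A minimal sketch (not from the paper; the features and labels are synthetic stand-ins for test-step text features and automation components) of the 'Subset Accuracy' criterion mentioned above: in the multi-label setting, a prediction only counts as correct if the full set of components for a test step matches exactly.

```python
import numpy as np
from sklearn.multioutput import MultiOutputClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 10))                                           # stand-in for test-step features
Y = (X[:, :3] + rng.normal(scale=0.3, size=(400, 3)) > 0).astype(int)    # 3 automation components

X_train, X_test, Y_train, Y_test = X[:300], X[300:], Y[:300], Y[300:]
clf = MultiOutputClassifier(RandomForestClassifier(random_state=0)).fit(X_train, Y_train)
Y_pred = clf.predict(X_test)

# For multi-label indicator matrices, sklearn's accuracy_score is exactly subset accuracy.
print("subset accuracy:", accuracy_score(Y_test, Y_pred))
```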
Procedia PDF Downloads 132
26434 Collaborative Data Refinement for Enhanced Ionic Conductivity Prediction in Garnet-Type Materials
Authors: Zakaria Kharbouch, Mustapha Bouchaara, F. Elkouihen, A. Habbal, A. Ratnani, A. Faik
Abstract:
Solid-state lithium-ion batteries have garnered increasing interest in modern energy research due to their potential for safer, more efficient, and sustainable energy storage systems. Among the critical components of these batteries, the electrolyte plays a pivotal role, with LLZO garnet-based electrolytes showing significant promise. Garnet materials offer intrinsic advantages such as high Li-ion conductivity, wide electrochemical stability, and excellent compatibility with lithium metal anodes. However, optimizing ionic conductivity in garnet structures poses a complex challenge, primarily due to the multitude of potential dopants that can be incorporated into the LLZO crystal lattice. The complexity of material design, influenced by numerous dopant options, requires a systematic method to find the most effective combinations. This study highlights the utility of machine learning (ML) techniques in the materials discovery process to navigate the complex range of factors in garnet-based electrolytes. Collaborators from the materials science and ML fields worked with a comprehensive dataset previously employed in a similar study and collected from various literature sources. This dataset served as the foundation for an extensive data refinement phase, where meticulous error identification, correction, outlier removal, and garnet-specific feature engineering were conducted. This rigorous process substantially improved the dataset's quality, ensuring it accurately captured the underlying physical and chemical principles governing garnet ionic conductivity. The data refinement effort resulted in a significant improvement in the predictive performance of the machine learning model. Originally starting at an accuracy of 0.32, the model underwent substantial refinement, ultimately achieving an accuracy of 0.88. This enhancement highlights the effectiveness of the interdisciplinary approach and underscores the substantial potential of machine learning techniques in materials science research.Keywords: lithium batteries, all-solid-state batteries, machine learning, solid state electrolytes
Procedia PDF Downloads 61
26433 Application Programming Interface Security in Embedded and Open Finance
Authors: Andrew John Zeller, Artjoms Formulevics
Abstract:
Banking and financial services are rapidly transitioning from being monolithic structures focusing merely on their own financial offerings to becoming integrated players in multiple customer journeys and supply chains. Banks themselves are refocusing on being liquidity providers and underwriters in these networks, while the general concept of ‘embeddedness’ builds on API (Application Programming Interface) architectures readily available in the market to flexibly deliver services to various requestors, e.g., online retailers who need finance and insurance products to better serve their own customers. With this new flexibility come new requirements for enhanced cybersecurity. API structures are more decentralized and inherently prone to change. Unfortunately, this has not been comprehensively addressed in the literature. This paper tries to fill this gap by looking at security approaches and technologies relevant to API architectures found in embedded finance. After presenting the research methodology applied and introducing the major bodies of knowledge involved, the paper discusses six dominating technology trends shaping high-level financial services architectures. Subsequently, embedded finance and the respective usage of API strategies are described. Building on this, security considerations for APIs in financial and insurance services are elaborated on before concluding with some ideas for possible further research. Keywords: embedded finance, embedded banking strategy, cybersecurity, API management, data security, IT management
Procedia PDF Downloads 42
26432 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory
Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan
Abstract:
Data fusion technology can be the best way to extract useful information from multiple sources of data. It has been widely applied in various applications. This paper presents a data fusion approach for multimedia data for event detection in Twitter by using the Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. There are two types of data in the fusion. The first is features extracted from text by using the bag-of-words method, which is calculated using the term frequency-inverse document frequency (TF-IDF). The second is the visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments have indicated that, compared to the approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy for event detection. The experimental results showed that the proposed method achieved a high accuracy of 0.97, compared with 0.93 with text only and 0.86 with images only. Keywords: data fusion, Dempster-Shafer theory, data mining, event detection
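A minimal sketch (not from the paper; the mass values are invented) of Dempster's rule of combination over the frame {event, no_event}: the text-based and image-based classifiers each assign belief mass to hypotheses, and the combined masses are renormalized after removing the conflicting mass.

```python
from itertools import product

FRAME = frozenset({"event", "no_event"})

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozensets to masses) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                       # mass assigned to the empty set
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Evidence from the text features (TF-IDF classifier) and the image features (SIFT classifier).
m_text  = {frozenset({"event"}): 0.7, frozenset({"no_event"}): 0.1, FRAME: 0.2}
m_image = {frozenset({"event"}): 0.6, frozenset({"no_event"}): 0.2, FRAME: 0.2}

for hypothesis, mass in dempster_combine(m_text, m_image).items():
    print(set(hypothesis), round(mass, 3))
```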
Procedia PDF Downloads 410
26431 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016
Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi
Abstract:
This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of GPDR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subject’s rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, resulting in sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subject’s rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis in the collection and processing of Big Health Data, as the application of data analytics in Big Health Data signals the construction of new data and subject’s profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.Keywords: big health data, data subject rights, GDPR, pandemic
Procedia PDF Downloads 129