Search results for: artificial intelligence in semiconductor manufacturing
2212 Facial Emotion Recognition Using Deep Learning
Authors: Ashutosh Mishra, Nikhil Goyal
Abstract:
A 3D facial emotion recognition model based on deep learning is proposed in this paper. The deep learning architecture employs two convolution layers and a pooling layer; pooling is performed after the convolution stage. The probabilities for the various classes of human faces are calculated using the sigmoid activation function. The Kaggle dataset is used to verify the accuracy of the deep learning-based face recognition model. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite the significant gains in representation precision provided by the nonlinearity of deep image representations.
Keywords: facial recognition, computational intelligence, convolutional neural network, depth map
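As a concrete illustration of the architecture sketched in the abstract (two convolution layers, one pooling layer, per-class probabilities via sigmoid), below is a minimal untrained forward pass in plain NumPy. The 48×48 grayscale input and the 7 emotion classes are assumptions borrowed from the common Kaggle FER setup, not details given in the abstract; kernel sizes and all weights are illustrative.

```python
import numpy as np

def conv2d(x, k):
    # valid 2-D convolution, single channel, single filter
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def maxpool(x, s=2):
    # non-overlapping s x s max pooling
    H, W = x.shape
    x = x[:H - H % s, :W - W % s]
    return x.reshape(H // s, s, W // s, s).max(axis=(1, 3))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(img, rng):
    # two convolution layers, then one pooling layer, as described
    h = np.maximum(conv2d(img, rng.standard_normal((3, 3))), 0)  # conv 1 + ReLU
    h = np.maximum(conv2d(h, rng.standard_normal((3, 3))), 0)    # conv 2 + ReLU
    h = maxpool(h)                                               # pooling after convolution
    w = rng.standard_normal((h.size, 7)) * 0.01                  # dense layer to 7 emotion classes
    return sigmoid(h.reshape(-1) @ w)                            # per-class probabilities via sigmoid

rng = np.random.default_rng(0)
scores = forward(rng.standard_normal((48, 48)), rng)
print(scores.shape)  # (7,)
```

A real model would of course be trained first; the snippet only shows that the described data flow produces one probability-like score per class.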
Procedia PDF Downloads 231
2211 Intelligent Prediction System for Diagnosis of Heart Attack
Authors: Oluwaponmile David Alao
Abstract:
Due to the increase in the death rate resulting from heart attacks, there is a need to develop a system that can assist in the diagnosis of the disease at the medical centre. Such a system will help prevent misdiagnosis by medical practitioners or physicians. In this research work, the heart disease dataset obtained from the UCI repository has been used to develop an intelligent prediction and diagnosis system. The system is modeled as a feedforward neural network and trained with back-propagation. A recognition rate of 86% is obtained from testing of the network.
Keywords: heart disease, artificial neural network, diagnosis, prediction system
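The described setup (a feedforward network trained with backpropagation on the UCI heart disease data) can be sketched as follows. Since the dataset itself is not bundled here, the snippet substitutes synthetic data of the same shape (303 records, 13 attributes), so the resulting accuracy is illustrative rather than the reported 86%.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# synthetic stand-in mirroring the UCI Cleveland heart dataset shape (303 x 13)
X, y = make_classification(n_samples=303, n_features=13, n_informative=8,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(Xtr)

# feedforward network trained with backpropagation (scikit-learn's MLP)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(Xtr), ytr)

print(round(clf.score(scaler.transform(Xte), yte), 2))
```

With the real dataset one would tune the hidden-layer size and learning rate; the pipeline (scale, train by backpropagation, score on a held-out split) is the part that matches the abstract.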
Procedia PDF Downloads 450
2210 Best Resource Recommendation for a Stochastic Process
Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa
Abstract:
The aim of this study was to develop an Artificial Neural Network recommendation model for an online process using the complexity of load, performance, and average servicing time of the resources. The proposed model investigates resource performance using the stochastic gradient descent method for learning a ranking function. A probabilistic cost function is implemented to identify the optimal θ values (load) on each resource. Based on this result, the resource best suited to perform the currently executing task is recommended. The test result on the CoSeLoG project is presented, with an accuracy of 72.856%.
Keywords: ADALINE, neural network, gradient descent, process mining, resource behaviour, polynomial regression model
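A minimal sketch of the core idea, learning resource weights θ by stochastic gradient descent on a probabilistic (logistic) cost, is shown below. The three features and all data are synthetic stand-ins, not the CoSeLoG event log.

```python
import numpy as np

# features per resource: [load, performance, avg_servicing_time] (illustrative)
rng = np.random.default_rng(1)
X = rng.random((200, 3))
true_theta = np.array([-2.0, 3.0, -1.5])        # synthetic ground truth for the demo
y = (X @ true_theta + 0.1 * rng.standard_normal(200) > 0).astype(float)

theta = np.zeros(3)
lr = 0.5
for epoch in range(200):
    for i in rng.permutation(len(X)):           # stochastic gradient descent: one sample at a time
        p = 1.0 / (1.0 + np.exp(-X[i] @ theta)) # probabilistic (logistic) cost model
        theta += lr * (y[i] - p) * X[i]         # gradient step on the log-likelihood

score = 1.0 / (1.0 + np.exp(-(X @ theta)))      # ranking function over resources
recommended = int(np.argmax(score))             # resource ranked best for the current task
print(recommended)
```

In the actual model the learned θ would rank live resources; here the sanity check is that the learned weights recover the signs of the synthetic ground truth.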
Procedia PDF Downloads 390
2209 Artificial Intelligence Based Method in Identifying Tumour Infiltrating Lymphocytes of Triple Negative Breast Cancer
Authors: Nurkhairul Bariyah Baharun, Afzan Adam, Reena Rahayu Md Zin
Abstract:
The tumor microenvironment (TME) in breast cancer is mainly composed of cancer cells, immune cells, and stromal cells. The interaction between cancer cells and their microenvironment plays an important role in tumor development, progression, and treatment response. The TME in breast cancer includes tumor-infiltrating lymphocytes (TILs), which are implicated in killing tumor cells. TILs can be found in the tumor stroma (sTILs) and within the tumor (iTILs). TILs in triple negative breast cancer (TNBC) have been demonstrated to have prognostic and potentially predictive value. The International Immuno-Oncology Biomarker Working Group (TIL-WG) has developed a guideline focused on the assessment of sTILs using hematoxylin and eosin (H&E)-stained slides. According to the guideline, pathologists use an "eye-balling" method on the H&E-stained slide for sTILs assessment. This method has low precision and poor interobserver reproducibility, is time-consuming for a comprehensive evaluation, and counts only sTILs. The TIL-WG has therefore recommended that any algorithm for computational assessment of TILs utilize the guidelines provided, to overcome the limitations of manual assessment and thus provide highly accurate and reliable TILs detection and classification for reproducible and quantitative measurement. This study was carried out to develop a TNBC digital whole slide image (WSI) dataset from H&E-stained slides and IHC (CD4+ and CD8+)-stained slides. TNBC cases were retrieved from the database of the Department of Pathology, Hospital Canselor Tuanku Muhriz (HCTM). TNBC cases diagnosed between 2010 and 2021, with no history of other cancer and with available tissue blocks, were included in the study (n=58). Tissue blocks were sectioned at approximately 4 µm for H&E and IHC staining. The H&E staining was performed according to a well-established protocol.
Indirect IHC staining was also performed on the tissue sections using the protocol from the Diagnostic BioSystems PolyVue™ Plus Kit, USA. The slides were stained with rabbit monoclonal CD8 antibody (SP16) and rabbit monoclonal CD4 antibody (EP204). The selected and quality-checked slides were then scanned using a high-resolution whole slide scanner (Pannoramic DESK II DW slide scanner) to digitize the tissue image at 20x magnification. A manual TILs (sTILs and iTILs) assessment was then carried out by two appointed pathologists, who scored TILs from the digital WSIs following the guideline developed by the TIL-WG in 2014; the result is displayed as the percentage of sTILs and iTILs per mm² of stromal and tumour area on the tissue. Following this, we aimed to develop an automated digital image scoring framework that incorporates key elements of the manual guidelines (including both sTILs and iTILs), using manually annotated data, for robust and objective quantification of TILs in TNBC. From the study, we have developed a digital dataset of TNBC H&E and IHC (CD4+ and CD8+) stained slides. We hope that an automated scoring method can provide quantitative and interpretable TILs scoring that correlates with the manual pathologist-derived sTILs and iTILs scoring and thus has potential prognostic implications.
Keywords: automated quantification, digital pathology, triple negative breast cancer, tumour infiltrating lymphocytes
Procedia PDF Downloads 116
2208 A Non-Invasive Blood Glucose Monitoring System Using Near-Infrared Spectroscopy with Remote Data Logging
Authors: Bodhayan Nandi, Shubhajit Roy Chowdhury
Abstract:
This paper presents the development of a portable blood glucose monitoring device based on near-infrared spectroscopy. The system supports Internet connectivity through WiFi and uploads the time series of patients' glucose concentrations to a server. In addition, the server is given sufficient intelligence to predict the future pathophysiological state of a patient, given the current and past pathophysiological data. This makes it possible to prognosticate an approaching critical condition well before it actually occurs. The server hosts web applications that allow authorized users to monitor the data remotely.
Keywords: non-invasive, blood glucose concentration, microcontroller, IoT, application server, database server
Procedia PDF Downloads 220
2207 Anthropometric Profile as a Factor of Impact on Employee Productivity in Manufacturing Industry of Tijuana, Mexico
Authors: J. A. López, J. E. Olguín, C. W. Camargo, G. A. Quijano, R. Martínez
Abstract:
This paper presents an anthropometric study conducted on 300 employees in a maquiladora industry belonging to the medical products cluster, as part of a research project that aims to simulate the workplace conditions under which operators conduct their activities. This project is relevant because, traditionally, a study is performed to design ergonomic workspaces according to the anthropometric profile of the users; this paper, however, demonstrates the importance of decision making when the infrastructure cannot be adapted for economic reasons, which puts the emphasis on the user's activity.
Keywords: anthropometry, biomechanics, design, ergonomics, productivity
Procedia PDF Downloads 459
2206 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics
Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir
Abstract:
Due to their favourable material characteristics, fiber reinforced plastics are among the main topics of all current lightweight construction megatrends. Especially in transportation, with applications ranging from aeronautics through the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, light houses, and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide (CO₂) gas lasers, frequency-tripled solid state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is, first of all, the generation of a parameter matrix for laser processing of each used material with a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat affected zone, process gas pressure, work piece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm laser offers essential advantages for future laser processes like cutting, welding, ablating for repair, and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature.
One more feature is economically important for boat, automotive, and military manufacturing projects: the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables its selective removal for repair procedures.
Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone
Procedia PDF Downloads 193
2205 Immuno-Modulatory Role of Weeds in Feeds of Cyprinus carpio
Authors: Vipin Kumar Verma, Neeta Sehgal, Om Prakash
Abstract:
Cyprinus carpio is widespread in the lakes and rivers of Europe and Asia. Heavy losses in the natural environment due to anthropogenic activities, including pollution as well as pathogenic diseases, have landed this fish on the IUCN Red List of vulnerable species. The significance of a suitable diet in preserving the health status of fish is widely recognized. In the present study, artificial feed supplemented with leaves of two weed plants, Eichhornia crassipes and Ricinus communis, was evaluated for its effect on the fish immune system. To achieve this objective, fish were acclimatized to laboratory conditions (25 ± 1 °C; 12L:12D) for 10 days prior to the start of the experiment and divided into 4 groups: non-challenged (negative control = A) and challenged [positive control (B) and experimental (C & D)]. Groups A and B were fed non-supplemented feed, while groups C and D were fed feed supplemented with 5% Eichhornia crassipes and 5% Ricinus communis, respectively. The supplemented feeds were evaluated for their effect on growth, health, immune response, and disease resistance in fish challenged with Vibrio harveyi. Fingerlings of C. carpio (weight 2.0 ± 0.5 g) were exposed to a fresh overnight culture of V. harveyi through bath immunization (concentration 2 × 10⁵) for 2 hours at 10-day intervals over 40 days. Growth was monitored through the increase in relative weight. The mortality due to bacterial infection, as well as any effect of the feed, was recorded accordingly. The immune response of the fish was analyzed through differential leucocyte counts, percentage phagocytosis, and phagocytic index. The effects of V. harveyi on fish organs were examined through histopathological examination of internal organs such as the spleen, liver, and kidney. The change in the immune response was also observed through gene expression analysis.
The antioxidant potential of the plant extracts was measured through DPPH and FRAP assays, and the amounts of total phenols and flavonoids were calculated through biochemical analysis. The chemical composition of the plants' methanol extracts was determined by GC-MS analysis, which showed the presence of various secondary metabolites and other compounds. The investigation revealed an immuno-modulatory effect of the plants when supplemented into the artificial feed of the fish.
Keywords: immuno-modulation, GC-MS, Cyprinus carpio, Eichhornia crassipes, Ricinus communis
Procedia PDF Downloads 491
2204 Transformational Leadership in the United States to Negate Current Ethnocentrisms
Authors: Molly Meadows
Abstract:
Following the presidency of Donald J. Trump, Americans have become hyperaware of the ethnocentrisms that plague the culture. The president's egoist ethics encouraged a divide in what the citizens of the US identify as just or unjust. In the race for global supremacy and the leading ideology, fears have arisen, exacerbated by the ethnocentricity of the country's leader, pointing to the possibly harmful ethical standards of competing nations. Because of ethical absolutism, an international code of ethics would not be possible, and the changes needed to eliminate the stigma surrounding other cultures of thought would need to come from the governing body of the US. As the current leading global ideology, the US would need its government to embody a transformational leadership style in order to unite the motivations of its citizens and encourage intercultural tolerance.
Keywords: ethics, transformational leadership, American politics, egoism, cultural intelligence, ethical relativism
Procedia PDF Downloads 95
2203 Identification of Suitable Sites for Rainwater Harvesting in Salt Water Intruded Area by Using Geospatial Techniques in Jafrabad, Amreli District, India
Authors: Pandurang Balwant, Ashutosh Mishra, Jyothi V., Abhay Soni, Padmakar C., Rafat Quamar, Ramesh J.
Abstract:
Sea water intrusion into coastal aquifers has become one of the major environmental concerns. Although it is a natural phenomenon, it can be induced by anthropogenic activities like excessive exploitation of groundwater, seacoast mining, etc. The geological and hydrogeological conditions, including groundwater heads and the groundwater pumping pattern in coastal areas, also influence the magnitude of seawater intrusion. This problem can, however, be remediated by taking preventive measures like rainwater harvesting and artificial recharge. The present study is an attempt to identify suitable sites for rainwater harvesting in the salt-intrusion-affected area near the coastal aquifer of Jafrabad town, Amreli district, Gujarat, India. The physico-chemical water quality results show that, of the 25 groundwater samples collected from the study area, most contain a high concentration of Total Dissolved Solids (TDS), with major fractions of Na and Cl ions. The Cl/HCO₃ ratio was also found to be greater than 1, which indicates salt water contamination in the study area. A geophysical survey was conducted at nine sites within the study area to explore the extent of sea water contamination. From the inverted resistivity sections, low-resistivity zones (<3 Ohm m) associated with seawater contamination were demarcated in the north block pit and south block pit of the NCJW mines, Mitiyala village, Lotpur, and Lunsapur village at depths of 33 m, 12 m, 40 m, 37 m, and 24 m, respectively. Geospatial techniques in combination with the Analytical Hierarchy Process (AHP), considering hydrogeological factors, geographical features, drainage pattern, water quality, and the geophysical results for the study area, were exploited to identify potential zones for rainwater harvesting. A rainwater harvesting suitability model was developed in ArcGIS 10.1 software, and a rainwater harvesting suitability map for the study area was generated.
AHP in combination with weighted overlay analysis is an appropriate method to identify rainwater harvesting potential zones. The suitability map can further be utilized as a guidance map for the development of rainwater harvesting infrastructure in the study area, either for artificial groundwater recharge facilities or for direct use of the harvested rainwater.
Keywords: analytical hierarchy process, groundwater quality, rainwater harvesting, seawater intrusion
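The AHP step described above can be sketched as follows: priority weights are derived from the principal eigenvector of a Saaty-style pairwise comparison matrix, and a consistency ratio (CR) checks the judgments. The four criteria and all pairwise values below are illustrative, not those of the study.

```python
import numpy as np

# Saaty 1-9 pairwise comparison matrix over illustrative RWH criteria
criteria = ["drainage", "geology", "slope", "water_quality"]
A = np.array([
    [1,   3,   5,   2],
    [1/3, 1,   3,   1/2],
    [1/5, 1/3, 1,   1/4],
    [1/2, 2,   4,   1],
])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)            # Perron (principal) eigenvalue of the positive matrix
w = np.abs(vecs[:, k].real)
w /= w.sum()                        # AHP priority weights from the principal eigenvector

lam_max = vals.real[k]
n = len(A)
CI = (lam_max - n) / (n - 1)        # consistency index
CR = CI / 0.90                      # random index RI = 0.90 for n = 4; accept if CR < 0.1

print(dict(zip(criteria, w.round(3))), round(CR, 3))
```

The resulting weights would then feed the weighted overlay (suitability = Σ wᵢ · layerᵢ) over the rasterized criterion layers in the GIS.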
Procedia PDF Downloads 174
2202 Next Generation of Tunnel Field Effect Transistor: NCTFET
Authors: Naima Guenifi, Shiromani Balmukund Rahi, Amina Bechka
Abstract:
The Tunnel FET is one of the most suitable alternative FET devices to conventional CMOS technology for low-power electronics and applications. Due to its lower subthreshold swing (SS) value, it is a strong candidate for low power applications. It is a quantum FET device that relies on band-to-band (B2B) tunneling as the transport mechanism for charge carriers. Because of band-to-band tunneling, the Tunnel FET suffers from a lower switching current than the conventional metal-oxide-semiconductor field-effect transistor (MOSFET). To improve on the device's features and limitations, the newly invented negative capacitance concept based on ferroelectric material is implemented in the conventional Tunnel FET structure, popularly known as the NC TFET. The present research work has implemented the idea of a high-k gate dielectric combined with ferroelectric material on a double-gate Tunnel FET to realize negative capacitance. It has been observed that negative capacitance further improves device features such as the SS value and helps to reduce power dissipation and switching energy. An extensive investigation of the digital, analog/RF, and linearity features of the double-gate NCTFET has been carried out in this work. Several essential design parameters for analog/RF and linearity, such as the transconductance (gm), the transconductance generation factor (gm/IDS), its higher-order derivatives (gm2, gm3), the cut-off frequency (fT), and the gain-bandwidth product (GBW), have been investigated for low power RF applications. The VIP₂, VIP₃, IMD₃, IIP₃, distortion characteristics (HD2, HD3), 1-dB compression point, delay, and power-delay-product performance have also been thoroughly studied.
Keywords: analog/digital, ferroelectric, linearity, negative capacitance, Tunnel FET, transconductance
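For context on the subthreshold swing (SS) figure of merit discussed above, the standard textbook relations (not taken from this paper) are:

```latex
% Thermionic limit of a conventional MOSFET:
SS = \frac{\partial V_{GS}}{\partial(\log_{10} I_{DS})}
   = \ln(10)\,\frac{kT}{q}\left(1 + \frac{C_{dep}}{C_{ox}}\right)
   \;\ge\; \approx 60\ \mathrm{mV/dec}\quad\text{at } T = 300\,\mathrm{K}.
% With a ferroelectric layer in the gate stack, the effective insulator
% capacitance C_{fe} can be negative, so the body factor
%   m = 1 + C_{s}/C_{fe}
% can fall below 1: the internal gate voltage is amplified and SS can drop
% below 60 mV/dec. TFETs evade the same limit differently, via band-to-band
% tunneling rather than thermionic emission.
```

This is why combining the TFET's tunneling transport with negative capacitance, as the abstract describes, targets steep-slope, low-power switching.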
Procedia PDF Downloads 195
2201 An Easy Approach for Fabrication of Macroporous Apatite-Based Bone Cement Used as Potential Trabecular Bone Substitute
Authors: Vimal Kumar Dewangan, T. S. Sampath Kumar, Mukesh Doble, Viju Daniel Varghese
Abstract:
Apatite-based, i.e., calcium-deficient hydroxyapatite (CDHAp), bone cement is a well-known potential bone graft/substitute in orthopaedics due to its chemical composition being similar to that of natural bone mineral. Therefore, an easy approach was attempted to fabricate apatite-based (CDHAp) bone cement with improved injectability, bioresorbability, and macroporosity. In this study, the desired bone cement was developed by mixing the solid phase (consisting of wet-chemically synthesized nanocrystalline hydroxyapatite and commercially available (synthetic) tricalcium phosphate) and the liquid phase (consisting of a cement-binding accelerator with a few biopolymers in a dilute acidic solution), along with a liquid porogen (polysorbate) or, for comparison, a solid porogen (mannitol), in an optimized liquid-to-powder ratio. The fabricated cement sets within the clinically preferred setting time (≤20 minutes), is well injectable (>70%), and is stable at physiological pH (~7.3-7.4). The CDHAp phase of the bone cement was obtained by immersing the fabricated after-set cement in phosphate buffer solution and other similar artificial body fluids and incubating it at physiological conditions for seven days, as confirmed through X-ray diffraction and Fourier transform infrared spectroscopy analyses. The so-formed synthetic apatite-based bone cement exhibits acceptable compressive strength (within the range of trabecular bone), with an average interconnected pore size in the macropore range (~50-200 µm) inside the cement, as verified by scanning electron microscopy (SEM), mercury intrusion porosimetry, and micro-CT analysis. It is also biodegradable (degrading ~19-22% within 10-12 weeks) when incubated in artificial body fluids under physiological conditions.
The biocompatibility study of the bone cement, when incubated with MG63 cells, shows a significant increase in cell viability after the 3rd day of incubation compared with the control, and the cells were well attached and spread completely on the surface of the bone cement, as confirmed through SEM and fluorescence microscopy analyses. With all this, we can conclude that the developed synthetic macroporous apatite-based bone cement may have the potential to become a promising material for use as a trabecular bone substitute.
Keywords: calcium deficient hydroxyapatite, synthetic apatite-based bone cement, injectability, macroporosity, trabecular bone substitute
Procedia PDF Downloads 87
2200 Medicompills Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine
Authors: Adriana Haulica
Abstract:
Powered by Machine Learning, Precise medicine is by now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of Machine Learning algorithms come from heuristics, the outputs have only contextual validity. This is not very restrictive, in the sense that medicine itself is not an exact science. Meanwhile, the progress made in Molecular Biology, Bioinformatics, Computational Biology, and Precise Medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of Artificial Intelligence in Precise medicine. In fact, current Machine Learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from the classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept tool for information processing in Precise medicine that delivers diagnoses and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class-optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known "needle in a haystack" approach usually used when Machine Learning algorithms have to process differential genomic or molecular data to find biomarkers.
Also, even if the input is seized from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool would. This approach deciphers the biological meaning of the input data down to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until Bio-Logical operations can be performed on the basis of the "common denominator" rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical "proofs". The major impact of this architecture is expressed by the high accuracy of the diagnosis. Delivered as a multiple-conditions diagnostic, constituted by some main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in Precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to the better design of clinical trials and speed them up.
Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics
Procedia PDF Downloads 70
2199 Scope of Rainwater Harvesting in Residential Plots of Dhaka City
Authors: Jubaida Gulshan Ara, Zebun Nasreen Ahmed
Abstract:
Urban flooding and drought have been major problems in Dhaka city, particularly in recent years. The continuous increase of the city's built-up area, limiting the rainwater infiltration zone, is thought to be the main cause of the problem. Proper rainwater management, even at the individual plot level, might bring significant improvement in this regard. As the residential use pattern occupies a significant portion of the city surface, the scope of rainwater harvesting (RWH) in residential buildings deserves investigation. This paper reports on research which explored the scope of rainwater harvesting in residential plots with multifamily apartment buildings in Dhaka city. The research investigated the basics of RWH; contextual information, i.e., hydro-geological and meteorological data for Dhaka city; and the rules and regulations for residential building construction. The study also explored contemporary rainwater harvesting practices in the local and international contexts. On the basis of this theoretical understanding, 21 sample case studies, in different phases of construction, were selected from seven different categories of plot sizes in different residential areas of Dhaka city. Primary data on the 21 case-study buildings were collected from a physical survey and from design drawings, accompanied by a questionnaire survey. All necessary secondary data were gathered from published and other relevant sources. The collected primary and secondary data were used to calculate and analyze the RWH needs for each case study, based on the theoretical understanding. The main findings have been compiled and compared, to observe residential development trends with regard to building rainwater harvesting systems. The study has found that, in multifamily apartment buildings of Dhaka city, the storage and recharge structure size for rainwater harvesting increases with the number of occupants and with the size of the plot. Hence, demand vs.
supply ratio remains almost the same for different sizes of plots, and consequently, the size of the storage structure increases significantly in large-scale plots. It has been found that rainwater can meet only 12%-30% of the total restricted water demand of these residential buildings in Dhaka city. The study therefore concludes that, in the multifamily residential apartments of Dhaka city, artificial groundwater recharge might be a more suitable option for RWH than storing the rainwater on site.
Keywords: Dhaka city, rainwater harvesting, residential plots, urban flood
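The demand-vs-supply comparison underlying these findings can be reproduced with simple arithmetic; all input values below (rainfall, roof area, occupancy, per-capita restricted demand, runoff coefficient) are illustrative assumptions, not the study's survey data.

```python
# back-of-envelope supply-vs-demand check for one plot (all inputs illustrative)
annual_rainfall_m = 2.0        # ~2000 mm/year, approximate for Dhaka
roof_area_m2 = 250.0           # assumed catchment area of the apartment roof
runoff_coeff = 0.8             # fraction of rainfall actually captured

# harvestable volume = rainfall depth x catchment area x runoff coefficient
supply_m3 = annual_rainfall_m * roof_area_m2 * runoff_coeff

occupants = 60
restricted_lpcd = 60           # assumed restricted (non-potable) demand, litres/person/day
demand_m3 = occupants * restricted_lpcd * 365 / 1000.0

coverage = 100.0 * supply_m3 / demand_m3   # % of restricted demand met by rainwater
print(round(supply_m3, 1), round(demand_m3, 1), round(coverage, 1))
```

With these assumed figures the coverage lands near the 12%-30% band the study reports, which illustrates why on-site storage alone covers only a fraction of demand.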
Procedia PDF Downloads 195
2198 Market Index Trend Prediction Using Deep Learning and Risk Analysis
Authors: Shervin Alaei, Reza Moradi
Abstract:
Trading in financial markets is subject to risk due to their high volatility. Here, using an LSTM neural network and some risk-based feature engineering, we developed a method that can accurately predict trends of the Tehran Stock Exchange market index a few days in advance. Our test results have shown that the proposed method, with an average prediction accuracy of more than 94%, is superior to the other common machine learning algorithms. To the best of our knowledge, this is the first work incorporating deep learning and risk factors to accurately predict market trends.
Keywords: deep learning, LSTM, trend prediction, risk management, artificial neural networks
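To make the LSTM component concrete, here is a minimal single-cell forward pass in NumPy over a window of daily returns. The weights are random stand-ins (a real system would train them, e.g. by backpropagation through time), and the inputs are synthetic rather than Tehran Stock Exchange data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b, H):
    z = W @ np.concatenate([h, x]) + b   # all four gates in one matmul
    f = sigmoid(z[:H])                   # forget gate
    i = sigmoid(z[H:2 * H])              # input gate
    g = np.tanh(z[2 * H:3 * H])          # candidate cell state
    o = sigmoid(z[3 * H:])               # output gate
    c = f * c + i * g
    return o * np.tanh(c), c

rng = np.random.default_rng(0)
H, D = 8, 1                              # hidden size, input size (one daily return)
W = rng.standard_normal((4 * H, H + D)) * 0.3
b = np.zeros(4 * H)
w_out = rng.standard_normal(H) * 0.3

returns = rng.standard_normal(30) * 0.01 # synthetic 30-day return window
h, c = np.zeros(H), np.zeros(H)
for r in returns:                        # unroll the cell over the window
    h, c = lstm_step(np.array([r]), h, c, W, b, H)

p_up = sigmoid(w_out @ h)                # probability of an upward trend
print(round(float(p_up), 3))
```

The risk-based feature engineering mentioned in the abstract would replace the raw return with a richer feature vector (volatility, drawdown, etc.) fed into the same cell.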
Procedia PDF Downloads 156
2197 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions
Authors: Joel Niklaus, Matthias Sturmer
Abstract:
The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is administered in accordance with the laws. Equally important is privacy, as a fundamental human right (Article 12 of the Universal Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test for the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate). Consequently, many Swiss courts publish only a fraction of their decisions. An automated anonymization system reduces these costs substantially, further creating the capacity to publish court decisions much more comprehensively. For the re-identification system, topic modeling with Latent Dirichlet Allocation is used to cluster over 500K Swiss court decisions into meaningful related categories.
A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design, ESRA, will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling
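The final anonymization step described above ("replace recognized entities by their category plus an identifier to preserve context") can be sketched as below; the entity list is a toy stand-in for what the trained NER model would emit, and the sample sentence is invented.

```python
import re
from collections import defaultdict

# toy stand-in for NER output: surface form -> entity category
ENTITIES = {"Hans Muster": "PERSON", "Pharma AG": "ORG", "Zurich": "LOC"}

def anonymize(text, entities):
    """Replace each entity with <CATEGORY_n>; the same entity always gets the
    same identifier, so coreference across the decision is preserved."""
    counters = defaultdict(int)
    mapping = {}
    # replace longer surface forms first, so substrings don't clobber them
    for surface, cat in sorted(entities.items(), key=lambda kv: -len(kv[0])):
        if surface not in mapping:
            counters[cat] += 1
            mapping[surface] = f"<{cat}_{counters[cat]}>"
        text = re.sub(re.escape(surface), mapping[surface], text)
    return text, mapping

decision = "Hans Muster sued Pharma AG before the court in Zurich. Pharma AG denied."
anon, mapping = anonymize(decision, ENTITIES)
print(anon)
```

Both occurrences of the same organization map to the same placeholder, which is what keeps the anonymized decision readable while removing the identifying tokens.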
Procedia PDF Downloads 148
2196 Deployment of Attack Helicopters in Conventional Warfare: The Gulf War
Authors: Mehmet Karabekir
Abstract:
Attack helicopters (AHs) are typically deployed in conventional warfare to destroy the enemy's armored and mechanized forces. In addition, AHs are able to perform various tasks in deep and close operations: intelligence, surveillance, reconnaissance, air assault operations, and search and rescue operations. Apache helicopters were employed effectively in the Gulf Wars and contributed to the success of the campaigns by destroying a large number of armored and mechanized vehicles of the Iraqi Army. The purpose of this article is to discuss the deployment of AHs in conventional warfare in the light of the Gulf Wars. First, the employment of AHs in deep and close operations will be addressed with regard to doctrine. Second, the doctrinal and tactical usage of the US armed forces' AH-64 in the First and Second Gulf Wars will be discussed. Keywords: attack helicopter, conventional warfare, gulf wars
Procedia PDF Downloads 473
2195 Enhancing Human Security Through Comprehensive Counter-Terrorism Measures
Authors: Alhaji Khuzaima Mohammed Osman, Zaeem Sheikh Abdul Wadudi Haruna
Abstract:
This article aims to explore the crucial link between counter-terrorism efforts and the preservation of human security. As acts of terrorism continue to pose significant threats to societies worldwide, it is imperative to develop effective strategies that mitigate risks while safeguarding the rights and well-being of individuals. This paper discusses key aspects of counter-terrorism and human security, emphasizing the need for a comprehensive approach that integrates intelligence, prevention, response, and resilience-building measures. By highlighting successful case studies and lessons learned, this article provides valuable insights for policymakers, law enforcement agencies, and practitioners in their quest to address terrorism and foster human security.Keywords: human security, risk mitigation, terrorist activities, civil liberties
Procedia PDF Downloads 88
2194 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators
Authors: Fathi Abid, Bilel Kaffel
Abstract:
The olive tree, the olive harvest during the winter season, and the production of olive oil (better known to professionals as the crushing operation) have long interested institutional traders such as olive-oil offices, private companies in the food industry that refine and extract pomace olive oil, and public and private export-import companies specializing in olive oil. The major problem facing producers of olive oil each winter campaign, contrary to what might be expected, is not whether the harvest will be good but whether the sale price will allow them to cover production costs and achieve a reasonable profit margin. These questions are entirely legitimate given the importance of the issue and the heavy complexity of the uncertainty and competition, made tougher by high levels of indebtedness and by the experience and expertise of speculators and producers whose objectives sometimes conflict. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators' behavior and expectations in the market, how they contribute through their industry knowledge and financial alliances, and the size of the financial challenge they may face in building private information sources globally to gain an advantage. The methodology used in this paper consists of two stages. In the first stage, we study econometrically the formation mechanisms of the olive oil price in order to understand market participants' behavior, implementing ARMA, SARMA, and GARCH models as well as stochastic diffusion processes; the second stage is devoted to prediction, for which we use a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in a way that promotes the formation of stylized facts. Unstable participant behavior creates volatility clustering, nonlinear dependence, and cyclicity.
By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is a backpropagation artificial neural network whose inputs are based on wavelet decomposition and the recent price history. Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility model
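The volatility clustering the authors report is exactly what a GARCH-type process produces. As a minimal illustration (a toy GARCH(1,1) simulation with made-up parameters, not the authors' fitted model):

```python
import math
import random

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85, seed=7):
    """Simulate r_t = sigma_t * z_t with z_t ~ N(0, 1) and
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    assert alpha + beta < 1, "stationarity condition"
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(r)
        var = omega + alpha * r * r + beta * var  # variance update
    return returns

rets = simulate_garch11(2000)
# Large |r_t| tends to follow large |r_{t-1}|: the clustering observed
# in olive oil prices, even though r_t itself is serially uncorrelated.
```

The condition alpha + beta < 1 keeps the process stationary; as alpha + beta approaches 1, shocks to volatility become more persistent and the fat tails more pronounced.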
Procedia PDF Downloads 339
2193 Pattern Identification in Statistical Process Control Using Artificial Neural Networks
Authors: M. Pramila Devi, N. V. N. Indra Kiran
Abstract:
Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are unnatural causes of variation. A process is out of control if a point falls outside the control limits or a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. For each algorithm, the optimal structure is identified; the algorithms are then compared in terms of type I and type II errors of generalization, both with and without early stopping, and the best one is proposed. Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping
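The early-stopping criterion compared in the study can be sketched as a simple patience rule on the validation error; the patience value and loss numbers below are illustrative assumptions, not the paper's settings:

```python
def train_with_early_stopping(val_losses, patience=3):
    """Return (best_epoch, best_loss): training stops after `patience`
    consecutive epochs without improvement in validation loss."""
    best, best_epoch, waited = float("inf"), -1, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, waited = loss, epoch, 0  # improvement: reset
        else:
            waited += 1
            if waited >= patience:
                break  # stop before the network overfits further
    return best_epoch, best

# validation error falls, then rises as the network starts to overfit
losses = [0.9, 0.7, 0.55, 0.5, 0.52, 0.56, 0.6, 0.65]
print(train_with_early_stopping(losses))  # (3, 0.5)
```

Without the patience rule, training would continue through all eight epochs even though generalization (the validation error) worsens after epoch 3.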
Procedia PDF Downloads 372
2192 Effectiveness of Gamified Simulators in the Health Sector
Authors: Nuno Biga
Abstract:
The integration of serious games with gamification in management education and training has gained significant importance in recent years as innovative strategies are sought to improve target audience engagement and learning outcomes. This research builds on the author's previous work in this field and presents a case study that evaluates the ex-post impact of a sample of applications of the BIGAMES management simulator in the training of top managers from various hospital institutions. The methodology includes evaluating the reaction of participants after each edition of BIGAMES Accident & Emergency (A&E) carried out over the last 3 years, as well as monitoring the career path of a significant sample of participants and their feedback more than a year after their experience with this simulator. Control groups will be set up, according to the type of role their members held when they took part in the BIGAMES A&E simulator: Administrators, Clinical Directors and Nursing Directors. Former participants are invited to answer a questionnaire structured for this purpose, where they are asked, among other questions, about the importance and impact that the BIGAMES A&E simulator has had on their professional activity. The research methodology also includes an exhaustive literature review, focusing on empirical studies in the field of education and training in management and business that investigate the effectiveness of gamification and serious games in improving learning, team collaboration, critical thinking, problem-solving skills and overall performance, with a focus on training contexts in the health sector. 
The results of the research carried out show that gamification and serious games that simulate real scenarios, such as Business Interactive Games (BIGAMES©), can significantly increase the motivation and commitment of participants, stimulating the development of transversal skills, the mobilization of group synergies, and the acquisition and retention of knowledge through interactive, user-centred scenarios. Individuals who participate in game-based learning sessions show a higher level of commitment to learning because they find these teaching methods more enjoyable and interactive. This research study aims to demonstrate that, as executive education and training programs develop to meet the current needs of managers, gamification and serious games stand out as effective means of bridging the gap between traditional teaching methods and modern educational and training requirements. To this end, this research evaluates the medium/long-term effects of gamified learning on the professional performance of participants in the BIGAMES simulator applied to healthcare. Based on the conclusions of the evaluation of the effectiveness of training using gamification, and taking into account the results of the opinion poll of former A&E participants, this research study proposes an integrated approach for the transversal application of the A&E Serious Game in various educational contexts, covering top management (traditionally the target audience of BIGAMES A&E), middle and operational management in healthcare institutions (functional area heads and professionals with career development potential), as well as higher education in medicine and nursing courses. The integrated solution called “BIGAMES A&E plus”, developed as part of this research, includes the digitalization of key processes and the incorporation of AI. Keywords: artificial intelligence (AI), executive training, gamification, higher education, management simulators, serious games (SG), training effectiveness
Procedia PDF Downloads 13
2191 Application of the Pattern Method to Form the Stable Neural Structures in the Learning Process as a Way of Solving Modern Problems in Education
Authors: Liudmyla Vesper
Abstract:
The problems of modern education are large-scale and diverse. The aspirations of parents, teachers, and experts converge: everyone is interested in raising a generation of well-rounded, well-educated people. Both family and society expect the future generation to be self-sufficient, desirable in the labor market, and capable of lifelong learning. Today's children have a powerful potential that is difficult to realize within traditional school approaches. Focusing on STEM education in practice often ends with the simple use of computers and gadgets during class. "Science", "technology", "engineering", and "mathematics" are difficult to combine within school and university curricula, which have not changed much during the last 10 years. Solving the problems of modern education largely depends on teacher-innovators and teacher-practitioners who develop and implement effective educational methods and programs, proposing innovative pedagogical practices that allow students to master large-scale knowledge and apply it in practice. Effective education involves the creation of stable neural structures during the learning process, which allow knowledge to be preserved and expanded throughout life. The author proposes a method of integrated lesson-cases based on maths patterns for forming a holistic perception of the world. This method and program are scientifically substantiated and have more than 15 years of practical application experience in school and university classrooms. The first results of the practical application of the author's methodology and curriculum were announced at the International Conference "Teaching and Learning Strategies to Promote Elementary School Success", April 22-23, 2006, Yerevan, Armenia, under the IREX-administered 2004-2006 Multiple Component Education Project. The program is based on the concept of interdisciplinary connections and its implementation in the process of continuous learning.
This allows students to retain and expand knowledge throughout life according to a single pattern: the pattern principle stores information on different subjects according to one scheme (pattern), using long-term memory. This is how stable neural structures are created. The author also suggests that a similar method could be applied to the training of artificial neural networks; however, this assumption requires further research and verification. The educational method and program proposed by the author meet the modern requirements for education, which involve mastering various areas of knowledge starting from an early age. This approach makes it possible to engage the child's cognitive potential as fully as possible and direct it towards the preservation and development of individual talents. According to the methodology, at the early stages of learning, students understand the connections between school subjects (the so-called "sciences" and "humanities") and real life, and apply the knowledge gained in practice. This approach allows students to realize their natural creative abilities and talents, which makes it easier to navigate professional choices and find their place in life. Keywords: science education, maths education, AI, neuroplasticity, innovative education problem, creativity development, modern education problem
Procedia PDF Downloads 62
2190 A Study on Big Data Analytics, Applications and Challenges
Authors: Chhavi Rana
Abstract:
The aim of the paper is to highlight the existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data, which is hard to organise and analyse and can be dealt with using the frameworks and models in this field of study. An organization's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently make things better for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks in the process of analysis using different machine learning techniques. Finally, the paper concludes by stating the different challenges and issues raised in existing research. Keywords: big data, big data analytics, machine learning, review
Procedia PDF Downloads 83
2189 A Study on Big Data Analytics, Applications, and Challenges
Authors: Chhavi Rana
Abstract:
The aim of the paper is to highlight the existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data, which is hard to organise and analyse and can be dealt with using the frameworks and models in this field of study. An organisation's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently make things better for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks in the process of analysis using different machine learning techniques. Finally, the paper concludes by stating the different challenges and issues raised in existing research. Keywords: big data, big data analytics, machine learning, review
Procedia PDF Downloads 95
2188 Big Data Applications for Transportation Planning
Authors: Antonella Falanga, Armando Cartenì
Abstract:
"Big data" refers to extremely large and complex datasets that require specific tools for meaningful analysis and processing. These datasets can stem from diverse origins like sensors, mobile devices, online transactions, social media platforms, and more. The utilization of big data is pivotal, offering the chance to leverage vast information for substantial advantages across diverse fields, thereby enhancing comprehension, decision-making, and efficiency, and fostering innovation in various domains. Big data, distinguished by its remarkable attributes of enormous volume, high velocity, diverse variety, and significant value, represents a transformative force reshaping industry worldwide. Its pervasive impact continues to unlock new possibilities, driving innovation and advancements in technology, decision-making processes, and societal progress in an increasingly data-centric world. The use of these technologies is becoming more widespread, facilitating and accelerating operations that were once much more complicated. In particular, big data has an impact across multiple sectors such as business and commerce, healthcare and science, finance, education, geography, agriculture, and media and entertainment, as well as mobility and logistics. Within the transportation sector, which is the focus of this study, big data applications span a wide variety of uses: optimization of vehicle routing, real-time traffic management and monitoring, logistics efficiency, reduction of travel times and congestion, enhancement of overall transportation systems, and also mitigation of pollutant emissions, contributing to environmental sustainability. Meanwhile, in public administration and the development of smart cities, big data aids in improving public services, urban planning, and decision-making processes, leading to more efficient and sustainable urban environments.
Access to vast data reservoirs enables deeper insights, revealing hidden patterns and facilitating more precise and timely decision-making. Additionally, advancements in cloud computing and artificial intelligence (AI) have further amplified the potential of big data, enabling more sophisticated and comprehensive analyses. Certainly, utilizing big data presents various advantages but also entails several challenges: data privacy and security, ensuring data quality, managing and storing large volumes of data effectively, integrating data from diverse sources, the need for specialized skills to interpret analysis results, ethical considerations in data use, and evaluating costs against benefits. Addressing these difficulties requires well-structured strategies and policies that balance the benefits of big data against privacy, security, and data management concerns. Building upon these premises, the current research investigates the efficacy and influence of big data by providing an overview of the primary and most recent implementations of big data in transportation systems. Overall, this research allows us to conclude that big data can substantially enhance rational decision-making for mobility choices and are imperative for adeptly planning and allocating investments in transportation infrastructure and services. Keywords: big data, public transport, sustainable mobility, transport demand, transportation planning
Procedia PDF Downloads 60
2187 Ultra-High Voltage Energization of Electrostatic Precipitators for Coal Fired Boilers
Authors: Mads Kirk Larsen
Abstract:
Strict air pollution control is high on the agenda worldwide today. Reducing particulate emissions lowers not only the mg/Nm3 figure but also the mercury and other hazardous matter attached to the particles. Furthermore, it becomes possible to capture the fine particles (PM2.5). For particulate control, electrostatic precipitators (ESPs) are still the preferred choice, and much effort has gone into improving their efficiency. Many ESPs have seen electrical upgrades, with the traditional single-phase power system replaced by either three-phase or switched-mode (SMPS, high-frequency) units. However, there exists a fourth type of power supply: the pulse type. It is unfortunately widely unknown but may be of great benefit to power plants. The FLSmidth type is called COROMAX®, a high-voltage pulse generator for precipitators using a semiconductor switch operating at medium potential. The generated high-voltage pulses have a rated amplitude of 80 kV and a duration of 75 μs and are superimposed on a variable base voltage with a rated value of 60 kV, achieving a peak voltage of 140 kV. COROMAX® can raise the voltage beyond the natural spark limit inside the precipitator; voltage levels are often twice as high after its installation. This increases the migration velocity and thereby the efficiency. Since the collection efficiency is proportional to the voltage peak and mean values, the collection of fine particles also improves; tests have shown 80% removal of particles smaller than 0.07 micron. Another advantage is indifference to back-corona. Alongside the emission reduction, power consumption is also reduced. A further advantage of the COROMAX® system is that emissions can be improved without changing the internal parts or enlarging the ESP.
Recently, more than 150 units have been installed in China, where emissions have been reduced to ultra-low levels. Keywords: electrostatic precipitator, high resistivity dust, micropulse energization, particulate removal
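The link between voltage, migration velocity, and collection efficiency described in the abstract is commonly modelled with the Deutsch-Anderson equation (a standard ESP textbook relation, not taken from this abstract). A minimal sketch with purely illustrative numbers:

```python
import math

def deutsch_anderson(w, plate_area, gas_flow):
    """Collection efficiency eta = 1 - exp(-w * A / Q).

    w: effective migration velocity [m/s]
    plate_area: total collecting plate area A [m^2]
    gas_flow: gas volume flow Q [m^3/s]
    """
    return 1.0 - math.exp(-w * plate_area / gas_flow)

# Illustrative numbers only: doubling the migration velocity (e.g. via a
# higher peak voltage) squares the escaping fraction exp(-wA/Q).
base = deutsch_anderson(0.08, 5000, 200)     # ~0.865
boosted = deutsch_anderson(0.16, 5000, 200)  # ~0.982
```

This is why a higher achievable voltage pays off disproportionately: the penetration (1 - eta) falls exponentially in w, so the same precipitator collects far more dust without enlarging the plates, matching the abstract's point that no internal changes are needed.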
Procedia PDF Downloads 300
2186 Sample Preparation and Coring of Highly Friable and Heterogeneous Bonded Geomaterials
Authors: Mohammad Khoshini, Arman Khoshghalb, Meghdad Payan, Nasser Khalili
Abstract:
Most of the rocks at the Earth's surface are technically categorized as weak rocks or weakly bonded geomaterials. Deeply weathered, weakly cemented, friable, and easily erodible, they demonstrate complex material behaviour, and understanding the often-overlooked mechanics of such materials is of particular importance in geotechnical engineering practice. Weakly bonded geomaterials are so susceptible to surface shear and moisture that conventional methods of core drilling fail to extract high-quality undisturbed samples from them. Moreover, most of these geomaterials are highly heterogeneous, making material characterization less reliable and feasible. To compensate for the unpredictability of the material response, either numerous experiments must be conducted or large factors of safety must be applied in the design process; neither approach is sustainable. In this study, a method for the dry core drilling of such materials is introduced to obtain high-quality undisturbed core samples. By freezing the material at a certain moisture content, a secondary structure is developed throughout the material, which helps the whole structure remain intact during core drilling. Moreover, to address the heterogeneity issue, the natural material was reconstructed artificially to obtain a homogeneous material with very high similarity to the natural one from both micro- and macro-mechanical perspectives. The method is verified at both scales. At the micro scale, pore spaces and inter-particle bonds were investigated using Scanning Electron Microscopy (SEM) and compared between the natural and artificial materials. X-ray diffraction (XRD) analyses were also performed to control the chemical composition. At the macro scale, several uniaxial compressive strength tests, as well as triaxial tests, were performed to verify the similar mechanical response of the materials.
A high level of agreement is observed between the micro- and macro-scale results for the natural and artificially bonded geomaterials. The proposed methods can play an important role in cutting the costs of experimental programs for material characterization and in improving the accuracy of numerical modelling based on experimental results. Keywords: artificial geomaterial, core drilling, macro-mechanical behavior, micro-scale, sample preparation, SEM photography, weakly bonded geomaterials
Procedia PDF Downloads 216
2185 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition
Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can
Abstract:
To effectively combat climate change, many countries around the world have committed to decarbonising their electricity supply, along with promoting a large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the network infrastructure necessary to supply the projected demand in a cost-efficient way, considering the evolution of the generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty of power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially for realistic-sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. A promising research area in this regard is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems such as the TNEP. In particular, the use of AI together with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method.
One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of mixed-integer nature, and solving them therefore requires significant amounts of time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that the binary classifier is integrated into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% in estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than its integer programming counterpart, integrating the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required for solving the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique, and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models. Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning
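The classifier-in-the-loop idea can be sketched in miniature. The paper trains a binary classifier on linearly relaxed subproblem solutions; the sketch below substitutes a simple threshold rule and hypothetical variable names, fixing only confident predictions and flagging uncertain binaries for an exact solve, which is one way such a scheme can retain an optimality guarantee:

```python
def classify_binaries(relaxed_vals, lo=0.2, hi=0.8):
    """Fix binaries whose LP-relaxed value is close to 0 or 1; flag the
    rest as 'uncertain' so the exact mixed-integer subproblem is solved
    only for those, preserving the optimality guarantee.

    relaxed_vals: dict mapping variable name -> relaxed value in [0, 1]
    """
    fixed, uncertain = {}, []
    for name, value in relaxed_vals.items():
        if value <= lo:
            fixed[name] = 0          # confidently 'do not build'
        elif value >= hi:
            fixed[name] = 1          # confidently 'build'
        else:
            uncertain.append(name)   # defer to the exact MIP solve
    return fixed, uncertain

# hypothetical relaxed solution for three candidate transmission lines
relaxed = {"build_line_1": 0.03, "build_line_2": 0.97, "build_line_3": 0.55}
fixed, uncertain = classify_binaries(relaxed)
# fixed == {'build_line_1': 0, 'build_line_2': 1}, uncertain == ['build_line_3']
```

In the paper, the threshold rule is replaced by a trained classifier with reported accuracy above 97%, so the expensive integer subproblem is needed only rarely; that is the source of the reported 50% reduction in computation time.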
Procedia PDF Downloads 85
2184 Materials and Techniques of Anonymous Egyptian Polychrome Cartonnage Mummy Mask: A Multiple Analytical Study
Authors: Hanaa A. Al-Gaoudi, Hassan Ebeid
Abstract:
The research investigates the materials and processes used in the manufacturing of an Egyptian polychrome cartonnage mummy mask, with the aim of dating the object and establishing trade patterns for certain materials that were used and available in ancient Egypt. This object of unknown provenance was held in the basement storage of the Egyptian Museum in Cairo (EMC) and has never been on display; no information is available regarding its owner, provenance, or date, or even when the museum acquired it. Moreover, the object is in very poor condition: almost two-thirds of the mask is bent, and it has never received any conservation treatment. This research has utilized well-established multi-analytical methods to identify the considerable diversity of materials used in the manufacturing of this object. These methods include a computed tomography scan (CT scan) to acquire detailed pictures of the internal physical structure and the condition of the bent layers; a Dino-Lite portable digital microscope, scanning electron microscopy with an energy-dispersive X-ray spectrometer (SEM-EDX), and the non-invasive technique of multispectral imaging (MSI) to obtain information about the physical characteristics and condition of the painted layers and to examine the microstructure of the materials; a portable XRF spectrometer (PXRF) and X-ray powder diffraction (XRD) to identify mineral phases and the bulk element composition of the gilded layer, ground, and pigments; Fourier-transform infrared spectroscopy (FTIR) to identify organic compounds and their molecular characterization; and accelerator mass spectrometry (AMS 14C) to date the object. Preliminary results suggest that there are no human remains inside the object and that the textile support is linen fibres with a 1/1 tabby weave; these fibres are in very bad condition.
Several pigments have been identified, including Egyptian blue, magnetite, Egyptian green frit, hematite, calcite, and cinnabar. Moreover, the gilded layers are pure gold, and the binding medium is gum arabic in the pigments and animal glue in the textile support layer. Keywords: analytical methods, Egyptian museum, mummy mask, pigments, textile
Procedia PDF Downloads 125
2183 Solid Dosages Form Tablet: A Summary on the Article by Shashank Tiwari
Authors: Shashank Tiwari
Abstract:
The most common method of drug delivery is the oral solid dosage form, of which tablets and capsules are predominant. The tablet is more widely accepted and used than the capsule for a number of reasons, such as cost, tamper resistance, ease of handling and packaging, ease of identification, and manufacturing efficiency. Over the past several years, the issue of tamper resistance has resulted in the conversion of most over-the-counter (OTC) drugs from capsules to tablets. Keywords: capsule, drug delivery, dosages, solid, tablet
Procedia PDF Downloads 439