Search results for: artificial joints
1865 [Keynote Speech]: Determination of Naturally Occurring and Artificial Radionuclide Activity Concentrations in Marine Sediments in Western Marmara, Turkey
Authors: Erol Kam, Z. U. Yümün
Abstract:
Natural and artificial radionuclides cause radioactive contamination of the environment; like other non-biodegradable pollutants (heavy metals, etc.), they sink to the sea floor and accumulate in sediments. The habitats of benthic foraminifera living on or within the sediments of the seafloor are especially affected by radioactive pollution in the marine environment. Determining the radionuclides is therefore important for pollution analysis. Radioactive pollution accumulates at the lowest level of the food chain and reaches humans at the highest level; the greater the accumulation, the more the environment is endangered. This study used gamma spectrometry to investigate the natural and artificial radionuclide distribution of sediment samples taken from living benthic foraminifera habitats in the Western Marmara Sea. The radionuclides K-40, Cs-137, Ra-226, Mn-54, Zr-95+ and Th-232 were identified in the sediment samples. For this purpose, 18 core samples were taken from depths of about 25-30 meters in the Marmara Sea in 2016. The locations of the core samples were selected specifically at discharge points of domestic and industrial areas, port locations, and so forth, to represent pollution in the study area. Gamma spectrometric analysis was used to determine the radioactive properties of the sediments. The radionuclide activity concentration values in the sediment samples were Cs-137 = 0.9-9.4 Bq/kg, Th-232 = 18.9-86 Bq/kg, Ra-226 = 10-50 Bq/kg, K-40 = 24.4-670 Bq/kg, Mn-54 = 0.71-0.9 Bq/kg and Zr-95+ = 0.18-0.19 Bq/kg. These values were compared with the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) data, and an environmental analysis was carried out. The Ra-226 series, the Th-232 series, and the K-40 radionuclides accumulate naturally and are increasing every day due to anthropogenic pollution. Although the Ra-226 values obtained in the study areas remained within normal limits according to the UNSCEAR values, the K-40 and Th-232 series values were found to be high at almost all locations.
Keywords: Ra-226, Th-232, K-40, Cs-137, Mn-54, Zr-95+, radionuclides, Western Marmara Sea
Procedia PDF Downloads 420
1864 Application of Artificial Neural Network for Prediction of Retention Times of Some Secoestrane Derivatives
Authors: Nataša Kalajdžija, Strahinja Kovačević, Davor Lončar, Sanja Podunavac Kuzmanović, Lidija Jevrić
Abstract:
In order to investigate the relationship between retention and structure, a quantitative structure-retention relationship (QSRR) study was applied to predict the retention times of a set of 23 secoestrane derivatives in reversed-phase thin-layer chromatography. After the calculation of molecular descriptors, a suitable set of descriptors was selected using stepwise multiple linear regression. An Artificial Neural Network (ANN) was employed to model the nonlinear structure-retention relationships. The ANN technique resulted in a 5-6-1 ANN model with a correlation coefficient of 0.98. We found that the following descriptors have a significant effect on the retention times: critical pressure, total energy, protease inhibition, distribution coefficient (LogD), and the lipophilicity parameter miLogP. The prediction results are in very good agreement with the experimental ones. This approach provides a new and effective method for predicting the chromatographic retention index of the secoestrane derivatives investigated.
Keywords: lipophilicity, QSRR, RP TLC retention, secoestranes
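As a concrete illustration of the approach above, the sketch below fits a small multilayer-perceptron regressor with the reported 5-6-1 topology (five descriptor inputs, six hidden neurons, one retention-time output). The descriptor and retention values are synthetic stand-ins, not the study's data, and scikit-learn is an assumed tool choice.

```python
# Minimal sketch of a 5-6-1 QSRR network, assuming scikit-learn;
# all numbers below are synthetic stand-ins for the 23 derivatives.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 23 compounds x 5 descriptors: critical pressure, total energy,
# protease inhibition, LogD, miLogP (placeholder values)
X = rng.normal(size=(23, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=23)  # mock retention times

X_scaled = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(6,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X_scaled, y)
r = np.corrcoef(model.predict(X_scaled), y)[0, 1]
print(f"training correlation coefficient: {r:.3f}")
```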
Procedia PDF Downloads 454
1863 Nanomaterials for Archaeological Stone Conservation: Re-Assembly of Archaeological Heavy Stones Using Epoxy Resin Modified with Clay Nanoparticles
Authors: Sayed Mansour, Mohammad Aldoasri, Nagib Elmarzugi, Nadia A. Al-Mouallimi
Abstract:
The large archaeological stones used in the construction of ancient Pharaonic tombs, temples, obelisks, and other sculptures are always subject to physicomechanical deterioration and destructive forces, leading to their partial or total breakage. The task of reassembling this type of artifact represents a big challenge for conservators. Recently, researchers have turned to new technologies to improve the properties of the traditional adhesive materials and techniques used in the re-assembly of broken large stones. Epoxy resins are used extensively in stone conservation and the re-assembly of broken stone because of their outstanding mechanical properties. The introduction of nanoparticles into polymeric adhesives at low percentages may lead to substantial improvements in their mechanical performance in structural joints and large objects. The aim of this study is to evaluate the effectiveness of clay nanoparticles in enhancing the performance of epoxy adhesives used in the re-assembly of archaeological massive stone by adding proper amounts of those nanoparticles. The nanoparticle-reinforced epoxy nanocomposite was prepared by direct melt mixing with a nanoparticle content of 3% (w/v), then moulded into rectangular samples and used as an adhesive for experimental stone samples. Scanning electron microscopy (SEM) was employed to investigate the morphology of the prepared nanocomposites and the distribution of the nanoparticles inside the composites. The stability and efficiency of the prepared epoxy nanocomposites and of the stone block assemblies made with the newly formulated adhesives were tested by artificially aging the samples under different environmental conditions. The effect of incorporating clay nanoparticles on the mechanical properties of the epoxy adhesives was evaluated comparatively before and after aging by measuring tensile, compressive, and elongation strength. The morphological studies revealed that the mixing process between epoxy and nanoparticles succeeded: a relatively homogeneous morphology with good dispersion at low nanoparticle loadings in the epoxy matrix was obtained. The results show that the epoxy-clay nanocomposites exhibited superior tensile, compressive, and elongation strength. Moreover, the mechanical properties of the stone joints improved markedly in all states when nano-clay was added to the epoxy, in comparison with pure epoxy resin.
Keywords: epoxy resins, nanocomposites, clay nanoparticles, re-assembly, archaeological massive stones, mechanical properties
Procedia PDF Downloads 113
1862 Evaluation of the Conditions of Managed Aquifer Recharge in the West African Basement Area
Authors: Palingba Aimé Marie Doilkom, Mahamadou Koïta, Jean-Michel Vouillamoz, Angelbert Biaou
Abstract:
Most African populations in rural areas rely on groundwater for their consumption. Indeed, in the face of climate change and strong demographic growth, groundwater, particularly in the basement, is increasingly in demand. The question of the sustainability of water resources in this type of environment is therefore becoming a major issue. Groundwater recharge can be natural or artificial. Unlike natural recharge, which often results from the natural infiltration of surface water (e.g., a share of rainfall), artificial recharge consists of causing water infiltration through appropriate works to artificially replenish the water stock of an aquifer. Artificial recharge is, therefore, one of the measures that can be implemented to secure water supply, combat the effects of climate change, and, more generally, contribute to improving the quantitative status of groundwater bodies. It is in this context that the present research is conducted, with the aim of developing artificial recharge in order to contribute to the sustainability of basement aquifers in a context of climatic variability and constantly increasing water needs of populations. In order to achieve the expected results, it is therefore important to determine the characteristics of the infiltration basins and to identify the areas suitable for their implementation. The geometry of the aquifer was reproduced, and its hydraulic properties were collected and characterized, including boundary conditions, hydraulic conductivity, effective porosity, recharge, Van Genuchten parameters, and saturation indices. The aquifer of the Sanon experimental site is made up of three layers, namely the saprolite, the fissured horizon, and the fresh basement; only the saprolite and the fissured horizon were considered for the simulations. The first results with the FEFLOW model show that the water table reacts continuously for the first 100 days before stabilizing. The hydraulic head increases by an average of 1 m. The further away from the basin, the less the water table reacts. However, if a variable hydraulic head is imposed on the basins, the response of the water table is not uniform over time. The lower the basin hydraulic head, the less it affects the water table. These simulations must be continued by refining the basin characteristics in order to obtain those appropriate for good recharge.
Keywords: basement area, FEFLOW, infiltration basin, MAR
Procedia PDF Downloads 76
1861 Anatomical Investigation of Superficial Fascia Relationships with the Skin and Underlying Tissue in the Greyhound Rump, Thigh, and Crus
Authors: Oday A. Al-Juhaishi, Sa`ad M. Ismail, Hung-Hsun Yen, Christina M. Murray, Helen M. S. Davies
Abstract:
The functional anatomy of the fascia in the greyhound is still poorly understood and incompletely described. The basic knowledge of fascia stems mainly from anatomical, histological, and ultrastructural analyses. In this study, twelve hindlimb specimens from six fresh greyhound cadavers (3 male, 3 female) were used to examine the topographical relationships of the superficial fascia with the skin and underlying tissue. The first incision was made along the dorsal midline from the level of the thoracolumbar junction caudally to the level of the mid sacrum. The second incision began at the level of the first incision and extended along the midline of the lateral aspect of the hindlimb distally, to just proximal to the tarsus, and the skin margins were carefully separated to observe connective tissue links between the skin and superficial fascia, attachment points of the fascia, and the relationships of the fascia with blood vessels that supply the skin. A digital camera was used to record the anatomical features as they were revealed. The dissections identified fibrous septa connecting the skin with the superficial fascia and deep fascia in specific areas. Adipose tissue was found to be very rare within the superficial fascia in these specimens. On the extensor aspects of some joints, a fusion between the superficial fascia and deep fascia was observed. This fusion created a subcutaneous bursa in the following areas: a prepatellar bursa of the stifle, a tarsal bursa caudal to the calcaneus bone, and an ischiatic bursa caudal to the ischiatic tuberosity. The evaluation of blood vessels showed that the perforating vessels passed through the fibrous septa perpendicularly to supply the skin, with the largest branch noted in the gluteal area. The attachment points between the superficial fascia and skin were mainly found on the flexor aspects of the joints, such as caudal to the stifle joint. The numerous fibrous septa between the superficial fascia and skin that were identified in some areas may support the blood vessels that penetrate the fascia into the skin, while allowing movement between the tissue planes. The subcutaneous bursae between the skin and the superficial fascia, where it is fused with the deep fascia, may serve to decrease friction between moving areas. The adhesion points may be related to the integrity and loading of the skin. The attachment points fix the skin and appear to divide the hindlimb into anatomical compartments.
Keywords: attachment points, fibrous septa, greyhound, subcutaneous bursa, superficial fascia
Procedia PDF Downloads 359
1860 A Comprehensive Theory of Communication with Biological and Non-Biological Intelligence for a 21st Century Curriculum
Authors: Thomas Schalow
Abstract:
It is commonly recognized that our present curriculum is not preparing students to function in the 21st century. This is particularly true in regard to communication needs across cultures, both human and non-human. In this paper, a comprehensive theory of communication, based on communication with non-human cultures and intelligences, is presented to meet the following three imminent contingencies: communicating with sentient biological intelligences, communicating with extraterrestrial intelligences, and communicating with artificial super-intelligences. The paper begins with the argument that we need to become much more serious about communicating with the non-human, intelligent life forms that already exist around us here on Earth. We need to broaden our definition of communication and reach out to other sentient life forms in order to provide humanity with a better perspective of its place within our ecosystem. The paper next examines the science and philosophy behind CETI (communication with extraterrestrial intelligences) and how it could prove useful even in the absence of contact with alien life. However, CETI's assumptions and methodology need to be revised in accordance with the communication theory proposed in this paper if we are truly serious about finding and communicating with life beyond Earth. The final theme explored in this paper is communication with non-biological super-intelligences. Humanity has never been truly compelled to converse with other species, and our failure to seriously consider such intercourse has left us largely unprepared to deal with communication in a future that will be mediated and controlled by computer algorithms. Fortunately, our experience dealing with other cultures can provide us with a framework for this communication. The basic concepts behind intercultural communication can be applied to the three types of communication envisioned in this paper if we are willing to recognize that we are in fact dealing with other cultures when we interact with other species, alien life, and artificial super-intelligence. The ideas considered in this paper will require a new mindset for humanity, but a new disposition will yield substantial gains. A curriculum that is truly ready for the 21st century needs to be aligned with this new theory of communication.
Keywords: artificial intelligence, CETI, communication, language
Procedia PDF Downloads 364
1859 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques
Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas
Abstract:
The purpose of this research is to develop an algorithm capable of classifying news articles from the automobile industry, according to the competitive actions they entail, with the use of Text Mining (TM) methods. This required testing how to properly preprocess the data by preparing pipelines that fit each algorithm best. The pipelines are tested along with nine different classification algorithms from the realms of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
Keywords: artificial neural network, competitive dynamics, logistic regression, text classification, text mining
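To make the classification setup above concrete, here is a minimal sketch of one of the two selected model families: a preprocessing-plus-logistic-regression pipeline. The example headlines, the competitive-action labels, and the TF-IDF preprocessing choice are illustrative assumptions, not the study's actual pipeline or data.

```python
# Illustrative text-classification pipeline (TF-IDF features + logistic
# regression); the documents and competitive-action labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.metrics import classification_report

docs = [
    "Automaker cuts prices on its flagship sedan",
    "New SUV launched with a hybrid drivetrain",
    "Dealer network expands into a new region",
    "Price war intensifies among manufacturers",
]
labels = ["pricing", "new_product", "market_expansion", "pricing"]

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),   # preprocessing step
    ("clf", LogisticRegression(max_iter=1000)),       # classifier step
])
pipeline.fit(docs, labels)
print(classification_report(labels, pipeline.predict(docs), zero_division=0))
```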
Procedia PDF Downloads 121
1858 The AI Arena: A Framework for Distributed Multi-Agent Reinforcement Learning
Authors: Edward W. Staley, Corban G. Rivera, Ashley J. Llorens
Abstract:
Advances in reinforcement learning (RL) have resulted in recent breakthroughs in the application of artificial intelligence (AI) across many different domains. An emerging landscape of development environments is making powerful RL techniques more accessible for a growing community of researchers. However, most existing frameworks do not directly address the problem of learning in complex operating environments, such as dense urban settings or defense-related scenarios, that incorporate distributed, heterogeneous teams of agents. To help enable AI research for this important class of applications, we introduce the AI Arena: a scalable framework with flexible abstractions for distributed multi-agent reinforcement learning. The AI Arena extends the OpenAI Gym interface to allow greater flexibility in learning control policies across multiple agents with heterogeneous learning strategies and localized views of the environment. To illustrate the utility of our framework, we present experimental results that demonstrate performance gains due to a distributed multi-agent learning approach over commonly used RL techniques in several different learning environments.
Keywords: reinforcement learning, multi-agent, deep learning, artificial intelligence
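The abstract does not show the framework's API, so the following is a hypothetical Gym-style sketch of the core idea: each agent receives a localized observation and contributes its own action per step. The class and method names are assumptions, not the AI Arena's real interface.

```python
# Hypothetical Gym-style multi-agent loop illustrating per-agent
# observations and rewards; not the AI Arena's actual API.
from typing import Dict


class TwoAgentEnv:
    """Toy environment where each agent sees only its own local state."""

    def __init__(self, agent_ids):
        self.agent_ids = list(agent_ids)

    def reset(self) -> Dict[str, float]:
        return {aid: 0.0 for aid in self.agent_ids}  # localized observations

    def step(self, actions: Dict[str, int]):
        obs = {aid: float(act) for aid, act in actions.items()}
        rewards = {aid: 1.0 if act == 1 else 0.0 for aid, act in actions.items()}
        done, info = False, {}
        return obs, rewards, done, info


env = TwoAgentEnv(["scout", "striker"])
obs = env.reset()
for _ in range(3):
    # each agent could apply its own heterogeneous policy here
    actions = {aid: 1 for aid in env.agent_ids}
    obs, rewards, done, info = env.step(actions)
print(rewards)
```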
Procedia PDF Downloads 157
1857 Prediction of Compressive Strength Using Artificial Neural Network
Authors: Vijay Pal Singh, Yogesh Chandra Kotiyal
Abstract:
Structures are a combination of various load-carrying members that safely transfer the loads from the superstructure to the foundation. At the design stage, the loading of the structure is defined, and appropriate material choices are made based upon their properties, mainly related to strength. The strength of materials keeps reducing with time because of many factors, such as environmental exposure and deformation caused by unpredictable external loads. Hence, various techniques are used to predict the strength of the materials used in structures. Among these, non-destructive testing (NDT) techniques are the ones that can be used to predict the strength without damaging the structure. In the present study, the compressive strength of concrete has been predicted using an Artificial Neural Network (ANN). The predicted strength was compared with the experimentally obtained compressive strength of the concrete, and equations were developed for different models. A good correlation has been obtained between the strength predicted by these models and the experimental values. Further, a correlation has been developed for the prediction of strength by regression analysis using two NDT techniques. It was found that the percentage error in the predicted strength was reduced by using combined techniques in place of single techniques.
Keywords: rebound, ultra-sonic pulse, penetration, ANN, NDT, regression
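As a sketch of the combined-technique regression described above, the code below fits a simple linear model that maps two NDT readings (rebound number and ultrasonic pulse velocity) to compressive strength. The data and coefficients are mock values, not the study's measurements.

```python
# Combined-NDT regression sketch: strength ~ a*rebound + b*velocity + c.
# All readings below are synthetic stand-ins for laboratory data.
import numpy as np

rng = np.random.default_rng(1)
rebound = rng.uniform(25, 45, size=30)     # rebound hammer numbers
velocity = rng.uniform(3.5, 4.8, size=30)  # ultrasonic pulse velocity, km/s
strength = 0.9 * rebound + 6.0 * velocity + rng.normal(scale=1.5, size=30)

# Least-squares fit of the combined model
A = np.column_stack([rebound, velocity, np.ones_like(rebound)])
coeffs, *_ = np.linalg.lstsq(A, strength, rcond=None)
predicted = A @ coeffs

percentage_error = 100 * np.abs(predicted - strength) / strength
print(f"mean percentage error: {percentage_error.mean():.2f}%")
```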
Procedia PDF Downloads 428
1856 Thermalytix: An Advanced Artificial Intelligence Based Solution for Non-Contact Breast Screening
Authors: S. Sudhakar, Geetha Manjunath, Siva Teja Kakileti, Himanshu Madhu
Abstract:
Diagnosis of breast cancer at early stages has seen better clinical and survival outcomes. Survival rates in developing countries like India are very low due to accessibility and affordability issues with screening tests such as mammography. In addition, mammography is not very effective in younger women with dense breasts. This leaves a gap in current screening methods. Thermalytix is a new technique for detecting breast abnormality in a non-contact, non-invasive way. It is an AI-enabled computer-aided diagnosis solution that automates the interpretation of high-resolution thermal images and identifies potential malignant lesions. The solution is low-cost, easy to use, portable, and effective in all age groups. This paper presents the results of a retrospective comparative analysis of Thermalytix against mammography and clinical breast examination for breast cancer screening. Thermalytix was found to have better sensitivity than both tests, with good specificity as well. In addition, Thermalytix identified all malignant patients without palpable lumps.
Keywords: breast cancer screening, radiology, thermalytix, artificial intelligence, thermography
Procedia PDF Downloads 291
1855 Kinematical Analysis of Normal Children in Different Age Groups during Gait
Authors: Nawaf Al Khashram, Graham Arnold, Weijie Wang
Abstract:
Background—Gait classification allows clinicians to differentiate gait patterns into clinically important categories that help in clinical decision making. Reliable comparison of gait data between normal children and patients requires knowledge of the gait parameters of normal children in the specific age group. However, there is still a lack of gait databases for normal children of different ages. Objectives—The aim of this study is to investigate the kinematics of the lower limb joints during gait for normal children in different age groups. Methods—Fifty-three normal children (34 boys, 19 girls) were recruited in this study. All the children were aged between 5 and 16 years old. Three age groups were defined: young child (5-7 years), child (8-11 years), and adolescent (12-16 years). When a participant agreed to take part in the project, their parents signed a consent form. A Vicon® motion capture system was used to collect gait data. Participants were asked to walk at their comfortable speed along a 10-meter walkway. Each participant walked up to 20 trials. Three good trials were analyzed using the Vicon Plug-in-Gait model to obtain gait parameters, e.g., walking speed, cadence, and stride length, and joint parameters, e.g., joint angles, forces, and moments. Moreover, each gait cycle was divided into 8 phases. The range of motion (ROM) angles of the pelvis, hip, knee, and ankle joints in three planes for both limbs were calculated using an in-house program. Results—The temporal-spatial variables of the three age groups of normal children were compared with each other; it was found that there was a significant difference (p < 0.05) between the groups. Step length and walking speed gradually increased from the young child to the adolescent group, while cadence gradually decreased from the young child to the adolescent group. The mean and standard deviation (SD) of the step length of the young child, child, and adolescent groups were 0.502 ± 0.067 m, 0.566 ± 0.061 m, and 0.672 ± 0.053 m, respectively. The mean and SD of the cadence of the young child, child, and adolescent groups were 140.11 ± 15.79 steps/min, 129 ± 11.84 steps/min, and 115.96 ± 6.47 steps/min, respectively. Moreover, it was observed that there were significant differences in kinematic parameters, both over the whole gait cycle and within each phase. For example, the ROM of the knee angle in the sagittal plane over the whole cycle is larger in the young child group (65.03 ± 0.52 deg) than in the child group (63.47 ± 0.47 deg). Conclusion—Our results showed that there are significant differences between the age groups in the gait phases, and thus children's walking performance changes with age. Therefore, it is important for the clinician to consider the age group when analyzing patients with lower limb disorders before any clinical treatment.
Keywords: age group, gait analysis, kinematics, normal children
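The ROM calculation mentioned above is simple to state in code: for a joint-angle time series over one normalized gait cycle, ROM is the difference between the maximum and minimum angle. The sketch below uses a mock knee-flexion curve and, purely for illustration, splits the cycle into 8 equal slices; the study's 8 phases would come from detected gait events, not equal splits.

```python
# ROM computation sketch on a mock sagittal-plane knee angle curve.
import numpy as np

t = np.linspace(0, 1, 101)                    # one gait cycle, 0-100%
knee_angle = 35 - 30 * np.cos(2 * np.pi * t)  # mock knee flexion, degrees

rom = knee_angle.max() - knee_angle.min()
print(f"whole-cycle knee ROM: {rom:.1f} deg")

# Per-phase ROM (8 equal slices here, only for illustration)
for i, phase in enumerate(np.array_split(knee_angle, 8), start=1):
    print(f"phase {i}: ROM {phase.max() - phase.min():.1f} deg")
```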
Procedia PDF Downloads 119
1854 The Role of ChatGPT in Enhancing ENT Surgical Training
Authors: Laura Brennan, Ram Balakumar
Abstract:
ChatGPT, developed by OpenAI (November 2022), is a powerful artificial intelligence (AI) language model designed to produce human-like text from user-written prompts. To gain the most from the system, user-written prompts must give context-specific information. This article aims to give guidance on how to optimise the ChatGPT system in the context of education for otolaryngology. Otolaryngology is a specialist field in which little time is dedicated to providing education to both medical students and doctors. Additionally, otolaryngology trainees have seen a reduction in learning opportunities since the COVID-19 pandemic. In this article, we look at these various barriers to medical education in otolaryngology training and suggest ways that ChatGPT can overcome them and assist in simulation-based training. Examples drawing on the authors' experience show how this can be achieved and further highlight the practicalities. What this article has found is that while ChatGPT cannot replace traditional mentorship and practical surgical experience, it can serve as an invaluable supplementary resource to simulation-based medical education in otolaryngology.
Keywords: artificial intelligence, otolaryngology, surgical training, medical education
Procedia PDF Downloads 159
1853 Investigating the Effect of Artificial Intelligence on the Improvement of Green Supply Chain in Industry
Authors: Sepinoud Hamedi
Abstract:
Over the past few decades, companies have developed growing concerns about the environmental impact of their manufacturing activities. Green supply chain management has been considered by manufacturers as a viable option to reduce the environmental impact of operations while at the same time improving their operational performance. Contemporaneously, the advent of digitalization and globalization in the supply chain space has led to a growing recognition of the importance of information processing strategies, such as big data analytics and artificial intelligence technologies, in enhancing and optimizing supply chain performance. Also, supply chain collaboration partially mediates the relationship between manufacturing technology and supply chain performance. Studies show that the use of BDA-AI technologies has a significant impact on environmental process integration and green supply chain collaboration, and they also underline that both environmental process integration and green supply chain collaboration have a critical impact on environmental performance. Correspondingly, a smart supply chain contributes to green performance through managing green relationships and establishing green operations.
Keywords: green supply chain, artificial intelligence, manufacturers, technology, environmental
Procedia PDF Downloads 73
1852 Machine Learning Techniques in Bank Credit Analysis
Authors: Fernanda M. Assef, Maria Teresinha A. Steiner
Abstract:
The aim of this paper is to compare and discuss better classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study uses a database of 5,432 companies that are clients of the bank, where 2,600 clients are classified as non-defaulters, 1,551 are classified as defaulters, and 1,281 are temporary defaulters, meaning that the clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered for a one-against-all assessment using four different techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR), and finally Support Vector Machines (SVM). For each method, different parameters were analyzed in order to obtain different results from which the best of each technique could be compared. Initially, the data were coded in thermometer code (numerical attributes) or dummy coding (nominal attributes). The methods were then evaluated for each parameter, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method, in terms of accuracy, was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters, and 75.37% for temporary defaulters). However, the best accuracy does not always represent the best technique. For instance, on the classification of temporary defaulters, this technique was surpassed, in terms of false positives, by SVM, which had the lowest rate (0.07%) of false positive classifications. All these intrinsic details are discussed in light of the results found, and an overview of what was presented is given in the conclusion of this study.
Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines
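A minimal sketch of the one-against-all setup described above follows, with random stand-ins for the bank's records (the study used 5,432 clients, 15 attributes, and three classes). The SVM variant is shown; the library choice and parameters are assumptions, not the study's configuration.

```python
# One-against-all credit-risk classification sketch with mock data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
X = rng.normal(size=(600, 15))  # 15 attributes per client (stand-ins)
y = rng.choice(["non-defaulter", "defaulter", "temporary"], size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = OneVsRestClassifier(SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)

# Per-class true/false positives and negatives come off the confusion matrix
print(confusion_matrix(y_te, clf.predict(X_te)))
```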
Procedia PDF Downloads 103
1851 Dogmatic Analysis of Legal Risks of Using Artificial Intelligence: The European Union and Polish Perspective
Authors: Marianna Iaroslavska
Abstract:
ChatGPT is becoming commonplace. However, only a few people think about the legal risks of using large language models in their daily work. The main dilemmas concern the following areas: who owns the copyright to what somebody creates through ChatGPT; what OpenAI can do with the prompt you enter; whether you can accidentally infringe on another creator's rights through ChatGPT; and what about the protection of the data somebody enters into the chat. This paper will present these and other legal risks of using large language models at work, using dogmatic methods and case studies. The paper will present a legal analysis of AI risks against the background of European Union law and Polish law. This analysis will answer questions about how to protect data, how to make sure you do not violate copyright, and what is at stake with the AI Act, which recently came into force in the EU. If your work is related to the EU area and you use AI in your work, this paper will be a real goldmine for you. The copyright law in force in Poland does not protect your rights to a work that is created with the help of AI. So if you start selling such a work, you may face two main problems. First, someone may steal your work, and you will not be entitled to any protection because a work created with AI does not have any legal protection. Second, the AI may have created the work by infringing on another person's copyright, so they will be able to claim damages from you. In addition, the EU's current AI Act imposes a number of additional obligations related to the use of large language models. The AI Act divides artificial intelligence into four risk levels and imposes different requirements depending on the level of risk. The EU regulation is aimed primarily at those developing and marketing artificial intelligence systems in the EU market. In addition to the above obstacles, personal data protection comes into play, which is very strictly regulated in the EU. If you violate personal data protection rules by entering information into ChatGPT, you will be liable for the violation. When using AI within the EU or in cooperation with entities located in the EU, you have to take into account a lot of risks. This paper will highlight such risks and explain how they can be avoided.
Keywords: EU, AI Act, copyright, Polish law, LLM
Procedia PDF Downloads 20
1850 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection
Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari
Abstract:
In this paper, an Artificial Neural Network (ANN) was developed to predict asphaltene precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. In this study, experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network with the trainlm training algorithm was found to be the best network to estimate the AP. To check the validity of the proposed model, it was used to predict the AP for the thirty percent of the data that had not been used in training. The mean square error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with the predictions of a modified Hirschberg model. The ANN was found to provide more accurate estimates than the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, the carbon dioxide concentration in the liquid phase increases the AP.
Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs
Procedia PDF Downloads 364
1849 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning
Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu
Abstract:
Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. As we read in the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly, because the use of advanced technologies improves the efficiency and efficacy of healthcare at the same time. Software defined as a medical device is stand-alone software intended to be used for patients for one or more of these specific medical intended uses: diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing the information received from in vitro specimens derived from the human body, without achieving its principal intended action by pharmacological, immunological or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and also to protect patients' safety. The evolution and the continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. This abstract provides an overview of the current regulatory framework, the evolution of international requirements, and the standards applicable to medical device software in potential markets all over the world.
Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems
Procedia PDF Downloads 89
1848 Application of Neural Network in Portfolio Product Companies: Integration of Boston Consulting Group Matrix and Ansoff Matrix
Authors: M. Khajezadeh, M. Saied Fallah Niasar, S. Ali Asli, D. Davani Davari, M. Godarzi, Y. Asgari
Abstract:
This study aims to explore the joint application of the Boston and Ansoff matrices in the operational development of the product. We conduct a deep analysis, utilizing an artificial neural network, to predict the position of the product in the market while the company is interested in increasing its share. The data are gathered from two industries, hygiene and detergent. The effort is made by investigating the behavior of top-player companies and recommending strategic orientations. In conclusion, this combined analysis is appropriate for operational development; it also plays an important role in providing the position of the product in the market for both the hygiene and detergent industries. More importantly, it elaborates on the company's strategies to increase its market share through a combination of the Boston Consulting Group (BCG) Matrix and the Ansoff Matrix.
Keywords: artificial neural network, portfolio analysis, BCG matrix, Ansoff matrix
Procedia PDF Downloads 142
1847 A Philosophical Investigation into African Conceptions of Personhood in the Fourth Industrial Revolution
Authors: Sanelisiwe Ndlovu
Abstract:
Cities have become testbeds for automation and for experimenting with artificial intelligence (AI) in managing urban services and public spaces. Smart Cities and AI systems are changing most human experiences, from health and education to personal relations. For instance, in healthcare, social robots are being implemented as tools to assist patients. Similarly, in education, social robots are being used as tutors or co-learners to promote cognitive and affective outcomes. With that general picture in mind, one can now ask a further question about Smart Cities and artificial agents and their moral standing in the African context of personhood. There has been a wealth of literature on the topic of personhood; however, there is an absence of literature on African personhood in highly automated environments. Personhood in African philosophy is defined by the role one can and should play in the community. However, in today's technologically advanced world, a risk is that machines become more capable of accomplishing tasks that humans would otherwise do. Further, on many African communitarian accounts, personhood and moral standing are associated with active relationality with the community. However, in the Smart City, human closeness is gradually diminishing. For instance, humans already engage and identify with robotic entities, sometimes even romantically. The primary aim of this study is to investigate how African conceptions of personhood and community interact in a highly automated environment such as Smart Cities. Accordingly, the contribution of this study lies in presenting a rarely discussed African perspective that emphasizes the necessity and importance of relationality in handling Smart Cities and AI ethically. Thus, the proposed approach can be seen as the sub-Saharan African contribution to personhood and the growing AI debates, one that takes the reality of the interconnectedness of society seriously. It will also open up new opportunities to tackle old problems and use existing resources to confront new problems in the Fourth Industrial Revolution.
Keywords: smart city, artificial intelligence, personhood, community
Procedia PDF Downloads 202
1846 Profit-Based Artificial Neural Network (ANN) Trained by Migrating Birds Optimization: A Case Study in Credit Card Fraud Detection
Authors: Ashkan Zakaryazad, Ekrem Duman
Abstract:
A typical classification technique ranks the instances in a data set according to the likelihood of belonging to one (positive) class. A credit card (CC) fraud detection model ranks the transactions in terms of the probability of being fraudulent. In fact, this approach is often criticized, because firms do not care about fraud probability but about the profitability or costliness of detecting a fraudulent transaction. The key contribution of this study is to focus on profit maximization in the model-building step. The artificial neural network proposed in this study works on the basis of profit maximization instead of minimizing the error of prediction. Moreover, some studies have shown that the back-propagation algorithm, like other gradient-based algorithms, usually gets trapped in local optima, and swarm-based algorithms are more successful in this respect. In this study, we train our profit-maximization ANN using Migrating Birds Optimization (MBO), which was recently introduced to the literature.
Keywords: neural network, profit-based neural network, sum of squared errors (SSE), MBO, gradient descent
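To make the profit-maximization objective concrete, the sketch below scores transactions with an invented cost-benefit rule: a correctly flagged fraud recovers its amount, while each flagged legitimate transaction incurs a fixed investigation cost. This objective, its parameters, and the data are illustrative assumptions; the study would optimize the network weights against such a function with MBO rather than scanning a threshold as done here.

```python
# Illustrative profit objective for fraud detection: flagging a fraud
# recovers its amount, flagging a legitimate transaction costs a fixed
# investigation fee. All numbers are invented.
import numpy as np

def profit(scores, is_fraud, amounts, threshold=0.5, inspect_cost=10.0):
    flagged = scores >= threshold
    gains = np.where(is_fraud, amounts, -inspect_cost)
    return float(gains[flagged].sum())

rng = np.random.default_rng(3)
scores = rng.uniform(size=1000)           # stand-in for network outputs
is_fraud = rng.uniform(size=1000) < 0.02  # ~2% fraud rate
amounts = rng.exponential(scale=200.0, size=1000)

# A swarm optimizer such as MBO would search the network weights to
# maximize this objective; here we only scan the decision threshold.
best = max((profit(scores, is_fraud, amounts, t), t)
           for t in np.linspace(0.1, 0.9, 17))
print("best profit %.2f at threshold %.2f" % best)
```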
Procedia PDF Downloads 475
1845 Thermal Barrier Coated Diesel Engine With Neural Networks Mathematical Modelling
Authors: Hanbey Hazar, Hakan Gul
Abstract:
In this study, the piston, exhaust, and suction valves of a diesel engine were coated to a thickness of 300 µm with tungsten carbide (WC) using the HVOF coating method. Mathematical modeling of the coated and uncoated (standard) engine was performed using Artificial Neural Networks (ANN). The purpose was to decrease the number of test repetitions and reduce the test cost through mathematical modeling of the engines with ANN. The results obtained from the tests were entered into the ANN, and the engines' values at all speeds were thereby estimated. The results obtained from the tests were compared with those obtained from the ANN and were observed to be compatible. It was also observed that, with the thermal barrier coating, the hydrocarbon (HC), carbon monoxide (CO), and smoke density values of the diesel engine decreased, but nitrogen oxides (NOx) increased. Furthermore, it was determined that the results obtained through mathematical modeling by means of ANN reduced the number of test repetitions. Therefore, it was understood that time, fuel, and labor could be saved in this way.
Keywords: artificial neural network, diesel engine, mathematical modelling, thermal barrier coating
Procedia PDF Downloads 528
1844 Modeling of Digital and Settlement Consolidation of Soil under Oedometer
Authors: Yu-Lin Shen, Ming-Kuen Chang
Abstract:
In addition to a considerable amount of machinery and equipment, intricate transmission pipelines exist in petrochemical plants. Long-term corrosion may lead to pipeline thinning and rupture, causing serious safety concerns. With advances in non-destructive testing technology, more rapid and long-range ultrasonic detection techniques are often used for pipeline inspection. EMAT detects without coupling; it is a non-contact ultrasonic technique suitable for inspecting lines at elevated temperature or with roughened surfaces. In this study, we prepared artificial defects in a pipeline for electromagnetic acoustic transducer (EMAT) testing to survey the relationship between the defect location and sizing and the EMAT signal. It was found that the EMAT signal amplitude exhibited greater attenuation with larger defect depth and length. In addition, with a bigger flat hole diameter, greater amplitude attenuation was obtained. In summary, the EMAT signal amplitude attenuation was affected by the defect depth, the defect length, and the hole diameter.
Keywords: EMAT, artificial defect, NDT, ultrasonic testing
Procedia PDF Downloads 333
1843 Control HVAC Parameters by Brain Emotional Learning Based Intelligent Controller (BELBIC)
Authors: Javad Abdi, Azam Famil Khalili
Abstract:
Modeling emotions has attracted much attention in recent years, both in cognitive psychology and in the design of artificial systems. Although emotion is often considered a negative factor in decision-making, emotions have been shown to be a strong faculty for making fast, satisfying decisions. In this paper, we have adapted a computational model based on the limbic system in the mammalian brain for control engineering applications. Learning in this model is based on Temporal Difference (TD) learning. We applied the proposed controller (termed BELBIC) to a simple model of a submarine that was supposed to reach a desired depth underwater. Our results demonstrate excellent control action, disturbance handling, and system-parameter robustness for TD-BELBIC. The proposed method, given the present conditions, the system's actions in the past, and the control aims, can control the system in a way that these objectives are attained in the least amount of time and in the best way.
Keywords: artificial neural networks, temporal difference, brain emotional learning based intelligent controller, heating, ventilating and air conditioning
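Since the controller's learning rule rests on temporal difference learning, a minimal TD(0) value update is sketched below on a toy discretized depth-error task. The states, reward, and learning constants are invented for illustration and are not the paper's submarine model.

```python
# Minimal TD(0) update loop on a toy depth-error task (invented setup).
import numpy as np

n_states, alpha, gamma = 5, 0.1, 0.9
V = np.zeros(n_states)  # value estimate per discretized depth error
rng = np.random.default_rng(4)

for _ in range(1000):
    s = int(rng.integers(n_states))
    s_next = max(s - 1, 0)            # the control action shrinks the error
    r = 1.0 if s_next == 0 else 0.0   # reward on reaching the target depth
    V[s] += alpha * (r + gamma * V[s_next] - V[s])  # TD-error update

print(np.round(V, 3))
```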
Procedia PDF Downloads 433
1842 Detect Critical Thinking Skill in Written Text Analysis. The Use of Artificial Intelligence in Text Analysis vs Chat/GPT
Authors: Lucilla Crosta, Anthony Edwards
Abstract:
Companies and the marketplace nowadays struggle to find employees with skills adequate to the anticipated growth of their businesses. At least half of workers will need to undertake some form of up-skilling in the next five years in order to remain aligned with the requirements of the market. In order to meet these challenges, there is a clear need to explore the potential uses of AI (artificial intelligence) based tools in assessing the transversal skills (critical thinking, communication, and soft skills of different types in general) of workers and adult students, while empowering them to develop those same skills in a reliable, trustworthy way. Companies seek workers with key transversal skills that can make a difference between workers now and in the future. Critical thinking, however, seems to be one of the most important skills, bringing unexplored ideas and company growth in business contexts. What employers have been reporting for years now is that this skill is lacking in the majority of workers and adult students, and this is particularly visible through their writing. This paper investigates how critical thinking and communication skills are currently developed in higher education environments through the use of AI tools at the postgraduate level. It analyses the use of a branch of AI, namely machine learning and big data, and of neural network analysis. It also examines the potential effects of acquiring these skills through AI tools and what kind of effect this has on employability. This paper will draw information from researchers and studies in higher education at both the national (Italy & UK) and international levels. The issues associated with the development and use of one specific AI tool, Edulai, will be examined in detail. Finally, comparisons will also be made between these tools and the more recent phenomenon of ChatGPT, and shortcomings and drawbacks will be analysed.
Keywords: critical thinking, artificial intelligence, higher education, soft skills, chat GPT
Procedia PDF Downloads 110
1841 Investigating Data Normalization Techniques in Swarm Intelligence Forecasting for Energy Commodity Spot Price
Authors: Yuhanis Yusof, Zuriani Mustaffa, Siti Sakira Kamaruddin
Abstract:
Data mining is a fundamental technique for identifying patterns from large data sets. The extracted facts and patterns contribute to various domains such as marketing, forecasting, and medicine. Prior to mining, data are consolidated so that the resulting process may be more efficient. This study investigates the effect of different data normalization techniques, namely Min-max, Z-score, and decimal scaling, on swarm-based forecasting models. The recent swarm intelligence algorithms employed include the Grey Wolf Optimizer (GWO) and the Artificial Bee Colony (ABC). Forecasting models are then developed to predict the daily spot prices of crude oil and gasoline. Results showed that GWO works better with the Z-score normalization technique, while ABC produces better accuracy with Min-max. Nevertheless, GWO is superior to ABC, as its model generates the highest accuracy for both crude oil and gasoline prices. Such a result indicates that GWO is a promising competitor in the family of swarm intelligence algorithms.
Keywords: artificial bee colony, data normalization, forecasting, Grey Wolf Optimizer
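The three normalization techniques compared above are standard transformations; the sketch below applies each to a handful of mock spot prices to show the resulting scales.

```python
# Min-max, Z-score, and decimal-scaling normalization on mock prices.
import numpy as np

prices = np.array([55.2, 61.8, 48.9, 73.4, 66.1])  # mock crude-oil prices

min_max = (prices - prices.min()) / (prices.max() - prices.min())
z_score = (prices - prices.mean()) / prices.std()
decimal_scaled = prices / 10 ** np.ceil(np.log10(np.abs(prices).max()))

print(min_max)         # rescaled into [0, 1]
print(z_score)         # zero mean, unit variance
print(decimal_scaled)  # |values| < 1 after dividing by a power of ten
```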
Procedia PDF Downloads 475
1840 Machine Learning Based Gender Identification of Authors of Entry Programs
Authors: Go Woon Kwak, Siyoung Jun, Soyun Maeng, Haeyoung Lee
Abstract:
Entry is an educational platform used in South Korea, created to help students learn to program by coding while playing. Using the online version of Entry, teachers can easily assign programming homework, and students can make programs simply by linking programming blocks. However, the programs may be made by others, so the authors of the programs should be identified. In this paper, as the first step toward author identification of Entry programs, we present an artificial neural network based classification approach to identify the gender of the author of a program written in Entry. A neural network has been trained from labeled training data that we have collected. Our results in progress, although preliminary, show that the proposed approach could feasibly be applied to the online version of Entry for gender identification of authors. As future work, we will use a machine learning technique for age identification of Entry programs, which would be the second step toward author identification.
Keywords: artificial intelligence, author identification, deep neural network, gender identification, machine learning
Procedia PDF Downloads 323
1839 Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence
Authors: Sylvester Akpah, Selasi Vondee
Abstract:
Drones, also known as unmanned aerial vehicles (UAVs), have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, and surveillance, amongst others. Operated remotely in real time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has successfully penetrated the world due to technological evolution; thus, many more businesses are utilizing their capabilities. Unfortunately, while drones are replete with the benefits stated above, they are riddled with some problems, mainly attributed to the complexities of learning how to master drone flights, collision avoidance, and enterprise security. Additional challenges, such as the analysis of flight data recorded by sensors attached to the drone, may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. The system allows for easy control of drones through conversations, with the aid of natural language processing, to reduce the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, enable conversation, and give real-time feedback on data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produce human-like conversational abilities using artificial intelligence (natural language understanding). It is recommended that radio signal adapters be used instead of wireless connections to increase the range of communication with the aerial vehicle.
Keywords: artificial intelligence, chatbot, natural language processing, unmanned aerial vehicle
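The control flow described above boils down to mapping a parsed chat intent onto a flight command. The sketch below shows one hypothetical way to do that mapping; the intent names, slot keys, and command strings are invented, not the paper's implementation.

```python
# Hypothetical intent-to-command mapping for a drone-control chatbot.
def handle_intent(intent: str, slots: dict) -> str:
    """Translate a parsed NLP intent into a flight command string."""
    if intent == "takeoff":
        return "TAKEOFF"
    if intent == "goto":
        alt = slots.get("alt", 30)  # default altitude assumed for the sketch
        return f"GOTO {slots['lat']} {slots['lon']} ALT {alt}"
    if intent == "land":
        return "LAND"
    return "HOVER"  # safe fallback for unrecognized requests

print(handle_intent("goto", {"lat": 5.6037, "lon": -0.1870, "alt": 50}))
```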
Procedia PDF Downloads 142
1838 Application of Artificial Neural Network Technique for Diagnosing Asthma
Authors: Azadeh Bashiri
Abstract:
Introduction: Lack of proper diagnosis and inadequate treatment of asthma leads to physical and financial complications. This study aimed to use data mining techniques and to create a neural network intelligent system for the diagnosis of asthma. Methods: The study population consists of patients who had visited one of the lung clinics in Tehran. Data were analyzed using the SPSS statistical tool, and Pearson's chi-square coefficient was the basis of decision making for data ranking. The neural network considered is trained using the back-propagation learning technique. Results: According to the analysis performed by means of SPSS to select the top factors, 13 effective factors were selected. The data were mixed in various forms for different runs, so different models were made for training the data and testing the networks, and in all the different modes, the network was able to correctly predict 100% of the cases. Conclusion: Using data mining methods before designing the system structure, in order to reduce the data dimension and make the optimal choice of data, will lead to a more accurate system. Therefore, considering data mining approaches is necessary, given the nature of medical data.
Keywords: asthma, data mining, artificial neural network, intelligent system
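The factor-selection step described above can be sketched as a chi-square ranking that keeps the top 13 features before network training. The stand-in clinical data, the scikit-learn selector, and the binary encoding are assumptions for illustration (the study used SPSS).

```python
# Chi-square ranking of candidate factors before network training;
# the 'clinical' data here are random binary stand-ins.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

rng = np.random.default_rng(5)
X = rng.integers(0, 2, size=(200, 20))  # 20 candidate clinical findings
y = rng.integers(0, 2, size=200)        # asthma / no-asthma labels

selector = SelectKBest(chi2, k=13).fit(X, y)  # keep the 13 top-ranked factors
X_top = selector.transform(X)
print("selected feature indices:", np.flatnonzero(selector.get_support()))
print("reduced data shape:", X_top.shape)
```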
Procedia PDF Downloads 273
1837 Extension of Moral Agency to Artificial Agents
Authors: Sofia Quaglia, Carmine Di Martino, Brendan Tierney
Abstract:
Artificial Intelligence (A.I.) permeates various aspects of modern life, from the machine learning algorithms predicting stocks on Wall Street to the killing of belligerents and innocents alike on the battlefield. Moreover, the end goal is to create autonomous A.I.; this means that humans will be absent from the decision-making process. The question comes naturally: when an A.I. does something wrong, when its behavior is harmful to the community and its actions go against the law, who is to be held responsible? This research's subject matter in A.I. and robot ethics focuses mainly on robot rights, and its ultimate objective is to answer the questions: (i) What is the function of rights? (ii) Who is a right holder, what is personhood, and what are the requirements needed to be a moral agent (and therefore accountable for responsibility)? (iii) Can an A.I. be a moral agent (ontological requirements)? And finally, (iv) ought it to be one (ethical implications)? With the aim of answering these questions, this research project was carried out via a collaboration between the School of Computer Science at the Technical University of Dublin, which oversaw the technical aspects of this work, and the Department of Philosophy at the University of Milan, which supervised the philosophical framework and argumentation of the project. Firstly, it was found that all rights are positive and based on consensus; they change with time based on circumstances. Their function is to protect the social fabric and avoid dangerous situations. The same goes for the requirements considered necessary to be a moral agent: they are not absolute; in fact, they are constantly redesigned. Hence, the next logical step was to identify which requirements are regarded as fundamental in real-world judicial systems, comparing them to those used in philosophy. Autonomy, free will, intentionality, consciousness, and responsibility were identified as the requirements for being considered a moral agent. The work went on to build a symmetrical system between personhood and A.I. to enable the ontological differences between the two to emerge. Each requirement is introduced, explained through the most relevant theories of contemporary philosophy, and observed in its manifestation in A.I. Finally, after completing the philosophical and technical analysis, conclusions were drawn. As underlined in the research questions, there are two issues regarding the assignment of moral agency to artificial agents: the first is that all the ontological requirements must be present, and the second, whether they are present or not, is whether an A.I. ought to be considered an artificial moral agent. From an ontological point of view, it is very hard to prove that an A.I. could be autonomous, free, intentional, conscious, and responsible. The philosophical accounts are often very theoretical and inconclusive, making it difficult to fully detect these requirements at an experimental level of demonstration. However, from an ethical point of view, it makes sense to consider some A.I. as artificial moral agents, hence responsible for their own actions. When considering artificial agents as responsible, already existing norms in our judicial system can be applied, such as removing them from society and re-educating them in order to re-introduce them to society. This is in line with how the highest-profile correctional facilities ought to work. Notably, this is a provisional conclusion, and research must continue further. Nevertheless, the strength of the presented argument lies in its immediate applicability to real-world scenarios. To refer to the aforementioned incidents involving the murder of innocents: when this thesis is applied, it is possible to hold an A.I. accountable and responsible for its actions. This implies removing it from society by virtue of its unusability, re-programming it, and, only when it is properly functioning, re-introducing it successfully.
Keywords: artificial agency, correctional system, ethics, natural agency, responsibility
Procedia PDF Downloads 188
1836 A Medical Resource Forecasting Model for Emergency Room Patients with Acute Hepatitis
Authors: R. J. Kuo, W. C. Cheng, W. C. Lien, T. J. Yang
Abstract:
Taiwan is a hyperendemic area for the hepatitis B virus (HBV). The estimated total number of HBsAg carriers in the general population over 20 years old is more than 3 million. Therefore, a case record review was conducted of all patients with a diagnosis of acute hepatitis who were admitted to the Emergency Department (ED) of a well-known teaching hospital from January 2003 to June 2007. The cost of the use of medical resources is defined as the total medical fee. In this study, principal component analysis (PCA) is first employed to reduce the number of dimensions. Support vector regression (SVR) and an artificial neural network (ANN) are then used to develop the forecasting model. A total of 117 patients met the inclusion criteria. 61% of the patients involved in this study were hepatitis B related. The computational results show that the proposed PCA-SVR model performs better than the other compared algorithms. In conclusion, the Child-Pugh score and echogram can both be used to predict the cost of medical resources for patients with acute hepatitis in the ED.
Keywords: acute hepatitis, medical resource cost, artificial neural network, support vector regression
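A minimal sketch of the PCA-SVR combination described above follows, using random stand-ins for the 117 patient records; the feature count, component count, and SVR settings are illustrative assumptions, not the study's configuration.

```python
# PCA followed by support vector regression on mock patient data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(6)
X = rng.normal(size=(117, 12))  # stand-ins for clinical attributes
y = 1000 + 500 * np.abs(X[:, 0]) + rng.normal(scale=50, size=117)  # mock fees

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), PCA(n_components=5), SVR(C=100.0))
model.fit(X_tr, y_tr)
print(f"test R^2: {model.score(X_te, y_te):.3f}")
```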
Procedia PDF Downloads 422