Search results for: legal artificial intelligence
2023 Genetics, Law and Society: Regulating New Genetic Technologies
Authors: Aisling De Paor
Abstract:
Scientific and technological developments are driving genetics and genetic technologies into the public sphere. Scientists are making genetic discoveries as to the make up of the human body and the cause and effect of disease, diversity and disability amongst individuals. Technological innovation in the field of genetics is also advancing, with the development of genetic testing, and other emerging genetic technologies, including gene editing (which offers the potential for genetic modification). In addition to the benefits for medicine, health care and humanity, these genetic advances raise a range of ethical, legal and societal concerns. From an ethical perspective, such advances may, for example, change the concept of humans and what it means to be human. Science may take over in conceptualising human beings, which may push the boundaries of existing human rights. New genetic technologies, particularly gene editing techniques create the potential to stigmatise disability, by highlighting disability or genetic difference as something that should be eliminated or anticipated. From a disability perspective, use (and misuse) of genetic technologies raise concerns about discrimination and violations to the dignity and integrity of the individual. With an acknowledgement of the likely future orientation of genetic science, and in consideration of the intersection of genetics and disability, this paper highlights the main concerns raised as genetic science and technology advances (particularly with gene editing developments), and the consequences for disability and human rights. Through the use of traditional doctrinal legal methodologies, it investigates the use (and potential misuse) of gene editing as creating the potential for a unique form of discrimination and stigmatization to develop, as well as a potential gateway to a form of new, subtle eugenics. This article highlights the need to maintain caution as to the use, application and the consequences of genetic technologies. With a focus on the law and policy position in Europe, it examines the need to control and regulate these new technologies, particularly gene editing. In addition to considering the need for regulation, this paper highlights non-normative approaches to address this area, including awareness raising and education, public discussion and engagement with key stakeholders in the field and the development of a multifaceted genetics advisory network.Keywords: disability, gene-editing, genetics, law, regulation
Procedia PDF Downloads 362
2022 Cement Bond Characteristics of Artificially Fabricated Sandstones
Authors: Ashirgul Kozhagulova, Ainash Shabdirova, Galym Tokazhanov, Minh Nguyen
Abstract:
Synthetic rocks are advantageous over natural rocks in terms of availability and the consistency with which the impact of a particular parameter can be studied. Artificial rocks can be fabricated using a variety of techniques, such as mixing sand with Portland cement or gypsum, firing a mixture of sand and fine borosilicate glass powder, or in-situ precipitation from a calcite solution. In this study, a sodium silicate solution has been used as the cementing agent for quartz sand. The molded soft cylindrical sandstone samples are placed in a gas-tight pressure vessel, where the material hardens as the chemical reaction between carbon dioxide and the silicate solution progresses. The vessel allows uniform dispersion of carbon dioxide and control over the ambient gas pressure. The current paper shows how the bonding material is initially distributed in the intergranular space and on the surface of the sand particles, using Electron Microscopy and Energy Dispersive Spectroscopy. During the study, the strength of the cement bond as a function of temperature is observed. The impact of cementing agent dosage on the micro- and macro-characteristics of the sandstone is investigated. Analysis of the cement bond at the micro level helps to trace changes in particle bonding damage after potential yielding. Shearing behavior and compressional response have been examined, resulting in estimates of the shearing resistance and cohesion force of the sandstone. These are considered the main input values for mathematical prediction models of sand production from weak clastic oil reservoir formations.
Keywords: artificial sandstone, cement bond, microstructure, SEM, triaxial shearing
Procedia PDF Downloads 169
2021 Prisoners for Sexual Offences: Custodial Regime, Prison Experience and Reintegration Interventions
Authors: Nikolaos Koulouris, Anna Kasapoglou, Dimitris Koros
Abstract:
The paper aims to present the course of ongoing research concerning the treatment of pretrial detainees, convicted or released prisoners for sexual offenses, an area that has not received much attention in Greece in terms of the prison experience and the reintegration potentials regarding this specific category of prisoners. The study plan provides for the use of a combination of research methods (focus groups with prisoners, structured individual interviews with prisoners and prison staff). Also, interviews with ex-prisoners detained regarding sexual offenses will take place. In Greece, there are no special provisions for the treatment of sexual offenders in prison, nor are there any special programs in place for their rehabilitation. Sexual offenders are usually separated from other prisoners, as the informal code of the social organization of the prison community dictates, despite no relevant legal framework. The study aims to explore the reasons for the separate detention of sexual offenders and discuss their special (non) treatment from different points of view, namely the legality and legitimacy of this discriminatory practice in terms of prisoners’ protection, safety, stigmatization, and possible social exclusion, as well as their post-release expectations and social reintegration potentials. The purpose of the research is the exploration of the prison experience of sexual offenders, the exercise of their legal rights, their adjustment to the demands of social life in prison, as well as the role of prison officers and various interventions aiming to their preparation for reentry to society. The study will take into consideration the European and international prison/penitentiary standards and best practices in order to examine the issue comparatively, while the contribution of the United Nations and the Council of Europe and its standards will be used to assess the treatment of sexual offenders in terms of its compatibility to international and European model-rules and trends. The outcome will be utilized to form main directions and propositions for a coherent and consistent human rights-based and social integration-oriented penal policy regarding the treatment of persons accused or convicted of sexual offenses in Greece.Keywords: prisoners’ treatment, sex offenders, social exclusion, social reintegration
Procedia PDF Downloads 156
2020 Examining the Factors Impeding the Preservation of African Architectural Heritage
Authors: Okafor Calistus Chibuzor
Abstract:
Preserving African architectural heritage is a multifaceted endeavor that intersects with socio-cultural, economic, and environmental factors. Despite growing recognition of the importance of safeguarding these invaluable cultural assets, numerous challenges persist, hindering effective preservation efforts across the continent. This paper investigates the underlying factors impeding the preservation of African architectural heritage, aiming to provide insights for addressing this critical issue. The study begins with an exploration of the historical background and significance of African architectural heritage, highlighting its rich diversity and cultural significance. The study acknowledges that there is an urgent need to address the threats facing these heritage sites, including urbanization, rapid development, lack of funding, inadequate legal protection, and insufficient public awareness. The primary aim of this research is to identify and analyze the key factors contributing to the deterioration and loss of African architectural heritage, with the objective of formulating strategies to mitigate these challenges. A mixed-use research methodology combining archival research, field surveys, stakeholder interviews, and case studies is employed to gather comprehensive data and insights. The findings reveal a complex interplay of socio-economic, political, and institutional factors shaping the preservation landscape in Africa, including issues related to funding, governance, community engagement, and capacity building. The paper concludes by highlighting the urgent need for coordinated efforts among government agencies, heritage organizations, local communities, and international stakeholders to address the identified challenges and develop sustainable preservation strategies. Recommendations are provided for enhancing legal frameworks, promoting community involvement, fostering public awareness, and mobilizing resources to safeguard Africa's rich architectural heritage for future generations.Keywords: African architectural heritage, preservation challenges, preservation strategies, factors
Procedia PDF Downloads 63
2019 Vertical Urbanization Over Public Structures: The Example of Mostar Junction in Belgrade, Serbia
Authors: Sladjana Popovic
Abstract:
The concept of vertical space urbanization, defined in English as "air rights development," can be considered a mechanism for the development of public spaces in urban areas of high density. A chronological overview of the transformation of space within the vertical projection of the existing traffic infrastructure that penetrates through the central areas of a city is given in this paper through the analysis of two illustrative case studies: more advanced and recent - "Plot 13" in Boston, and less well-known European example of structures erected above highways throughout Italy - the "Pavesi auto grill" chain. The backbone of this analysis is the examination of the possibility of yielding air rights within the vertical projection of public structures in the two examples by considering the factors that would enable its potential application in capitals in Southeastern Europe. The cession of air rights in the Southeastern Europe region, as a phenomenon, has not been a recognized practice in urban planning. In a formal sense, legal and physical feasibility can be seen to some extent in local models of structures built above protected historical heritage (i.e., archaeological sites); however, the mechanisms of the legal process of assigning the right to use and develop air rights above public structures is not a recognized concept. The goal of the analysis is to shed light on the influence of institutional participants in the implementation of innovative solutions for vertical urbanization, as well as strategic planning mechanisms in public-private partnership models that would enable the implementation of the concept in the region. The main question is whether the manipulation of the vertical projection of space could provide for innovative urban solutions that overcome the deficit and excessive use of the available construction land, particularly above the dominant public spaces and traffic infrastructure that penetrate central parts of a city. Conclusions reflect upon vertical urbanization that can bridge the spatial separation of the city, reduce noise pollution and contribute to more efficient urban planning along main transportation corridors.Keywords: air rights development, innovative urbanism, public-private partnership, transport infrastructure, vertical urbanization
Procedia PDF Downloads 77
2018 The Effects of Computer Game-Based Pedagogy on Graduate Students Statistics Performance
Authors: Clement Yeboah, Eva Laryea
Abstract:
A pretest-posttest within subjects experimental design was employed to examine the effects of a computerized basic statistics learning game on achievement and statistics-related anxiety of students enrolled in introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at state-funded research university in the Southeast United States. We analyzed pre-test posttest differences using paired samples t-tests for achievement and for statistics anxiety. The results of the t-test for knowledge in statistics were found to be statistically significant, indicating significant mean gains for statistical knowledge as a function of the game-based intervention. Likewise, the results of the t-test for statistics-related anxiety were also statistically significant, indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop an interest in the subject matter and spend quality time to learn the course as they play the game without knowing that they are even learning the presupposed hard course. The future directions of the present study are promising as technology continues to advance and become more widely available. Some potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way basic statistics graduate students learn and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers and will continue to be a dynamic and rapidly evolving field for years to come.Keywords: pretest-posttest within subjects, computer game-based learning, statistics achievement, statistics anxiety
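A minimal sketch of the paired-samples analysis described in this abstract, written in Python with SciPy; the score arrays, group size and variable names are hypothetical placeholders rather than the study's data:

```python
# Paired-samples t-tests for achievement and statistics anxiety (illustrative sketch).
# The score arrays below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

pre_achievement = np.array([55, 60, 48, 62, 70, 58, 65, 52])
post_achievement = np.array([68, 72, 55, 70, 78, 66, 74, 61])

pre_anxiety = np.array([40, 35, 50, 42, 38, 45, 48, 36])
post_anxiety = np.array([32, 30, 44, 35, 33, 40, 41, 31])

def paired_report(pre, post, label):
    t, p = stats.ttest_rel(post, pre)          # paired-samples t-test
    diff = post - pre
    d = diff.mean() / diff.std(ddof=1)         # Cohen's d for paired designs
    print(f"{label}: t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")

paired_report(pre_achievement, post_achievement, "Statistics achievement")
paired_report(pre_anxiety, post_anxiety, "Statistics anxiety")
```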
Procedia PDF Downloads 78
2017 The Role of Lifetime Stress in the Relation between Socioeconomic Status and Health-Risk Behaviors
Authors: Teresa Smith, Farrah Jacquez
Abstract:
Health-risk behaviors (e.g., smoking, poor diet) directly increase the risk for chronic disease and morbidity. There is substantial evidence of a negative association between socioeconomic status (SES) and engagement in health-risk behaviors. However, due to the complexity of SES, researchers have suggested looking beyond this factor to fully understand the mechanisms that underlie engagement in health-risk behaviors. Stress is one plausible mechanism through which SES impacts health-risk behaviors. Currently, it remains unclear how stress occurring across the life course might impact health behaviors and explain the association between SES and these behaviors. To address the gaps in the literature, 172 adults between the ages of 18-49 were surveyed about their lifetime stress exposure, sociodemographic variables, and health-risk behaviors via an online recruitment portal, Prolific. Five major findings emerged from the current study. First, SES was negatively associated with engagement in health-risk behaviors and lifetime stress above and beyond current stress and other relevant demographics. Second, lifetime stress was significantly associated with health-risk behaviors above and beyond current stress and relevant demographic variables. Third, lifetime stress fully mediated the association between SES and health-risk behaviors above and beyond current stress and other demographics. Fourth, the severity of stress experienced emerged as the most significant lifetime stress variable that explains the relation between SES and health-risk behaviors. Fifth and finally, lower SES and experiencing financial and legal/crime stressors increased the likelihood of engaging in health-risk behaviors. The current study results align with previous research and suggest that stress occurring over the lifespan impacts the relation between SES and health-risk behaviors, which are in turn known to impact health outcomes. However, our findings move the current literature forward by providing a more nuanced understanding of the specific aspects of stress that influence this association. Specifically, the severity of stress experienced across the entire lifespan was the most important aspect of stress when examining the association between SES and health-risk behaviors. Further, individuals most at risk for engaging in health-risk behaviors are those of the lowest SES and experience financial and legal/crime stressors. These findings have the potential to inform interventions and policies aimed at addressing health-risk behaviors by providing a more sophisticated understanding of the impact of stress.Keywords: stress, health behaviors, socioeconomic status, health
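The mediation logic sketched in this abstract can be illustrated with a simplified Baron-Kenny-style set of regressions; the sketch below uses synthetic data and hypothetical variable names, not the study's survey measures or its actual mediation model:

```python
# Simplified mediation check: SES -> lifetime stress -> health-risk behaviors.
# The DataFrame, column names and effect sizes are placeholders for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 172
ses = rng.normal(size=n)
stress = -0.5 * ses + rng.normal(size=n)                 # lower SES -> more lifetime stress
risk = -0.2 * ses + 0.6 * stress + rng.normal(size=n)
df = pd.DataFrame({"ses": ses, "lifetime_stress": stress, "risk_behaviors": risk})

total = smf.ols("risk_behaviors ~ ses", data=df).fit()                    # path c
a_path = smf.ols("lifetime_stress ~ ses", data=df).fit()                  # path a
full = smf.ols("risk_behaviors ~ ses + lifetime_stress", data=df).fit()   # paths b and c'

indirect = a_path.params["ses"] * full.params["lifetime_stress"]
print(f"total effect c  = {total.params['ses']:.3f}")
print(f"direct effect c' = {full.params['ses']:.3f}, indirect a*b = {indirect:.3f}")
```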
Procedia PDF Downloads 147
2016 Evolution of Web Development Progress in Modern Information Technology
Authors: Abdul Basit Kiani
Abstract:
Web development, the art of creating and maintaining websites, has witnessed remarkable advancements. The aim is to provide an overview of some of the cutting-edge developments in the field. Firstly, the rise of responsive web design has revolutionized user experiences across devices. With the increasing prevalence of smartphones and tablets, web developers have adapted to ensure seamless browsing experiences, regardless of screen size. This progress has greatly enhanced accessibility and usability, catering to the diverse needs of users worldwide. Additionally, the evolution of web frameworks and libraries has significantly streamlined the development process. Tools such as React, Angular, and Vue.js have empowered developers to build dynamic and interactive web applications with ease. These frameworks not only enhance efficiency but also bolster scalability, allowing for the creation of complex and feature-rich web solutions. Furthermore, the emergence of progressive web applications (PWAs) has bridged the gap between native mobile apps and web development. PWAs leverage modern web technologies to deliver app-like experiences, including offline functionality, push notifications, and seamless installation. This innovation has transformed the way users interact with websites, blurring the boundaries between traditional web and mobile applications. Moreover, the integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. With the increasing incidents of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. Hence, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.Keywords: progressive web applications (PWAs), web security, machine learning (ML), web frameworks, advancement responsive web design
Procedia PDF Downloads 54
2015 Multidisciplinary Approach to Mio-Plio-Quaternary Aquifer Study in the Zarzis Region (Southeastern Tunisia)
Authors: Ghada Ben Brahim, Aicha El Rabia, Mohamed Hedi Inoubli
Abstract:
Climate change has exacerbated disparities in the distribution of water resources in Tunisia, resulting in significant degradation in quantity and quality over the past five decades. The Mio-Plio-Quaternary aquifer, the primary water source in the Zarzis region, is subject to climatic, geographical, and geological challenges, as well as human stress. The region is experiencing uneven distribution and growing threats from groundwater salinity and saltwater intrusion. Addressing this challenge is critical for the arid region’s socioeconomic development, and effective water resource management is required to combat climate change and reduce water deficits. This study uses a multidisciplinary approach to determine the groundwater potential of this aquifer, involving geophysics and hydrogeology data analysis. We used advanced techniques such as 3D Euler deconvolution and power spectrum analysis to generate detailed anomaly maps and estimate the depths of density sources, identifying significant Bouguer anomalies trending E-W, NW-SE, and NE-SW. Various techniques, such as wavelength filtering, upward continuation, and horizontal and vertical derivatives, were used to improve the gravity data, resulting in consistent results for anomaly shapes and amplitudes. The Euler deconvolution method revealed two prominent surface faults, trending NE-SW and NW-SE, that have a significant impact on the distribution of sedimentary facies and water quality within the Mio-Plio-Quaternary aquifer. Additionally, depth maxima greater than 1400 m to the North indicate the presence of a Cretaceous paleo-fault. Geoelectrical models and resistivity pseudo-sections were used to interpret the distribution of electrical facies in the Mio-Plio-Quaternary aquifer, highlighting lateral variation and depositional environment type. AI optimises the analysis and interpretation of exploration data, which is important to long-term management and water security. Machine learning algorithms and deep learning models analyse large datasets to provide precise interpretations of subsurface conditions, such as aquifer salinisation. However, AI has limitations, such as the requirement for large datasets, the risk of overfitting, and integration issues with traditional geological methods.Keywords: mio-plio-quaternary aquifer, Southeastern Tunisia, geophysical methods, hydrogeological analysis, artificial intelligence
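As a rough illustration of the spectral depth-estimation step mentioned above, the sketch below computes a radially averaged power spectrum of a gridded anomaly and reads a mean source depth from its low-wavenumber slope (Spector-Grant-type relation ln E(k) ~ const - 2hk for angular wavenumber k); the grid, spacing and fitting band are assumptions, not the survey data:

```python
# Radially averaged power-spectrum depth estimate for gridded gravity data (sketch).
# The smoothed random grid is a stand-in for a Bouguer anomaly map; spacing is assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

dx = 0.5                                                   # grid spacing in km (assumed)
grid = gaussian_filter(np.random.default_rng(1).normal(size=(128, 128)), sigma=4)

F = np.fft.fft2(grid)
power = np.abs(F) ** 2
kx = 2 * np.pi * np.fft.fftfreq(grid.shape[1], d=dx)       # angular wavenumber, rad/km
ky = 2 * np.pi * np.fft.fftfreq(grid.shape[0], d=dx)
k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)

# Radial averaging of the power spectrum in wavenumber bins
bins = np.linspace(k[k > 0].min(), k.max() / 2, 40)
centers = 0.5 * (bins[:-1] + bins[1:])
ln_p = np.array([np.log(power[(k >= lo) & (k < hi)].mean())
                 for lo, hi in zip(bins[:-1], bins[1:])])

# Fit the low-wavenumber segment; with ln E(k) ~ const - 2*h*k, mean depth h = -slope/2
low = centers < centers[len(centers) // 3]
slope, _ = np.polyfit(centers[low], ln_p[low], 1)
print(f"estimated mean depth to density sources: {-slope / 2:.2f} km")
```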
Procedia PDF Downloads 20
2014 The Use of Religious Symbols in the Workplace: Remarks on the Latest Case Law
Authors: Susana Sousa Machado
Abstract:
The debate on the use of religious symbols has been highlighted in modern societies, especially in the field of labour relationships. As litigiousness appears to be growing, the matter requires a careful study from a legal perspective. In this context, a description and critical analysis of the most recent case law is conducted regarding the use of symbols by the employee in the workplace, delivered both by the European Court of Human Rights and by the Court of Justice of the European Union. From this comparative analysis we highlight the most relevant aspects in order to seek a common core regarding the juridical-argumentative approach of case law.Keywords: religion, religious symbols, workplace, discrimination
Procedia PDF Downloads 421
2013 The Marriage of a Sui Juris Girl: Permission of Wali (Guardian) or Consent of Ward in the Context of Personal Law in Pakistan
Authors: Muhammad Farooq
Abstract:
The present article explores the woman's consent as a paramount element in contracting a Muslim marriage, and asks whether permission of the wali (guardian) is a condition per se for a valid nikah (marriage deed) in the eyes of law and Sharia. The researcher approaches the question through the related issues, inter alia: the marriage guardian; the woman's legal capacity to give consent, whether she is a virgin or non-virgin; and how that consent is to be given or may be understood. Do her laughter, tears or silence need a legal interpretation, as do other female manifestations of emotion explained by the Muslim jurists? The silence of the Muslim Family Law Ordinance 1961 (hereafter MFLO 1961) in this regard, and the likely reasons behind such silence, is also examined in brief. Germane to the theme, the various cases in which the true notion of a woman's consent has been interpreted by courts in Pakistan are also examined. In order to address the issue at hand, a brief overview of a few contemporary writers' opinions is provided, in which the real place of a woman's consent in Muslim marriage is highlighted. Key to the idea of a young Muslim woman's marriage, the doctrine of kafa'a (equality or suitability) between the man and the woman is argued here to be grounded in patriarchal and social norms. It is, therefore, concluded that this concept was the result of analogical reasoning and carries less importance in the present time; as such, it is not a valid factor in current scenarios to validate or invalidate marital bonds. A standard qualitative approach is used for this research: primary and secondary sources, for example the Qur'an, the Sunnah, books, scholarly articles, texts of law and case law, are used to support the researcher's view. In summation, the article concludes with a bold statement that a young woman, being a party to the contract, is absolutely entitled to 'full and free' consent for the Muslim marriage contract. It is the woman, an indispensable partaker, and her consent (not the guardian's permission) that validates or invalidates the said agreement in the eyes of contemporary personal law and of Sharia.
Keywords: consent of woman, ejab (declaration), Nikah (marriage agreement), qabol (acceptance), sui juris (of age; independent), wali (guardian), wilayah (guardianship)
Procedia PDF Downloads 138
2012 The Urgenda and Juliana Cases: Redefining the Notion of Environmental Democracy
Authors: Valentina Dotto
Abstract:
Climate change cases used to take the form of statutory disputes rather than constitutional or common law disputes. This changed in 2015, with the Urgenda Climate case in the Netherlands (Urgenda Foundation v. The State of the Netherlands, C/09/456689/HAZA 13-1396) and, the Juliana case in the U.S. (United States v. U.S. District Court for District of Oregon, 17-71692, 9th Cir.). The two cases represent a new type of climate litigation, the claims brought against the federal government were in fact grounded in constitutional rights. The complaints used the Doctrine of Public Trust as a cornerstone for the lawsuits asserting that government's actions against climate change failed to protect essential public trust resources; thus, violating a generation's constitutional rights to life, liberty, and property. The Public Trust Doctrine –a quintessentially American legal concept-, reserved to the States by virtue of the 9th and 10th amendment of the federal Constitution, gives them considerable jurisdiction over natural resources and has been refined by a number of Supreme Court rulings. The Juliana case exemplifies the Doctrine’s evolutionary nature because it attempts to apply it to the federal government, and establish a right to a climate system capable of sustaining human life as a fundamental right protected by a substantive due process. Furthermore, the flexibility of the Doctrine makes it permissible to be applied to a variety of different legal systems as in the Urgenda case. At the very heart of the lawsuits stands the question of who owns the Earth resources and, to what extent the general public can claim the services that the Earth provides as common property. By employing the widest possible definition of the Doctrine of Public Trust these lawsuits tried to redefine environmental resources as a collective right of all people. By doing case analysis, the paper explores how these cases can contribute to widening the public access to information and broadening the public voice in decision making as well as providing a precedent to equal access in seeking justice and redress from environmental failures.Keywords: climate change, doctrine of public trust, environmental democracy, Juliana case, Urgenda climate case
Procedia PDF Downloads 175
2011 The Case for Reparations: Systemic Injustice and Human Rights in the United States
Authors: Journey Whitfield
Abstract:
This study investigates the United States' ongoing violation of Black Americans' fundamental human rights, as evidenced by mass incarceration, social injustice, and economic deprivation. It argues that the U.S. contravenes Article 9 of the International Covenant on Civil and Political Rights through policies that uphold systemic racism. The analysis dissects current practices within the criminal justice system, social welfare programs, and economic policy, uncovering the racially disparate impacts of seemingly race-neutral policies. This study establishes a clear lineage between past systems of oppression – slavery and Jim Crow – and present-day racial disparities, demonstrating their inextricable link. The thesis proposes that only a comprehensive reparations program for Black Americans can begin to redress these systemic injustices. This program must transcend mere financial compensation, demanding structural reforms within U.S. institutions to dismantle systemic racism and promote transformative justice. This study explores potential forms of reparations, drawing upon historical precedents, comparative case studies from other nations, and contemporary debates within political philosophy and legal studies. The research employs both qualitative and quantitative methods. Qualitative methods include historical analysis of legal frameworks and policy documents, as well as discourse analysis of political rhetoric. Quantitative methods involve statistical analysis of socioeconomic data and criminal justice outcomes to expose racial disparities. This study makes a significant contribution to the existing literature on reparations, human rights, and racial injustice in the United States. It offers a rigorous analysis of the enduring consequences of historical oppression and advocates for bold, justice-centered solutions.Keywords: Black Americans, reparations, mass incarceration, racial injustice, human rights, united states
Procedia PDF Downloads 59
2010 Improving Pneumatic Artificial Muscle Performance Using Surrogate Model: Roles of Operating Pressure and Tube Diameter
Authors: Van-Thanh Ho, Jaiyoung Ryu
Abstract:
In soft robotics, the optimization of fluid dynamics through pneumatic methods plays a pivotal role in enhancing operational efficiency and reducing energy loss. This is particularly crucial when replacing conventional techniques such as cable-driven electromechanical systems. The pneumatic model employed in this study represents a sophisticated framework designed to efficiently channel pressure from a high-pressure reservoir to various muscle locations on the robot's body. This intricate network involves a branching system of tubes. The study introduces a comprehensive pneumatic model encompassing the components of a reservoir, tubes, and Pneumatically Actuated Muscles (PAM). The development of this model is rooted in the principles of shock tube theory. Notably, the study leverages experimental data to enhance the understanding of the interplay between the PAM structure and the surrounding fluid. This improved interactive approach involves the use of morphing motion, guided by a contraction function. The study's findings demonstrate a high degree of accuracy in predicting pressure distribution within the PAM: the error relative to experimental data remains below a threshold of 10%. Additionally, the research employs a machine learning model, specifically a surrogate model based on the Kriging method, to assess and quantify uncertainty factors related to the initial reservoir pressure and tube diameter. This comprehensive approach enhances our understanding of pneumatic soft robotics and its potential for improved operational efficiency.
Keywords: pneumatic artificial muscles, pressure drop, morphing motion, branched network, surrogate model
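A minimal sketch of a Kriging-type surrogate over the two uncertain inputs named above (initial reservoir pressure and tube diameter), here built with scikit-learn's Gaussian-process regressor; the design points, input ranges and response function are synthetic stand-ins for the paper's simulation data:

```python
# Kriging (Gaussian-process) surrogate of a PAM response vs. reservoir pressure and tube diameter.
# Design points and the response function are synthetic stand-ins, not the paper's data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(2.0, 8.0, 40),     # initial reservoir pressure [bar] (assumed range)
    rng.uniform(2.0, 10.0, 40),    # tube diameter [mm] (assumed range)
])
y = 0.4 * X[:, 0] - 0.02 * X[:, 0] * X[:, 1] + rng.normal(0, 0.05, 40)  # placeholder response

kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 1.0])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

x_new = np.array([[5.0, 6.0]])
mean, std = gp.predict(x_new, return_std=True)
print(f"predicted response: {mean[0]:.3f} +/- {std[0]:.3f}")
```

The predictive standard deviation returned by the surrogate is what such a model typically uses to quantify uncertainty across the input space.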
Procedia PDF Downloads 100
2009 Using Chatbots to Create Situational Content for Coursework
Authors: B. Bricklin Zeff
Abstract:
This research explores the development and application of a specialized chatbot tailored for a nursing English course, with a primary objective of augmenting student engagement through situational content and responsiveness to key expressions and vocabulary. Introducing the chatbot, elucidating its purpose, and outlining its functionality are crucial initial steps in the research study, as they provide a comprehensive foundation for understanding the design and objectives of the specialized chatbot developed for the nursing English course. These elements establish the context for subsequent evaluations and analyses, enabling a nuanced exploration of the chatbot's impact on student engagement and language learning within the nursing education domain. The subsequent exploration of the intricate language model development process underscores the fusion of scientific methodologies and artistic considerations in this application of artificial intelligence (AI). Tailored for educators and curriculum developers in nursing, practical principles extending beyond AI and education are considered. Some insights into leveraging technology for enhanced language learning in specialized fields are addressed, with potential applications of similar chatbots in other professional English courses. The overarching vision is to illuminate how AI can transform language learning, rendering it more interactive and contextually relevant. The presented chatbot is a tangible example, equipping educators with a practical tool to enhance their teaching practices. Methodologies employed in this research encompass surveys and discussions to gather feedback on the chatbot's usability, effectiveness, and potential improvements. The chatbot system was integrated into a nursing English course, facilitating the collection of valuable feedback from participants. Significant findings from the study underscore the chatbot's effectiveness in encouraging more verbal practice of target expressions and vocabulary necessary for performance in role-play assessment strategies. This outcome emphasizes the practical implications of integrating AI into language education in specialized fields. This research holds significance for educators and curriculum developers in the nursing field, offering insights into integrating technology for enhanced English language learning. The study's major findings contribute valuable perspectives on the practical impact of the chatbot on student interaction and verbal practice. Ultimately, the research sheds light on the transformative potential of AI in making language learning more interactive and contextually relevant, particularly within specialized domains like nursing.Keywords: chatbot, nursing, pragmatics, role-play, AI
Procedia PDF Downloads 67
2008 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0
Authors: Harris Niavis, Dimitra Politaki
Abstract:
Thanks to advances in the ICT sector, the manufacturing industry has been collecting vast amounts of data for monitoring product quality, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. At the same time, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion into the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism uniquely combines artificial intelligence models to effectively detect unusual values such as outliers and extreme deviations in the incoming data. Specifically, Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM) and dense autoencoders, as well as Generative Adversarial Network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors. The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs that expose the functionality to end-users. The results of this work demonstrate that such a system can increase the quality of the end products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify quality records throughout the entire production chain and to take advantage of the multitude of monitoring records in their databases.
Keywords: blockchain, data quality, Industry 4.0, product quality
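One of the detectors named above, a dense autoencoder for point anomalies, can be sketched as follows; the sensor windows, layer sizes and the 99th-percentile threshold are illustrative assumptions rather than the system's actual configuration:

```python
# Dense autoencoder for point-anomaly screening of sensor measurements before on-chain ingestion.
# Data, layer sizes and the 99th-percentile threshold are illustrative assumptions.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
normal_windows = rng.normal(0.0, 1.0, size=(5000, 16))     # stand-in for healthy sensor windows

model = keras.Sequential([
    keras.layers.Input(shape=(16,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(4, activation="relu"),               # bottleneck
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(16, activation="linear"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(normal_windows, normal_windows, epochs=10, batch_size=64, verbose=0)

recon = model.predict(normal_windows, verbose=0)
errors = np.mean((normal_windows - recon) ** 2, axis=1)
threshold = np.percentile(errors, 99)                       # flag the top 1% as anomalous

new_window = rng.normal(3.0, 1.0, size=(1, 16))             # a deviating measurement window
score = np.mean((new_window - model.predict(new_window, verbose=0)) ** 2)
print("anomalous" if score > threshold else "normal",
      f"(score={score:.3f}, threshold={threshold:.3f})")
```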
Procedia PDF Downloads 191
2007 Thermal Properties of Date Palm Wood
Authors: K. Almi, S. Lakel, A. Benchabane, A. Kriker
Abstract:
Much research is focused on natural resources for the production of biomaterials intended for technical applications. Date palm wood represents one of the world's most important natural resources. Its use as an insulating material will help to solve the severe environmental and recycling problems that other, artificial insulating materials cause. This paper reports the results of an experimental investigation of the thermal properties of date palm wood from Algeria. A study of its physical, chemical, and mechanical properties is also carried out. The goal is to use this natural material in the manufacture of thermal insulation materials for buildings. The local natural resources used in this study are date palm fibers from the Biskra oasis in Algeria. The results show that there is no significant difference in the morphological properties of the four types of residues. Their chemical composition differed slightly, with the lowest cellulose and lignin contents belonging to the Petiole. A water absorption study showed that the Rachis has a low sorption value, whereas the Petiole and Fibrillium have high sorption values, which influenced their mechanical properties. The Rachis and Leaflets exhibit high tensile strength values compared to the other residues. On the other hand, the low bulk density of the Petiole and Fibrillium leads to high values of specific tensile strength and Young's modulus. It was found that the specific Young's modulus of the Petiole and Fibrillium was higher than that of the Rachis and Leaflets, and than that of other natural fibers or even artificial fibers. Compared to other materials, date palm wood provides good thermal properties; thus, date palm wood will be a good candidate for manufacturing efficient and safe insulating materials.
Keywords: composite materials, date palm fiber, natural fibers, tensile tests, thermal properties
Procedia PDF Downloads 297
2006 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the widely used methods to achieve authentication for legal users of computers and defense against attackers. There are many different ways to authenticate users of a system and there are many password cracking methods also developed. This paper is mainly to propose how best password cracking can be performed on a CPU-GPGPU based system. The main objective of this work is to project how quickly a password can be cracked with some knowledge about the computer security and password cracking if sufficient security is not incorporated to the system.Keywords: GPGPU, password cracking, secret key, user authentication
Procedia PDF Downloads 292
2005 The Effects of Computer Game-Based Pedagogy on Graduate Students Statistics Performance
Authors: Eva Laryea, Clement Yeboah
Abstract:
A pretest-posttest within subjects, experimental design was employed to examine the effects of a computerized basic statistics learning game on achievement and statistics-related anxiety of students enrolled in introductory graduate statistics course. Participants (N = 34) were graduate students in a variety of programs at state-funded research university in the Southeast United States. We analyzed pre-test posttest differences using paired samples t-tests for achievement and for statistics anxiety. The results of the t-test for knowledge in statistics were found to be statistically significant indicating significant mean gains for statistical knowledge as a function of the game-based intervention. Likewise, the results of the t-test for statistics-related anxiety were also statistically significant indicating a decrease in anxiety from pretest to posttest. The implications of the present study are significant for both teachers and students. For teachers, using computer games developed by the researchers can help to create a more dynamic and engaging classroom environment, as well as improve student learning outcomes. For students, playing these educational games can help to develop important skills such as problem solving, critical thinking, and collaboration. Students can develop interest in the subject matter and spend quality time to learn the course as they play the game without knowing that they are even learning the presupposed hard course. The future directions of the present study are promising, as technology continues to advance and become more widely available. Some potential future developments include the integration of virtual and augmented reality into educational games, the use of machine learning and artificial intelligence to create personalized learning experiences, and the development of new and innovative game-based assessment tools. It is also important to consider the ethical implications of computer game-based pedagogy, such as the potential for games to perpetuate harmful stereotypes and biases. As the field continues to evolve, it will be crucial to address these issues and work towards creating inclusive and equitable learning experiences for all students. This study has the potential to revolutionize the way basic statistics graduate students learn and offers exciting opportunities for future development and research. It is an important area of inquiry for educators, researchers, and policymakers, and will continue to be a dynamic and rapidly evolving field for years to come.Keywords: pretest-posttest within subjects, experimental design, achievement, statistics-related anxiety
Procedia PDF Downloads 59
2004 Artificial Neural Network Modeling of a Closed Loop Pulsating Heat Pipe
Authors: Vipul M. Patel, Hemantkumar B. Mehta
Abstract:
Technological innovations in electronic world demand novel, compact, simple in design, less costly and effective heat transfer devices. Closed Loop Pulsating Heat Pipe (CLPHP) is a passive phase change heat transfer device and has potential to transfer heat quickly and efficiently from source to sink. Thermal performance of a CLPHP is governed by various parameters such as number of U-turns, orientations, input heat, working fluids and filling ratio. The present paper is an attempt to predict the thermal performance of a CLPHP using Artificial Neural Network (ANN). Filling ratio and heat input are considered as input parameters while thermal resistance is set as target parameter. Types of neural networks considered in the present paper are radial basis, generalized regression, linear layer, cascade forward back propagation, feed forward back propagation; feed forward distributed time delay, layer recurrent and Elman back propagation. Linear, logistic sigmoid, tangent sigmoid and Radial Basis Gaussian Function are used as transfer functions. Prediction accuracy is measured based on the experimental data reported by the researchers in open literature as a function of Mean Absolute Relative Deviation (MARD). The prediction of a generalized regression ANN model with spread constant of 4.8 is found in agreement with the experimental data for MARD in the range of ±1.81%.Keywords: ANN models, CLPHP, filling ratio, generalized regression, spread constant
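A generalized regression neural network can be written as Gaussian-kernel-weighted (Nadaraya-Watson) regression, which makes the role of the spread constant explicit; the sketch below also computes the MARD metric. The training points and input ranges are invented for illustration, and only the spread value of 4.8 is taken from the abstract:

```python
# GRNN-style (Nadaraya-Watson) prediction of CLPHP thermal resistance, with the MARD metric.
# Training points and input ranges are invented; only the spread of 4.8 comes from the abstract.
import numpy as np

def grnn_predict(X_train, y_train, X_query, spread):
    """Generalized-regression-style estimate: Gaussian-weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * spread ** 2))
    return (w @ y_train) / w.sum(axis=1)

def mard(y_true, y_pred):
    """Mean Absolute Relative Deviation in percent."""
    return 100.0 * np.mean(np.abs((y_pred - y_true) / y_true))

rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(20, 90, 60),      # filling ratio [%] (assumed range)
                     rng.uniform(10, 100, 60)])    # heat input [W] (assumed range)
R = 2.0 - 0.01 * X[:, 1] + 0.002 * X[:, 0] + rng.normal(0, 0.02, 60)  # placeholder resistance

R_hat = grnn_predict(X, R, X, spread=4.8)
print(f"MARD on the training points: {mard(R, R_hat):.2f}%")
```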
Procedia PDF Downloads 293
2003 A Survey on Intelligent Traffic Management with Cooperative Driving in Urban Roads
Authors: B. Karabuluter, O. Karaduman
Abstract:
Traffic management and traffic planning are important issues, especially in big cities. Due to the increase of personal vehicles and the physical constraints of urban roads, the problem of transportation especially in crowded cities over time is revealed. This situation reduces the living standards, and it can put human life at risk because the vehicles such as ambulance, fire department are prevented from reaching their targets. Even if the city planners take these problems into account, emergency planning and traffic management are needed to avoid cases such as traffic congestion, intersections, traffic jams caused by traffic accidents or roadworks. In this study, in smart traffic management issues, proposed solutions using intelligent vehicles acting in cooperation with urban roads are examined. Traffic management is becoming more difficult due to factors such as fatigue, carelessness, sleeplessness, social behavior patterns, and lack of education. However, autonomous vehicles, which remove the problems caused by human weaknesses by providing driving control, are increasing the success of practicing the algorithms developed in city traffic management. Such intelligent vehicles have become an important solution in urban life by using 'swarm intelligence' algorithms and cooperative driving methods to provide traffic flow, prevent traffic accidents, and increase living standards. In this study, studies conducted in this area have been dealt with in terms of traffic jam, intersections, regulation of traffic flow, signaling, prevention of traffic accidents, cooperation and communication techniques of vehicles, fleet management, transportation of emergency vehicles. From these concepts, some taxonomies were made out of the way. This work helps to develop new solutions and algorithms for cities where intelligent vehicles that can perform cooperative driving can take place, and at the same time emphasize the trend in this area.Keywords: intelligent traffic management, cooperative driving, smart driving, urban road, swarm intelligence, connected vehicles
Procedia PDF Downloads 332
2002 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics
Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee
Abstract:
Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucinations": the generation of outputs that are not grounded in the input data, which hinders their adoption into production. A common practice to mitigate the hallucination problem is to use a Retrieval Augmented Generation (RAG) system to ground LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, then generates a response based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, a RAG system is not suitable for tabular data and subsequent data analysis tasks for multiple reasons, such as information loss, data format, and the retrieval mechanism. In this study, we explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When deployed as a beta version on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market insight and data visualization needs with high accuracy and extensive coverage by abstracting the complexities for real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding and enhancement without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru
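The planning-and-execution / code-generation loop described above has roughly the following shape; this is a pattern sketch only, with `call_llm` as a placeholder for any chat-completion client, and without the sandboxing and prompt engineering a production system would require:

```python
# Pattern sketch of a planning-and-execution + code-generation agent over tabular data.
# `call_llm` is a placeholder for an LLM client; the prompts and the exec step are simplified.
import io
import contextlib
import pandas as pd

def call_llm(prompt: str) -> str:
    """Placeholder for a chat-completion client; expected to return plain text."""
    raise NotImplementedError("plug in your LLM client here")

def analyze(question: str, df: pd.DataFrame) -> str:
    # 1. Planning: dissect the analytical question into simpler sub-tasks.
    plan = call_llm("Break this analysis request into numbered sub-tasks:\n"
                    f"{question}\nColumns available: {list(df.columns)}")
    results = []
    for sub_task in [s for s in plan.splitlines() if s.strip()]:
        # 2. Code generation: turn each sub-task into an executable pandas snippet.
        code = call_llm(f"Write Python (pandas) code for: {sub_task}\n"
                        "A DataFrame named `df` is already loaded. Print the result.")
        # 3. Execution: run the snippet and capture its printed output.
        buf = io.StringIO()
        with contextlib.redirect_stdout(buf):
            exec(code, {"df": df, "pd": pd})   # a production system would sandbox this step
        results.append(f"{sub_task}\n{buf.getvalue()}")
    # 4. Synthesis: compose the final response from the executed outputs.
    return call_llm("Write the final answer using these intermediate results:\n"
                    + "\n".join(results))
```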
Procedia PDF Downloads 88
2001 Servitization in Machine and Plant Engineering: Leveraging Generative AI for Effective Product Portfolio Management Amidst Disruptive Innovations
Authors: Till Gramberg
Abstract:
In the dynamic world of machine and plant engineering, stagnation in the growth of new product sales compels companies to reconsider their business models. The increasing shift toward service orientation, known as "servitization," along with challenges posed by digitalization and sustainability, necessitates an adaptation of product portfolio management (PPM). Against this backdrop, this study investigates the current challenges and requirements of PPM in this industrial context and develops a framework for the application of generative artificial intelligence (AI) to enhance agility and efficiency in PPM processes. The research approach of this study is based on a mixed-method design. Initially, qualitative interviews with industry experts were conducted to gain a deep understanding of the specific challenges and requirements in PPM. These interviews were analyzed using the Gioia method, painting a detailed picture of the existing issues and needs within the sector. This was complemented by a quantitative online survey. The combination of qualitative and quantitative research enabled a comprehensive understanding of the current challenges in the practical application of machine and plant engineering PPM. Based on these insights, a specific framework for the application of generative AI in PPM was developed. This framework aims to assist companies in implementing faster and more agile processes, systematically integrating dynamic requirements from trends such as digitalization and sustainability into their PPM process. Utilizing generative AI technologies, companies can more quickly identify and respond to trends and market changes, allowing for a more efficient and targeted adaptation of the product portfolio. The study emphasizes the importance of an agile and reactive approach to PPM in a rapidly changing environment. It demonstrates how generative AI can serve as a powerful tool to manage the complexity of a diversified and continually evolving product portfolio. The developed framework offers practical guidelines and strategies for companies to improve their PPM processes by leveraging the latest technological advancements while maintaining ecological and social responsibility. This paper significantly contributes to deepening the understanding of the application of generative AI in PPM and provides a framework for companies to manage their product portfolios more effectively and adapt to changing market conditions. The findings underscore the relevance of continuous adaptation and innovation in PPM strategies and demonstrate the potential of generative AI for proactive and future-oriented business management.Keywords: servitization, product portfolio management, generative AI, disruptive innovation, machine and plant engineering
Procedia PDF Downloads 83
2000 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers
Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya
Abstract:
In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos which only differ in pregnancy outcome, i.e., embryos from a single clinic which are similar in parameters, such as: morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see if the rich spatiotemporal information embedded in this data would allow the prediction of the pregnancy outcome regardless of such critical parameters. Methodology—We did a retrospective analysis of time-lapse data from our IVF clinic utilizing Embryoscope 100% of the time for embryo culture to blastocyst stage with known clinical outcomes, including live birth vs nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group with no significant difference in patient age (P=0.9550) and morphokinetic scores (P=0.4032). Data from all patients were combined to make a 4th order tensor, and feature extraction were subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major Findings—The performance of the model was evaluated using 100 random subsampling cross validation (train (80%) - test (20%)). The prediction accuracy, averaged across 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy. Conclusion—The high accuracy in the main analysis and the low accuracy in random grouping analysis suggest a consistent spatiotemporal pattern which is associated with pregnancy outcomes, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters, such as: early cleavage or early blastulation. Despite small samples size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data, which contribute to successful pregnancy outcomes, regardless of already known parameters. The results on a larger sample size with complementary analysis on prediction of other key outcomes, such as: euploidy and aneuploidy of embryos will be presented at the meeting.Keywords: IVF, embryo, machine learning, time-lapse imaging data
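The feature-extraction-plus-classification pipeline described in the methodology can be sketched as below, using a CP (parafac) decomposition from TensorLy followed by 100 random 80/20 subsampling splits in scikit-learn; the tensor dimensions, rank and data are random stand-ins, not the embryo time-lapse recordings:

```python
# Sketch: CP (parafac) decomposition of a 4th-order embryo tensor, then repeated
# 80/20 random-subsampling classification. All arrays here are random stand-ins.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import ShuffleSplit, cross_val_score

rng = np.random.default_rng(0)
n_embryos, n_frames, n_rows, n_cols = 200, 30, 16, 16      # assumed tensor dimensions
X = tl.tensor(rng.normal(size=(n_embryos, n_frames, n_rows, n_cols)))
y = np.repeat([0, 1], n_embryos // 2)                      # 0 = nonpregnant, 1 = live birth

# CP decomposition; the embryo-mode factor matrix serves as the per-embryo feature vector
weights, factors = parafac(X, rank=8, n_iter_max=100)
features = factors[0]                                       # shape: (n_embryos, rank)

cv = ShuffleSplit(n_splits=100, test_size=0.2, random_state=0)   # 100 random 80/20 splits
scores = cross_val_score(LogisticRegression(max_iter=1000), features, y, cv=cv)
print(f"mean accuracy over 100 subsamples: {scores.mean():.2f}")
```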
Procedia PDF Downloads 94
1999 The Roman Fora in North Africa Towards a Supportive Protocol to the Decision for the Morphological Restitution
Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda
Abstract:
This research delves into the fundamental question of the morphological restitution of built archaeology, in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analysing it, and devising a methodology for solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of an archaeological information system in which urbanization involves the interplay of several disciplines. The built archaeological object is also an architectural and morphological object, as well as a set of articulated elementary data whose understanding can be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the exchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture, or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the rules for the architectural construction of the object. The knowledge base, built from the archaeological literature, also provides a reference that enters into the search for forms and articulations. The choice of the Roman forum in North Africa is justified by the great urban and architectural characteristics of this entity. Research on the forum draws on a fairly large knowledge base while also providing the researcher with material to study - from a morphological and architectural point of view - from the scale of the city down to the architectural detail. The experimentation of the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which frames the experimentation through the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence has allowed us first to question the aspects already known in order to measure the credibility of our system, which remains above all a decision support tool for the morphological restitution of the Roman fora in North Africa. This paper presents a first experimentation of the model elaborated during this research, a model framed by a paradigmatic discussion that seeks to position the research in relation to the existing paradigmatic and experimental knowledge on the issue.
Keywords: classical reasoning, logicist reasoning, archaeology, architecture, roman forum, morphology, calculation
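A toy sketch, purely as an assumption about how a rule base and a knowledge base can jointly support restitution decisions: surviving elements act as facts, construction rules propose missing components, and a forward-chaining loop accumulates proposals. The facts, rules, and element names below are illustrative, not taken from the authors' system.

```python
# Toy forward-chaining rule engine (illustrative only, not the authors' model).
facts = {"open_plaza", "temple_podium", "colonnade_fragments"}   # surviving elements

# Each rule: (set of required facts, element proposed for restitution)
rules = [
    ({"open_plaza", "colonnade_fragments"}, "portico_surrounding_plaza"),
    ({"temple_podium"}, "capitolium_on_main_axis"),
    ({"portico_surrounding_plaza", "capitolium_on_main_axis"}, "axial_forum_layout"),
]

def restitute(facts, rules):
    """Forward-chain until no rule adds a new proposed element."""
    inferred = set(facts)
    changed = True
    while changed:
        changed = False
        for required, proposal in rules:
            if required <= inferred and proposal not in inferred:
                inferred.add(proposal)
                changed = True
    return inferred - set(facts)

print(restitute(facts, rules))
# {'portico_surrounding_plaza', 'capitolium_on_main_axis', 'axial_forum_layout'}
```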
Procedia PDF Downloads 149
1998 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India
Authors: Anushtha Saxena
Abstract:
This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collection, storage, and analysis of consumers' data in order to use the data generated for profits, revenue, etc. Data monetisation enables e-commerce companies to secure better business opportunities, offer innovative products and services, gain a competitive edge over others, and generate millions in revenue. This paper analyses the issues and challenges that arise from the process of data monetisation. Some of the issues highlighted in the paper pertain to the right to privacy and the protection of e-commerce consumers' data. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of the Constitution of India. The Supreme Court of India recognized the right to privacy as a fundamental right in the landmark judgment of Justice K.S. Puttaswamy (Retd) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals' right to privacy by using the data they collect and store for economic gain and monetisation, and the related issue of data protection. The researcher has mainly focused on e-commerce companies such as online shopping websites to analyse the legal issue of data monetisation. In the Internet of Things and the digital age, people have shifted to online shopping as it is convenient, easy, flexible, comfortable, and time-saving. But at the same time, e-commerce companies store the data of their consumers and use it, by selling it to third parties or by generating more data from the data stored with them. This violates individuals' right to privacy, because consumers know nothing of this while giving their data online. Often, data is also collected without individuals' consent. The data, whether structured or unstructured, is then used in analytics for monetisation. Indian legislation such as the Information Technology Act, 2000 does not effectively protect e-consumers with respect to their data and how it is used by e-commerce businesses to monetise and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could make a significant impact on data monetisation. This paper also aims to study the European Union General Data Protection Regulation and how this legislation could be helpful in the Indian scenario concerning e-commerce businesses with respect to data monetisation.
Keywords: data monetization, e-commerce companies, regulatory framework, GDPR
Procedia PDF Downloads 121
1997 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm on its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward backpropagation networks. Results generally revealed that the GDM optimisation algorithm, with its adaptive learning capability, required a relatively shorter time in both the training and validation phases than the Levenberg-Marquardt (LM) and Bayesian regularisation (Br) algorithms, although learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions 1 day and 5 days ahead. In specific statistical terms, average model performance efficiency using the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, for the training and validation phases, respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, it is imperative to state that the adoption of ANN for real-time forecasting should employ training algorithms that avoid the computational overhead of LM, which requires computation of the Hessian matrix, demands protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and forecast quality as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
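A short sketch of the evaluation statistics named in the abstract (CE, MAE, MAPE, MSRE), applied to observed versus simulated streamflow; the example flow values are assumed for illustration only.

```python
# Illustrative sketch (not the study's code): forecast evaluation statistics.
import numpy as np

def coefficient_of_efficiency(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency (CE)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def mae(obs, sim):
    """Mean absolute error."""
    return np.mean(np.abs(np.asarray(obs, float) - np.asarray(sim, float)))

def mape(obs, sim):
    """Mean absolute percentage error (%)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.mean(np.abs((obs - sim) / obs)) * 100.0

def msre(obs, sim):
    """Mean squared relative error."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.mean(((obs - sim) / obs) ** 2)

# Hypothetical daily flows (m^3/s) used only to exercise the functions
observed  = [12.0, 15.5, 30.2, 55.8, 42.1, 25.0, 18.3]
simulated = [11.4, 16.2, 28.9, 58.0, 40.0, 26.1, 17.5]
print(f"CE={coefficient_of_efficiency(observed, simulated):.3f}, "
      f"MAE={mae(observed, simulated):.2f}, "
      f"MAPE={mape(observed, simulated):.1f}%, "
      f"MSRE={msre(observed, simulated):.4f}")
```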
Procedia PDF Downloads 155
1996 Genetic Data of Deceased People: Solving the Gordian Knot
Authors: Inigo de Miguel Beriain
Abstract:
Genetic data of deceased persons are of great interest for both biomedical research and clinical use, for several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue; consequently, each EU member state offers very different solutions. For instance, Denmark treats the data as personal data of the deceased person for a set period of time, while others, such as Spain, do not consider the data as such but have introduced regulations specifically focused on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general, because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives; hence, the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition on data processing, with the exceptions provided for in Article 9.2 and the legal bases included in Article 6. This may be complicated in practice: since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. This contribution shows, however, that none of these objections has sufficient substance to delegitimize the argument presented. The conclusion of this contribution is therefore that we can indeed build a general framework for the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.
Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people
Procedia PDF Downloads 155
1995 Machine Learning Techniques in Seismic Risk Assessment of Structures
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment for different types of structures. The first step is the development of ground-motion models, which are used to forecast ground-motion intensity measures (IMs) for future events given source characteristics, source-to-site distance, and local site conditions. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, namely Artificial Neural Networks, Random Forests, and Support Vector Machines. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with Random Forest in particular outperforming the other algorithms. However, the conventional method is the better tool when only limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined damage limit states, and therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analyses.
Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine
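A minimal sketch of the ground-motion modelling comparison described above, assuming magnitude, log distance, and Vs30 as predictors of ln(PGA); the synthetic attenuation-like relation, feature set, and model settings are assumptions standing in for recorded data, not the study's actual dataset or models.

```python
# Illustrative sketch (not the study's model): comparing regressors for a
# ground-motion model that maps source/site features to a log intensity measure.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 1000
magnitude = rng.uniform(3.0, 7.5, n)
log_dist = np.log(rng.uniform(5.0, 200.0, n))    # source-to-site distance, km
vs30 = rng.uniform(180.0, 760.0, n)              # site stiffness proxy, m/s

# Assumed attenuation-like relation plus noise (synthetic stand-in for records)
ln_pga = 1.1 * magnitude - 1.6 * log_dist - 0.002 * vs30 + rng.normal(0, 0.4, n)

X = np.column_stack([magnitude, log_dist, vs30])
models = {
    "linear regression": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "support vector machine": make_pipeline(StandardScaler(), SVR(C=5.0)),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, ln_pga, cv=5, scoring="r2").mean()
    print(f"{name:>23}: mean 5-fold R^2 = {r2:.3f}")
```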
Procedia PDF Downloads 106
1994 Systematic Review of Digital Interventions to Reduce the Carbon Footprint of Primary Care
Authors: Anastasia Constantinou, Panayiotis Laouris, Stephen Morris
Abstract:
Background: Climate change has been reported as one of the worst threats to healthcare. The healthcare sector is a significant contributor to greenhouse gas emissions, with primary care responsible for 23% of the NHS' total carbon footprint. Digital interventions, primarily focusing on telemedicine, offer a route to change. This systematic review aims to quantify and characterize the carbon footprint savings associated with the implementation of digital interventions in the primary care setting. Methods: A systematic review of published literature was conducted according to PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. The MEDLINE, PubMed, and Scopus databases, as well as Google Scholar, were searched using key terms relating to “carbon footprint,” “environmental impact,” “sustainability,” “green care,” “primary care,” and “general practice,” with citation tracking used to identify additional articles. Data were extracted and analyzed in Microsoft Excel. Results: Eight studies, conducted in four different countries between 2010 and 2023, were identified. Four studies used interventions to address primary care services, three studies focused on the interface between primary and specialist care, and one study addressed both. Digital interventions included the use of mobile applications, online portals, access to electronic medical records, electronic referrals, electronic prescribing, video consultations, and the use of autonomous artificial intelligence. Only one study carried out a complete life cycle assessment to determine the carbon footprint of the intervention. It estimated that digital interventions reduced the carbon footprint at the primary care level by 5.1 kg CO₂ per visit, and at the interface with specialist care by 13.4 kg CO₂ per visit. When assessing the relationship between travel distance saved and savings in emissions, we identified a strong correlation, suggesting that most of the carbon footprint reduction is attributable to reduced travel. However, two studies also commented on environmental savings associated with reduced use of paper. Patient savings, in the form of reduced fuel costs and reduced travel time, were also identified. Conclusion: All studies identified significant reductions in carbon footprint following the implementation of digital interventions. In the future, controlled, prospective studies incorporating complete life cycle assessments and accounting for double-consulting effects, use of additional resources, technical failures, quality of care, and cost-effectiveness are needed to fully appreciate the sustainability benefit of these interventions.
Keywords: carbon footprint, environmental impact, primary care, sustainable healthcare
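A back-of-the-envelope sketch of the travel-avoidance mechanism the review identifies as the main driver of savings; the average round-trip distance and car emission factor below are assumptions for illustration, not figures taken from the review or its included studies.

```python
# Illustrative sketch (all figures assumed): CO2 avoided when face-to-face
# visits are replaced by remote consultations, via avoided patient travel.
AVG_ROUND_TRIP_KM = 20.0        # assumed average patient round trip
CAR_EMISSION_FACTOR = 0.17      # assumed kg CO2 per km for an average car

def avoided_emissions_kg(n_remote_visits: int,
                         round_trip_km: float = AVG_ROUND_TRIP_KM,
                         factor: float = CAR_EMISSION_FACTOR) -> float:
    """kg CO2 avoided if each remote visit replaces one car journey."""
    return n_remote_visits * round_trip_km * factor

# Example: 1,000 video consultations under the assumptions above
print(f"{avoided_emissions_kg(1000):.0f} kg CO2 avoided")  # 3400 kg
```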
Procedia PDF Downloads 63