Search results for: legal judgment prediction
2734 The Accuracy of Measures for Screening Adults for Spiritual Suffering in Health Care Settings: A Systematic Review
Authors: Sayna Bahraini, Wendy Gifford, Ian Graham, Liquaa Wazni, Suzettee Bremault-Phillips, Rebekah Hackbusch, Catrine Demers, Mary Egan
Abstract:
Objective: Guidelines for palliative and spiritual care emphasize the importance of screening patients for spiritual suffering. The aim of this review was to synthesize the research evidence on the accuracy of measures used to screen adults for spiritual suffering. Methods: A systematic review was conducted. We searched five scientific databases to identify relevant articles. Two independent reviewers screened articles, extracted data and assessed study methodological quality. Results: We identified five articles that yielded information on 24 spiritual screening measures. Among all identified measures, the 2-item Meaning/Joy & Self-Described Struggle had the highest sensitivity (82-87%), and the revised Rush protocol had the highest specificity (81-90%). The methodological quality of all included studies was low. Significance of Results: While most of the identified spiritual screening measures are brief (comprising 1 to 12 items), few have sufficient accuracy to effectively screen patients for spiritual suffering. We advise clinicians to use their critical appraisal skills and clinical judgment when selecting and using any of the identified measures to screen for spiritual suffering.
Keywords: screening, suffering, spirituality, diagnostic test accuracy, systematic review
Procedia PDF Downloads 143
2733 Agreement between Basal Metabolic Rate Measured by Bioelectrical Impedance Analysis and Estimated by Prediction Equations in Obese Groups
Authors: Orkide Donma, Mustafa M. Donma
Abstract:
Basal metabolic rate (BMR) is a widely used and accepted measure of energy expenditure. Its principal determinant is body mass. However, this parameter is also correlated with a variety of other factors. The objective of this study is to measure BMR and compare it with the values obtained from predictive equations in adults classified according to their body mass index (BMI) values. 276 adults were included in this study. Their age, height and weight values were recorded. Five groups were designed based on their BMI values. The first group (n = 85) was composed of individuals with BMI values between 18.5 and 24.9 kg/m2. Those with BMI values from 25.0 to 29.9 kg/m2 constituted Group 2 (n = 90). Individuals with 30.0-34.9 kg/m2, 35.0-39.9 kg/m2 and > 40.0 kg/m2 were included in Groups 3 (n = 53), 4 (n = 28) and 5 (n = 20), respectively. The most commonly used equations were selected for comparison with the measured BMR values. For this purpose, BMR values were calculated using four prediction equations, namely those introduced by the Food and Agriculture Organization (FAO)/World Health Organization (WHO)/United Nations University (UNU), Harris and Benedict, Owen, and Mifflin. Descriptive statistics, ANOVA, post hoc Tukey and Pearson’s correlation tests were performed with a statistical program designed for Windows (SPSS, version 16.0). p values smaller than 0.05 were accepted as statistically significant. Mean ± SD of Groups 1, 2, 3, 4 and 5 for measured BMR in kcal were 1440.3 ± 210.0, 1618.8 ± 268.6, 1741.1 ± 345.2, 1853.1 ± 351.2 and 2028.0 ± 412.1, respectively. Upon comparison of means among groups, differences were highly significant between Group 1 and each of the remaining four groups. The values increased from Group 2 to Group 5. However, differences between Groups 2 and 3, Groups 3 and 4, and Groups 4 and 5 were not statistically significant. These non-significant differences disappeared in the predictive equations proposed by Harris and Benedict, FAO/WHO/UNU and Owen. For Mifflin, the non-significance was limited to Groups 4 and 5. Upon evaluation of the correlations between measured BMR and the values estimated from the prediction equations, the lowest correlations were observed among individuals within the normal BMI range. The highest correlations were detected in individuals with BMI values between 30.0 and 34.9 kg/m2. Correlations between measured BMR values and BMR values calculated by FAO/WHO/UNU as well as Owen were the same and the highest. In all groups, the highest correlations were observed between BMR values calculated from the Mifflin and Harris and Benedict equations, which use age as an additional parameter. In conclusion, the unique resemblance of the FAO/WHO/UNU and Owen equations was pointed out. However, mean values obtained from FAO/WHO/UNU were much closer to the measured BMR values. Besides, the highest correlations were found between BMR calculated from FAO/WHO/UNU and measured BMR. These findings suggest that FAO/WHO/UNU is the most reliable equation, which may be used when measured BMR values are not available.
Keywords: adult, basal metabolic rate, FAO/WHO/UNU, obesity, prediction equations
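For reference, two of the prediction equations compared above have simple closed forms and can be computed directly. The following is a minimal sketch using the commonly published Mifflin-St Jeor and original Harris-Benedict coefficients (weight in kg, height in cm, age in years); the example values are illustrative only:

```python
def bmr_mifflin(weight_kg, height_cm, age_yr, male):
    # Mifflin-St Jeor equation: weight, height, age and sex
    base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_yr
    return base + 5.0 if male else base - 161.0

def bmr_harris_benedict(weight_kg, height_cm, age_yr, male):
    # Original Harris-Benedict coefficients
    if male:
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# Example: a 35-year-old woman, 70 kg, 165 cm
print(bmr_mifflin(70, 165, 35, male=False))          # ~1395 kcal/day
print(bmr_harris_benedict(70, 165, 35, male=False))  # ~1466 kcal/day
```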
Procedia PDF Downloads 133
2732 A New DIDS Design Based on a Combination Feature Selection Approach
Authors: Adel Sabry Eesa, Adnan Mohsin Abdulazeez Brifcani, Zeynep Orman
Abstract:
Feature selection has been used in many fields such as classification, data mining and object recognition, and has proven effective for removing irrelevant and redundant features from the original data set. In this paper, a new design of a distributed intrusion detection system is presented, using a combination feature selection model based on the bees algorithm and a decision tree. The bees algorithm is used as the search strategy to find the optimal subset of features, whereas the decision tree is used as a judge of the selected features. Both the produced features and the generated rules are used by the Decision Making Mobile Agent to decide whether or not there is an attack in the network. The Decision Making Mobile Agent migrates through the network, moving from node to node; if it finds that there is an attack on one of the nodes, it alerts the user through the User Interface Agent or takes some action through the Action Mobile Agent. The KDD Cup 99 data set is used to test the effectiveness of the proposed system. The results show that even if only four features are used, the proposed system gives a better performance when compared with the results obtained using all 41 features.
Keywords: distributed intrusion detection system, mobile agent, feature selection, bees algorithm, decision tree
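The wrapper idea described above, a search procedure proposing feature subsets and a decision tree scoring them, can be sketched at small scale. This is a minimal stand-in that uses a random neighbourhood search in place of the full bees algorithm, with scikit-learn's DecisionTreeClassifier as the fitness judge; the synthetic arrays are placeholders for the KDD Cup 99 data:

```python
import random
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def fitness(X, y, subset):
    # The decision tree acts as the judge of a candidate feature subset
    if not subset:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, sorted(subset)], y, cv=3).mean()

def bees_like_search(X, y, n_iter=30, n_sites=5, k=4):
    n_features = X.shape[1]
    # Scout phase: random candidate subsets of size k (e.g., 4 of 41 features)
    sites = [set(random.sample(range(n_features), k)) for _ in range(n_sites)]
    best = max(sites, key=lambda s: fitness(X, y, s))
    for _ in range(n_iter):
        # Neighbourhood search: swap one feature of the best site in/out
        cand = set(best)
        cand.remove(random.choice(tuple(cand)))
        cand.add(random.choice([f for f in range(n_features) if f not in cand]))
        if fitness(X, y, cand) > fitness(X, y, best):
            best = cand
    return sorted(best)

# Usage with a synthetic stand-in for the KDD Cup 99 data set
X = np.random.rand(300, 41)
y = np.random.randint(0, 2, 300)
print(bees_like_search(X, y))
```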
Procedia PDF Downloads 411
2731 Hansen Solubility Parameter from Surface Measurements
Authors: Neveen AlQasas, Daniel Johnson
Abstract:
Membranes for water treatment are an established technology that attracts great attention due to its simplicity and cost effectiveness. However, membranes in operation suffer from the adverse effects of membrane fouling. Bio-fouling is a phenomenon that occurs at the water-membrane interface and is a dynamic process initiated by the adsorption of dissolved organic material, including biomacromolecules, on the membrane surface. After initiation, attachment of microorganisms occurs, followed by biofilm growth. The biofilm blocks the pores of the membrane and consequently reduces the water flux. Moreover, the presence of a fouling layer can have a substantial impact on the membrane separation properties. Understanding the mechanism of the initiation phase of biofouling is a key point in eliminating biofouling on membrane surfaces. The adhesion and attachment of different fouling materials are affected by the surface properties of the membrane materials. Therefore, the surface properties of different polymeric materials have been studied in terms of their surface energies and Hansen solubility parameters (HSP). The difference between the combined HSP parameters (the HSP distance) allows prediction of the affinity of two materials for each other. The possibility of measuring the HSP of different polymer films via surface measurements, such as contact angle, has been thoroughly investigated. Knowing the HSP of a membrane material and the HSP of a specific foulant facilitates the estimation of the HSP distance between the two, and therefore the strength of attachment to the surface. Contact angle measurements using fourteen different solvents on five different polymeric films were carried out using the sessile drop method. Solvents were ranked as good or bad using different ranking methods, and the ranking was used to calculate the HSP of each polymeric film. The results clearly indicate the absence of a direct relation between the contact angle values of each film and the HSP distance between each polymer film and the solvents used. Therefore, estimating HSP via contact angle alone is not sufficient. However, it was found that if the surface tensions and viscosities of the solvents are taken into account in the analysis of the contact angle values, a prediction of the HSP from contact angle measurements is possible. This was carried out via training of a neural network model. The trained neural network model has three inputs: the contact angle value, and the surface tension and viscosity of the solvent used. The model is able to predict the HSP distance between the used solvent and the tested polymer (material). The HSP distance prediction is further used to estimate the total and individual HSP parameters of each tested material. The results showed an accuracy of about 90% for all five studied films.
Keywords: surface characterization, hansen solubility parameter estimation, contact angle measurements, artificial neural network model, surface measurements
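The HSP distance mentioned above has a standard closed form (the Hansen relation over the dispersive, polar and hydrogen-bonding components), and the three-input network can be sketched with a generic regressor. In the sketch below, the distance formula is the standard one, while the MLPRegressor and the training arrays are assumed placeholders standing in for the paper's trained model and measured data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def hsp_distance(hsp1, hsp2):
    # Standard Hansen distance Ra between two (dD, dP, dH) triples (MPa^0.5)
    dD1, dP1, dH1 = hsp1
    dD2, dP2, dH2 = hsp2
    return np.sqrt(4 * (dD1 - dD2)**2 + (dP1 - dP2)**2 + (dH1 - dH2)**2)

# Example: distance between water (15.5, 16.0, 42.3) and a polymer triple
print(hsp_distance((15.5, 16.0, 42.3), (18.0, 10.0, 7.0)))

# Three inputs per sample: contact angle (deg), surface tension (mN/m),
# viscosity (mPa*s); values below are placeholders, not measured data
X_train = np.array([[45.0, 72.8, 0.89],
                    [62.0, 27.6, 0.55],
                    [30.0, 47.7, 1.20]])
y_train = np.array([35.9, 12.4, 20.7])  # placeholder HSP distances

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print(model.predict([[50.0, 58.2, 5.0]]))  # predicted HSP distance, new solvent
```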
Procedia PDF Downloads 94
2730 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom
Authors: Chih-Ping Chang
Abstract:
Different from traditional objective evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without other devices (such as a computer). The original identity of a simple screenshot taken from a social network site must be questioned. When the police search and seize digital information, a common practice is to directly print out the digital data obtained and ask the parties present to sign the printout, without taking the original digital data back. In addition to the issue of original identity, this way of obtaining evidence may have two further consequences. First, it invites the allegation that the evidence was tampered with because the police wanted to frame the suspect and falsified the evidence. Second, it is not easy to discover hidden information. The core evidence associated with a crime may not appear in the contents of files. Through discovery of the original file, data related to the file, such as the original producer, creation time, modification date, and even GPS location, can be revealed from hidden information. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task for ruling on social media evidence. This article first introduces forensic software, such as EnCase, TCT and FTK, and analyzes its function of proving identity with other digital data. Turning back to the court, the second part of this article discusses the legal standard for authentication of social media evidence and the application of such forensic software in the courtroom. In conclusion, this article provides a rethinking: what kind of authenticity is this rule of evidence chasing? Does the legal system automatically operate the transcription of scientific knowledge? Or, furthermore, does it want to render justice better, not only on scientific fact, but through multivariate debate?
Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner
Procedia PDF Downloads 291
2729 Unshackled Slaves: An Analysis of the Adjudication of Degrading Conditions of Work by Brazilian Labour Courts
Authors: Aline F. C. Pereira
Abstract:
In recent years, modern slavery has increasingly gathered attention in scholarly discussions and policy debates. Whereas mainstream studies focus on forced labour and trafficking, little attention is paid to other forms of exploitation, such as degrading conditions of work, criminalised in Brazil as an autonomous type of slavery since 2003. This paper aims to bridge this gap. It adopts a mixed method that comprises both qualitative and quantitative analysis to investigate the adjudication of 164 cases of degrading conditions of work by Brazilian labour courts. The research discloses an ungrounded reluctance to apply the domestic legal framework, as in most of the cases degrading conditions of work are not recognised as contemporary slavery, despite the law. In some cases, not even situations described as subhuman and degrading of human dignity were framed as slavery. The analysis also suggests that, as in chattel times, lack of freedom and subjection remain relevant in the legal characterisation of slave labour. The examination has further unravelled a phenomenon absent in previous studies: the normalisation of precarity. By depicting precarity as natural and inevitable in rural areas, labour courts ensure conformity to the status quo and reduce the likelihood of resistance by victims. Moreover, compensations awarded to urban workers are higher than those granted to rural employees, which seems to place human beings in hierarchical categories, a trace of colonialism. In sum, the findings challenge the widespread assumption that Brazil addresses slavery efficiently. Conversely, the Brazilian Labour Judiciary seems to remain subservient to a colonial perspective of slavery, legitimising and sanctioning abusive practices.
Keywords: adjudication, contemporary slavery, degrading conditions of work, normalisation of precarity
Procedia PDF Downloads 114
2728 Study of the Persian Gulf’s and Oman Sea’s Numerical Tidal Currents
Authors: Fatemeh Sadat Sharifi
Abstract:
In this research, a barotropic model was employed for tidal studies in the Persian Gulf and Oman Sea, in which the only forcing applied was the tidal force. To do that, a finite-difference, free-surface model called the Regional Ocean Modeling System (ROMS) was employed on data over the Persian Gulf and Oman Sea. To analyze the flow patterns of the region, the results of a limited-area model, the Finite Volume Community Ocean Model (FVCOM), were used. The two regions were chosen since both are among the most critical water bodies for the economy, biology, fishery, shipping, navigation, and petroleum extraction. Tide data from the OSU Tidal Prediction Software (OTPS) and observation data were used to validate the modeled results. Next, tidal elevation and speed, and the tidal analysis, were interpreted. Preliminary results show significant accuracy in the tidal height compared with observation and OTPS data, and indicate that tidal currents are highest in the Strait of Hormuz and in the narrow and shallow region between the Iranian coast and islands. Furthermore, the tidal analysis clarifies that the M_2 constituent has the largest amplitude. Finally, the Persian Gulf tidal currents divide into two branches: the first branch runs from the south toward Qatar and, via the United Arab Emirates, turns toward the Strait of Hormuz. The second branch, in the north and west, extends up to the far end of the Persian Gulf and turns counterclockwise at the head of the Gulf.
Keywords: numerical model, barotropic tide, tidal currents, OSU tidal prediction software, OTPS
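As an illustration of the harmonic analysis behind the M_2 result, a single constituent can be fitted to an elevation time series by least squares. The sketch below uses synthetic hourly data and the standard M_2 period of 12.4206 hours; a real analysis, as in OTPS, resolves many constituents simultaneously:

```python
import numpy as np

M2_PERIOD_H = 12.4206  # M2 constituent period in hours
omega = 2 * np.pi / M2_PERIOD_H

# Synthetic hourly elevation record: 0.8 m M2 signal plus noise (placeholder)
t = np.arange(0, 30 * 24, 1.0)  # 30 days, hourly samples
eta = 0.8 * np.cos(omega * t - 1.0) + 0.05 * np.random.randn(t.size)

# Least-squares fit of eta ~ A*cos(omega t) + B*sin(omega t) + mean
G = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
(A, B, mean), *_ = np.linalg.lstsq(G, eta, rcond=None)

amplitude = np.hypot(A, B)   # recovered M2 amplitude (~0.8 m)
phase = np.arctan2(B, A)     # recovered M2 phase (~1.0 rad)
print(amplitude, phase)
```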
Procedia PDF Downloads 133
2727 Medical Error: Concept and Description According to Brazilian Physicians
Authors: Vitor S. Mendonca, Maria Luisa S. Schmidt
Abstract:
The Brazilian medical profession is viewed as error-free, so healthcare professionals who commit an error are condemned there. Medical errors occur frequently in the Brazilian healthcare system, so identifying better options for handling this issue has become of interest, primarily for physicians. The purpose of this study is to better understand the tensions involved in the fear of making an error, due to the harm and risk this would represent for those involved. A qualitative study was performed by means of the narratives of the lived experiences of ten practicing physicians in the State of Sao Paulo. The concept and characterization of errors were discussed, together with the fear of making an error, near misses or the error itself, how to deal with errors and what to do to avoid them. The analysis indicates excessive pressure in the medical profession for error-free practice, with a well-established physician-patient relationship facilitating the management of medical errors. Errors occur, but a lack of information and discussion often leads to their concealment due to fear of possible judgment by society or peers. The establishment of programs that encourage appropriate medical conduct in the event of an error requires coherent answers for humanization in Brazilian medical science. It is necessary to improve the discussion about medical errors and to disseminate models of communication and notification of errors in Brazil.
Keywords: medical error, narrative, physician-patient relationship, qualitative research
Procedia PDF Downloads 179
2726 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite predicted specific vulnerabilities such as OS-Command Injection, Cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
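The pipeline described, path contexts extracted from the AST feeding a learned classifier, can be approximated at small scale. The sketch below is a deliberately simplified stand-in: it hashes pre-extracted path-context strings into a sparse bag-of-features vector and trains a logistic regression, whereas Code2Vec itself learns dense embeddings with attention over paths. The path-context strings and labels are hypothetical:

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import LogisticRegression

# Each function is represented by its AST path contexts ("start,path,end")
functions = [
    "cmd,MethodCall|Runtime.exec,input userInput,Param|String,cmd",  # hypothetical
    "i,ForInit|Less,len arr,ArrayAccess|Index,i",                    # hypothetical
]
labels = [1, 0]  # 1 = vulnerable (e.g., OS command injection), 0 = benign

# Hashing trick: fixed-size sparse features instead of learned embeddings
vec = HashingVectorizer(n_features=2**12, token_pattern=r"[^\s]+")
X = vec.transform(functions)

clf = LogisticRegression()
clf.fit(X, labels)
print(clf.predict_proba(vec.transform(["cmd,MethodCall|Runtime.exec,x"]))[:, 1])
```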
Procedia PDF Downloads 109
2725 Using Wearable Device with Neuron Network to Classify Severity of Sleep Disorder
Authors: Ru-Yin Yang, Chi Wu, Cheng-Yu Tsai, Yin-Tzu Lin, Wen-Te Liu
Abstract:
Background: Sleep breathing disorder (SDB) is a condition characterized by recurrent episodes of airway obstruction leading to intermittent hypoxia and sleep quality fragmentation. However, the procedures for SDB severity examination remain complicated and costly. Objective: The objective of this study is to establish a simplified examination method for SDB using a respiratory impedance pattern sensor combined with signal processing and a machine learning model. Methodologies: We recorded heart rate variability by electrocardiogram and the respiratory pattern by impedance. After polysomnography (PSG) was performed and SDB was diagnosed by the apnea and hypopnea index (AHI), we calculated the episodes with absence of flow and the arousal index (AI) from the device record. Subjects were divided into training and testing groups. A neural network was used to establish a prediction model to classify the severity of SDB by the AI, episodes, and body profiles. The performance was evaluated by classification in the testing group compared with PSG. Results: In this study, we enrolled 66 subjects (male/female: 37/29; age: 49.9±13.2) diagnosed with SDB at a sleep center in Taipei City, Taiwan, from 2015 to 2016. The accuracy from the confusion matrix on the test group by the neural network is 71.94%. Conclusion: Based on these models, we established a prediction model for SDB by means of the wearable sensor. With more incoming cases and further training, this system may be used to rapidly and automatically screen the risk of SDB in the future.
Keywords: sleep breathing disorder, apnea and hypopnea index, body parameters, neural network
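A minimal sketch of the classification step described above, under the assumption that the features are the device-derived arousal index, episode count, and body profile values, with severity classes binned from AHI; all arrays are synthetic placeholders, and scikit-learn's MLPClassifier stands in for the paper's network:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Placeholder features: [arousal index, episodes/h, age, BMI]
rng = np.random.default_rng(0)
X = rng.normal(loc=[20, 15, 50, 27], scale=[8, 10, 13, 4], size=(66, 4))
# Placeholder severity labels: 0 mild, 1 moderate, 2 severe (by AHI bands)
y = rng.integers(0, 3, size=66)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print(net.score(X_te, y_te))  # classification accuracy on the held-out group
```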
Procedia PDF Downloads 150
2724 Using Analytic Hierarchy Process as a Decision-Making Tool in Project Portfolio Management
Authors: Darius Danesh, Michael J. Ryan, Alireza Abbasi
Abstract:
Project Portfolio Management (PPM) is an essential component of an organisation’s strategic procedures, which requires attention to several factors to envisage a range of long-term outcomes to support strategic project portfolio decisions. To evaluate overall efficiency at the portfolio level, it is essential to identify the functionality of specific projects as well as to aggregate those findings in a mathematically meaningful manner that indicates the strategic significance of the associated projects at a number of levels of abstraction. PPM success is directly associated with the quality of decisions made, and poor judgment increases portfolio costs. Hence, various Multi-Criteria Decision Making (MCDM) techniques have been designed and employed to support decision-making functions. This paper reviews a possible option to improve decision-making outcomes in organisational portfolio management processes using the Analytic Hierarchy Process (AHP), from both academic and practical perspectives, and examines the usability, certainty and quality of the technique. The results of the study also provide insight into the technical risk associated with the current decision-making model, to underpin initiative tracking and strategic portfolio management.
Keywords: analytic hierarchy process, decision support systems, multi-criteria decision making, project portfolio management
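The core AHP computation, deriving priority weights from a pairwise comparison matrix via the principal eigenvector and checking consistency, is standard and can be sketched directly. The random index values below are Saaty's commonly published ones; the example comparison matrix is hypothetical:

```python
import numpy as np

def ahp_priorities(A):
    """Priority vector and consistency ratio of a pairwise comparison
    matrix A, using Saaty's principal-eigenvector method."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
    cr = ci / ri if ri else 0.0                 # consistency ratio (< 0.1 ok)
    return w, cr

# Hypothetical comparison of three projects on "strategic fit"
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
weights, cr = ahp_priorities(A)
print(weights, cr)
```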
Procedia PDF Downloads 321
2723 Comparative Study to Evaluate the Efficacy of Control Criterion in Determining Consolidation Scope in the Public Sector
Authors: Batool Zarei
Abstract:
This study aims to answer the question of whether the control criterion, with its two elements of power and benefit, which is introduced as the 'control criterion of consolidation scope' in national and international public sector (and also private sector) accounting standards, is efficient enough. The methodology of this study is comparative, and the results are significantly generalizable due to the importance of the sample of countries studied. The findings of this study state that, in spite of the pervasive use of the control criterion (including its two elements of power and benefit), the criteria for determining the existence of control in public sector accounting standards are not efficient enough to determine the consolidation scope of whole-of-government financial statements in a way that meets the decision-making and accountability needs of managers, policy makers and supervisors, especially parliament. Therefore, the researcher believes that for determining consolidation scope in the public sector, in addition to the economic view, it is better to pay attention to budgetary, legal and statistical concepts as well as practical and financial risk, and to define indicators for proving the existence of control (power and benefit) that include accountability relationships (budgetary relation, legal form and nature of activity). These findings also reveal the necessity of passing comprehensive public financial management (PFM) legislation in order to redefine the characteristics of public sector entities and the scope of whole-of-government financial statements, and to review the duties of statistics organizations and central banks in preparing government financial statistics and national accounts, in order to achieve sustainable development and resilient economy goals.
Keywords: control, consolidation scope, public sector accounting, government financial statistics, resilient economy
Procedia PDF Downloads 260
2722 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions
Authors: Joel Niklaus, Matthias Sturmer
Abstract:
The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is done in accordance with the laws. Equally important is privacy, as a fundamental human right (Article 12 of the Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test of the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to an estimation). Consequently, many Swiss courts only publish a fraction of their decisions. An automated anonymization system reduces these costs substantially, leading to more capacity for publishing court decisions much more comprehensively. For the re-identification system, topic modeling with latent Dirichlet allocation is used to cluster over 500K Swiss court decisions into meaningful related categories. A comprehensive knowledge base with publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design ESRA will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling
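The anonymization step, recognizing named entities and replacing them with a category label plus an identifier to preserve context, can be sketched with an off-the-shelf NER model. A minimal sketch assuming spaCy's small German model as a stand-in for the Swiss legal-domain NER model the authors propose to train:

```python
import spacy

def anonymize(text, nlp):
    """Replace recognized entities with [LABEL_n] tags, keeping context."""
    doc = nlp(text)
    ids, out, last = {}, [], 0
    for ent in doc.ents:
        key = (ent.label_, ent.text)
        if key not in ids:  # same entity keeps the same identifier
            ids[key] = sum(1 for k in ids if k[0] == ent.label_) + 1
        out.append(text[last:ent.start_char])
        out.append(f"[{ent.label_}_{ids[key]}]")
        last = ent.end_char
    out.append(text[last:])
    return "".join(out)

nlp = spacy.load("de_core_news_sm")  # stand-in for a trained legal NER model
print(anonymize("Hans Muster aus Zürich klagte gegen die Musterfirma AG.", nlp))
# e.g. "[PER_1] aus [LOC_1] klagte gegen die [ORG_1]."
```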
Procedia PDF Downloads 148
2721 Predicting Personality and Psychological Distress Using Natural Language Processing
Authors: Jihee Jang, Seowon Yoon, Gaeun Son, Minjung Kang, Joon Yeon Choeh, Kee-Hong Choi
Abstract:
Background: Self-report multiple-choice questionnaires have been widely utilized to quantitatively measure one’s personality and psychological constructs. Despite several strengths (e.g., brevity and utility), self-report multiple-choice questionnaires have considerable limitations in nature. With the rise of machine learning (ML) and natural language processing (NLP), researchers in the field of psychology are widely adopting NLP to assess psychological constructs and predict human behaviors. However, there is a lack of connection between the work being performed in computer science and that in psychology, due to small data sets and unvalidated modeling practices. Aims: The current article introduces the study method and procedure of phase II, which includes the interview questions for the five-factor model (FFM) of personality developed in phase I. This study aims to develop the interview (semi-structured) and open-ended questions for FFM-based personality assessments, specifically designed with experts in the field of clinical and personality psychology (phase 1), and to collect personality-related text data using the interview questions and self-report measures of personality and psychological distress (phase 2). The purpose of the study includes examining the relationship between the natural language data obtained from the interview questions measuring the FFM personality constructs and psychological distress, to demonstrate the validity of natural language-based personality prediction. Methods: The phase I (pilot) study was conducted on fifty-nine native Korean adults to acquire personality-related text data from the interview (semi-structured) and open-ended questions based on the FFM of personality. The interview questions were revised and finalized with feedback from an external expert committee consisting of personality and clinical psychologists. Based on the established interview questions, a total of 425 Korean adults were recruited using a convenience sampling method via an online survey. The text data collected from the interviews were analyzed using natural language processing. The results of the online survey, including demographic data, depression, anxiety, and personality inventories, were analyzed together in the model to predict individuals’ FFM of personality and level of psychological distress (phase 2).
Keywords: personality prediction, psychological distress prediction, natural language processing, machine learning, the five-factor model of personality
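The prediction step this design builds toward, mapping interview text to FFM trait scores, commonly takes a form like the following sketch: TF-IDF text features with a linear regressor, one regressor per trait. The abstract does not specify the study's actual modeling pipeline, and the texts and scores below are invented placeholders:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Placeholder interview answers and (e.g.) extraversion scores from a scale
texts = ["I love meeting new people and organizing parties.",
         "I prefer quiet evenings alone with a book.",
         "Crowds drain me, but I enjoy one-on-one talks."]
extraversion = [4.5, 1.8, 2.9]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
model.fit(texts, extraversion)  # one such regressor per FFM trait
print(model.predict(["I talk to strangers everywhere I go."]))
```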
Procedia PDF Downloads 79
2720 A Prediction Model for Dynamic Responses of Building from Earthquake Based on Evolutionary Learning
Authors: Kyu Jin Kim, Byung Kwan Oh, Hyo Seon Park
Abstract:
Seismic-response-based structural health monitoring systems have been used to prevent seismic damage. Structural seismic damage of a building is caused by instantaneous stress concentration, which is related to the dynamic characteristics of the earthquake. Meanwhile, seismic response analysis to estimate the dynamic responses of a building demands a significantly high computational cost. To prevent the failure of structural members due to the characteristics of the earthquake, and to avoid the high computational cost of seismic response analysis, this paper presents an artificial neural network (ANN) based prediction model for the dynamic responses of a building over a specific time length. From the measured dynamic responses, the input and output nodes of the ANN are formed according to the specific time length and adopted for training. In the model, an evolutionary radial basis function neural network (ERBFNN), in which a radial basis function network (RBFN) is integrated with an evolutionary optimization algorithm to find the RBF variables, is implemented. The effectiveness of the proposed model is verified through an analytical study applying responses from dynamic analysis of a multi-degree-of-freedom system as training data in the ERBFNN.
Keywords: structural health monitoring, dynamic response, artificial neural network, radial basis function network, genetic algorithm
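A minimal sketch of the RBFN component: Gaussian basis functions around a set of centers with a linear output layer solved by least squares. In the paper the centers and widths are found by an evolutionary algorithm; here k-means is a simple stand-in, and the data are synthetic placeholders:

```python
import numpy as np
from sklearn.cluster import KMeans

def rbf_design(X, centers, width):
    # Gaussian radial basis activations for each sample/center pair
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Placeholder data: map a recent response window -> next response value
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))   # 10 past acceleration samples per input
y = np.sin(X.sum(axis=1))        # stand-in target response

centers = KMeans(n_clusters=15, n_init=10, random_state=0).fit(X).cluster_centers_
Phi = rbf_design(X, centers, width=2.0)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear output weights

y_hat = rbf_design(X, centers, 2.0) @ w
print(np.mean((y - y_hat) ** 2))             # training mean squared error
```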
Procedia PDF Downloads 304
2719 Mix Proportioning and Strength Prediction of High Performance Concrete Including Waste Using Artificial Neural Network
Authors: D. G. Badagha, C. D. Modhera, S. A. Vasanwala
Abstract:
The civil engineering field faces a great challenge to contribute to environmental protection by finding alternatives to cement and natural aggregates. There is a problem of global warming due to cement utilization in concrete, so it is necessary to provide a sustainable solution by producing concrete containing waste. It is very difficult to produce a designated grade of concrete containing different ingredients and water-cement ratios, including waste, that achieves the desired fresh and hardened properties as per requirements and specifications. To achieve the desired grade of concrete, a number of trials have to be carried out, and only after evaluating the different parameters of long-term performance can the concrete be finalized for use for different purposes. This research work is carried out to address the problems of time, cost and serviceability in the field of construction. In this research work, an artificial neural network is introduced to fix the proportions of concrete ingredients, with 50% waste replacement, for M20, M25, M30, M35, M40, M45, M50, M55 and M60 grades of concrete. Using the neural network, the mix design of high performance concrete was finalized, and the main basic mechanical properties were predicted at 3 days, 7 days and 28 days. The predicted strength was compared with the actual experimental mix design and concrete cube strength after 3 days, 7 days and 28 days. This experimental and neural-network-based mix design can be used practically in the field to provide cost-effective, time-saving, feasible and sustainable high performance concrete for different types of structures.
Keywords: artificial neural network, high performance concrete, rebound hammer, strength prediction
Procedia PDF Downloads 158
2718 Localization of Geospatial Events and Hoax Prediction in the UFO Database
Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi
Abstract:
Unidentified Flying Objects (UFOs) have been an interesting topic for most enthusiasts, and hence people all over the United States report such findings online at the National UFO Report Center (NUFORC). Some of these reports are hoaxes, and among those that seem legitimate, our task is not to establish that these events confirm flying objects from aliens in outer space. Rather, we intend to identify whether a report was a hoax, as identified by the UFO database team with their existing curation criteria. However, the database provides a wealth of information that can be exploited to provide various analyses and insights, such as social reporting, identifying real-time spatial events and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We look at cluster density and data visualization to search the space of various cluster realizations to decide the most probable clusters that provide information about the proximity of such activity. A random forest classifier is also presented, used to identify true events and hoax events with the best features available, such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate it with real-time events, where one of the UFO reports strongly correlates with a missile test conducted in the United States.
Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events
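A minimal sketch of the two steps described: density-based clustering of report coordinates, then a random forest over simple report features for hoax classification. The feature set mirrors the one named in the abstract (region, week, time period, duration), but the arrays are synthetic placeholders for the curated NUFORC data:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Step 1: spatial clustering of report locations (lat, lon in degrees)
coords = rng.uniform([25, -125], [49, -67], size=(500, 2))  # continental US box
clusters = DBSCAN(eps=1.5, min_samples=5).fit_predict(coords)
print("clusters found:", len(set(clusters)) - (1 if -1 in clusters else 0))

# Step 2: hoax classification from report features
# Placeholder features: [region id, week of year, hour of day, duration (min)]
X = np.column_stack([rng.integers(0, 9, 500), rng.integers(1, 53, 500),
                     rng.integers(0, 24, 500), rng.exponential(5, 500)])
y = rng.integers(0, 2, 500)  # 1 = hoax, 0 = likely legitimate
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.feature_importances_)
```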
Procedia PDF Downloads 378
2717 A Comparative Study of Self, Peer and Teacher Assessment Based on an English Writing Checklist
Authors: Xiaoting Shi, Xiaomei Ma
Abstract:
In higher education, students' self-assessment and peer assessment of compositions in writing classes can effectively improve their ability of evaluative judgment. However, students' self-assessment and peer assessment are not advocated by most teachers because of the significant differences in scoring compared with teacher assessment. This study used a multi-faceted Rasch model to explore whether an English writing checklist containing 30 descriptors can effectively improve rating consistency among self-assessment, peer assessment and teacher assessment. Meanwhile, a questionnaire was adopted to survey students’ and teachers’ attitudes toward self-assessment and peer assessment using the writing checklist. Results of the multi-faceted Rasch model analysis show that the writing checklist can effectively distinguish the students’ writing ability (separation coefficient = 2.05, separation reliability = 0.81, chi-square value (df = 32) = 123.4). Moreover, the results revealed that the checklist could improve rating consistency among self-assessment, peer assessment and teacher assessment (separation coefficient = 1.71, separation reliability = 0.75, chi-square value (df = 4) = 20.8). The results of the questionnaire showed that more than 85% of students and all teachers believed that the checklist had a good advantage in self-assessment and peer assessment, and they were willing to use the checklist to conduct self-assessment and peer assessment in class in the future.
Keywords: english writing, self-assessment, peer assessment, writing checklist
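For reference, the many-facet Rasch model behind these separation statistics extends the basic Rasch model with facets for rater severity and rating-category thresholds. A common rating-scale form is shown below; the exact facet structure used in the study is an assumption here:

```latex
\log \frac{P_{nijk}}{P_{nij(k-1)}} = B_n - D_i - C_j - F_k
```

where P_nijk is the probability that student n receives category k from rater j on descriptor i, B_n is the student's writing ability, D_i the difficulty of checklist descriptor i, C_j the severity of rater j (self, peer, or teacher), and F_k the threshold between rating categories k-1 and k.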
Procedia PDF Downloads 155
2716 In silico Analysis of a Causative Mutation in Cadherin-23 Gene Identified in an Omani Family with Hearing Loss
Authors: Mohammed N. Al Kindi, Mazin Al Khabouri, Khalsa Al Lamki, Tommasso Pappuci, Giovani Romeo, Nadia Al Wardy
Abstract:
Hereditary hearing loss is a heterogeneous group of complex disorders, with an overall incidence of one in every five hundred newborns, presenting in syndromic and non-syndromic forms. Cadherin-related 23 (CDH23) is one of the listed deafness-causative genes. CDH23 is expressed in the stereocilia of hair cells and in retinal photoreceptor cells. Defective CDH23 has been associated mostly with prelingual severe-to-profound sensorineural hearing loss (SNHL), in either a syndromic (USH1D) or non-syndromic (DFNB12) form. An Omani family diagnosed clinically with severe-to-profound sensorineural hearing loss was genetically analysed by whole exome sequencing. A novel homozygous missense variant, c.A7451C (p.D2484A), in exon 53 of CDH23 was detected. One hundred and thirty control samples were analysed, and all were negative for the detected variant. The variant was analysed in silico for pathogenicity verification using several mutation prediction software tools. The variant proved to be a pathogenic mutation and is reported for the first time in Oman and worldwide. It is concluded that in silico mutation prediction analysis might be used as a useful molecular diagnostic tool benefiting both genetic counseling and mutation verification. The aspartic acid 2484 to alanine missense substitution might be the main disease-causing mutation that damages CDH23 function and could be used as a genetic hearing loss marker for this particular Omani family.
Keywords: CDH23, D2484A, in silico, Oman
Procedia PDF Downloads 218
2715 A Comprehensive Review of Artificial Intelligence Applications in Sustainable Building
Authors: Yazan Al-Kofahi, Jamal Alqawasmi
Abstract:
In this study, a systematic literature review (SLR) was conducted with the main goal of assessing the existing literature on how artificial intelligence (AI), machine learning (ML) and deep learning (DL) models are used in sustainable architecture applications and issues, including thermal comfort satisfaction, energy efficiency, cost prediction and many other issues. For this reason, the search strategy was initiated using different databases, including Scopus, Springer and Google Scholar. The inclusion criteria were applied with two search strings related to DL, ML and sustainable architecture. Moreover, the timeframe for the inclusion of papers was open, even though most of the papers were from the previous four years. As a paper filtration strategy, conferences and books were excluded from the database search results. Using these inclusion and exclusion criteria, the search was conducted, and a sample of 59 papers was selected as the final set included in the analysis. The data extraction phase consisted of extracting the needed data from these papers, which were then analyzed and correlated. The results of this SLR showed that there are many applications of ML and DL in sustainable buildings and that this topic is currently trendy. It was found that most of the papers focused their discussions on addressing environmental sustainability issues and factors using machine learning predictive models, with a particular emphasis on the use of decision tree algorithms. Moreover, it was found that the random forest regressor demonstrates strong performance across all feature selection groups in terms of cost prediction of the building as a machine-learning predictive model.
Keywords: machine learning, deep learning, artificial intelligence, sustainable building
Procedia PDF Downloads 67
2714 Identifying Reforms Required in Construction Contracts from Resolved Disputed Cases
Authors: K. C. Iyer, Yogita Manan Bindal, Sumit Kumar Bakshi
Abstract:
The construction industry in India is plagued with disputes and litigation, with many stalled projects seeking dispute resolution. This has an adverse effect on performance and overall project delivery and impacts future investments within the industry. While the construction industry is a major driver of growth, there have not been major reforms in government construction contracts. The study is aimed at identifying proactive means of dispute avoidance, focusing on reforms required within construction contracts, by studying 49 arbitration awards of construction disputes. The claims presented in the awards are aggregated to study the causes linked to the contract document and are compared against the prospective recommendations and practices surveyed from a literature review of research papers. Within contract administration, record keeping has been a major concern, as records are required by the parties to substantiate claims or counterclaims and are therefore essential in any dispute redressal process. The study also observes that right judgment is inhibited when record keeping is improper, and that, due to a lack of coherence between documents, the dispute resolution period is prolonged. The findings of the research will be relevant to industry practitioners in contract drafting with a view to avoiding disputes.
Keywords: construction contract, contract administration, contract management, dispute avoidance
Procedia PDF Downloads 266
2713 Artificial Neural Network Approach for Modeling Very Short-Term Wind Speed Prediction
Authors: Joselito Medina-Marin, Maria G. Serna-Diaz, Juan C. Seck-Tuoh-Mora, Norberto Hernandez-Romero, Irving Barragán-Vite
Abstract:
Wind speed forecasting is an important issue for planning wind power generation facilities. Accuracy in wind speed prediction allows good performance of wind turbines for electricity generation. A model based on artificial neural networks is presented in this work. A dataset with atmospheric information about air temperature, atmospheric pressure, wind direction, and wind speed in Pachuca, Hidalgo, México, was used to train the artificial neural network. The data were downloaded from the web page of the National Meteorological Service of the Mexican government. The records were gathered for three months, at time intervals of ten minutes. This dataset was used to develop an iterative algorithm to create 1,110 ANNs with different configurations, from one to three hidden layers and 1 to 10 neurons per hidden layer. Each ANN was trained with the Levenberg-Marquardt backpropagation algorithm, which is used to learn the relationship between input and output values. The model with the best performance contains three hidden layers with 9, 6, and 5 neurons, respectively; the coefficient of determination obtained was r²=0.9414, and the root mean squared error is 1.0559. In summary, the ANN approach is suitable to predict the wind speed in Pachuca City because the r² value denotes a good fit to the gathered records, and the obtained ANN model can be used in the planning of wind power generation grids.
Keywords: wind power generation, artificial neural networks, wind speed, coefficient of determination
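The 1,110 configurations follow from enumerating 1 to 10 neurons over one, two, and three hidden layers: 10 + 10² + 10³ = 1,110. A minimal sketch of that architecture search is below, on placeholder data; note that scikit-learn's MLPRegressor does not offer Levenberg-Marquardt training, so the 'lbfgs' solver stands in for it here:

```python
import itertools
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Placeholder records: [air temp, pressure, wind direction] -> wind speed
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = X @ np.array([0.5, -0.2, 0.1]) + 0.1 * rng.normal(size=1000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

configs = [c for depth in (1, 2, 3)
           for c in itertools.product(range(1, 11), repeat=depth)]
assert len(configs) == 1110  # 10 + 10^2 + 10^3 architectures

# Trains all 1,110 networks, so this takes a while on real data
best = max(
    configs,
    key=lambda c: r2_score(
        y_te,
        MLPRegressor(hidden_layer_sizes=c, solver="lbfgs",
                     max_iter=2000, random_state=0)
        .fit(X_tr, y_tr).predict(X_te)),
)
print("best architecture:", best)
```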
Procedia PDF Downloads 125
2712 Commercial Law Between Custom and Islamic Law
Authors: Mohamed Zakareia Ghazy Aly Belal
Abstract:
Commercial law is the set of legal rules that apply to business and regulate the conduct of trade. This means that commercial law regulates only certain relations, those that arise as a result of carrying out certain businesses, as it regulates the activity of a specific class, the class of merchants. Commercial law, like other branches of law, has characteristics that distinguish it from other laws, and various sources from which its basis is derived: the objective or material source, the historical source, the official source and the interpretative source; here we are limited to official and interpretative sources. So what are these sources, and what is their degree and strength when relied upon in commercial disputes? The first topic: characteristics of commercial law. Commercial law has become indispensable to the world of trade and economics, given the reasons that have been established as legal rules for the commercial field. In fact, it is sufficient to contrast the stability of the civil environment with the movement and speed of the commercial environment, together with confidence and credit: the characteristics of speed, trust and credit are the ones that justify the existence of commercial law. Commercial business is fast, while civil business is slow and stable. A person concludes only a few civil transactions in his life, and before any civil act he must have a period of thinking and scrutiny. For example, a person who wants to acquire a house to live in with his family must search, investigate and discuss the price before concluding a purchase contract. In the commercial field, transactions take place very quickly because the time factor has an important role in concluding deals and achieving profits; delay in contracting a specific deal may cause the merchant a loss, given the linkage of commerce to the fluctuations of the economy and the market. The merchant may also conclude more than one deal in a single short period. This is possible because commercial law is free of the formalities and procedures that hinder commercial transactions.
Keywords: law, commercial law, business, commercial field
Procedia PDF Downloads 73
2711 Development of a Practical Screening Measure for the Prediction of Low Birth Weight and Neonatal Mortality in Upper Egypt
Authors: Prof. Ammal Mokhtar Metwally, Samia M. Sami, Nihad A. Ibrahim, Fatma A. Shaaban, Iman I. Salama
Abstract:
Objectives: Reducing neonatal mortality by 2030 is still a challenging goal in developing countries. Low birth weight (LBW) is a significant contributor to this, especially where weighing newborns is not routinely possible. The present study aimed to determine simple, easy, reliable anthropometric measure(s) that can predict low birth weight (LBW) and neonatal mortality. Methods: In a prospective cohort study, 570 babies born in districts of El Menia governorate, Egypt (where most deliveries occurred at home) were examined at birth. Newborn weight, length, head, chest, mid-arm, and thigh circumferences were measured. The examined neonates were followed up during their first four weeks of life to record any mortality. The most predictive anthropometric measures were determined using the statistical package SPSS, and multiple logistic regression analysis was performed. Results: Head and chest circumferences with cut-off points < 33 cm and ≤ 31.5 cm, respectively, were the significant predictors of LBW. They carried the best combination of the highest sensitivity (89.8% & 86.4%) and the lowest false negative predictive value (1.4% & 1.7%). Chest circumference with a cut-off point ≤ 31.5 cm was the significant predictor of neonatal mortality, with 83.3% sensitivity and a 0.43% false negative predictive value. Conclusion: Using chest circumference with a cut-off point ≤ 31.5 cm is recommended as a single simple anthropometric measurement for the prediction of both LBW and neonatal mortality. The measure could act as a substitute for weighing newborns in communities where scales are not routinely available.
Keywords: low birth weight, neonatal mortality, anthropometric measures, practical screening
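The screening rule reduces to a single comparison, and its operating characteristics can be checked directly from data. A minimal sketch using the study's cut-off but synthetic placeholder measurements:

```python
import numpy as np

def screen_stats(measure_cm, is_positive, cutoff):
    """Sensitivity/specificity of 'flag if measure <= cutoff'
    (e.g., chest circumference <= 31.5 cm predicting LBW)."""
    flagged = measure_cm <= cutoff
    tp = np.sum(flagged & is_positive)
    fn = np.sum(~flagged & is_positive)
    tn = np.sum(~flagged & ~is_positive)
    fp = np.sum(flagged & ~is_positive)
    return tp / (tp + fn), tn / (tn + fp)  # sensitivity, specificity

rng = np.random.default_rng(3)
chest = rng.normal(32.5, 1.8, size=570)        # placeholder circumferences
lbw = chest + rng.normal(0, 1.0, 570) < 31.0   # placeholder LBW status
sens, spec = screen_stats(chest, lbw, cutoff=31.5)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```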
Procedia PDF Downloads 101
2710 Temporal and Spatial Distribution Prediction of Patinopecten yessoensis Larvae in Northern China Yellow Sea
Authors: RuiJin Zhang, HengJiang Cai, JinSong Gui
Abstract:
It takes Patinopecten yessoensis larvae more than 20 days to go from spawning to settlement. Due to natural environmental factors such as currents, Patinopecten yessoensis larvae are transported over distances of hundreds of kilometers, leading to high instability in their spatial and temporal distribution and great difficulties in natural spat collection. Therefore, predicting the distribution is of great significance for improving the operating efficiency of collection. A hydrodynamic model of the northern China Yellow Sea was established based on the equations of motion of physical oceanography and verified against tidal harmonic constants and measured velocities in Dalian Bay. According to the passive drift characteristics of the larvae, and by combining the hydrodynamic model with a particle tracking model, a spatial and temporal distribution prediction model was established, and the spatial and temporal distribution of the larvae under the influence of flow and wind was simulated. It can be concluded from the model results that ocean currents have the greatest impact on the passive drift path and diffusion of Patinopecten yessoensis larvae; the impact of wind is also important, as it changes the direction and speed of the drift. Patinopecten yessoensis larvae were generated in the sea along Zhangzi Island and Guanglu-Dachangshan Island, but after two months, under the impact of wind and currents, the larvae appeared west of Dalian and south of Lvshun, and even in Bohai Bay. The model results are consistent with the qualitative analyses in the relevant literature, and this conclusion explains where the larvae come from, from the perspective of numerical simulation.
Keywords: numerical simulation, Patinopecten yessoensis larvae, predicting model, spatial and temporal distribution
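The passive-drift step at the heart of such a prediction model is a Lagrangian update: each larva moves with the local current plus a small wind-drag term. A minimal forward-Euler sketch with a synthetic, steady flow field; in the real model the velocities would come from the hydrodynamic model at each time step, and the wind factor is an assumed illustrative value:

```python
import numpy as np

def drift(positions, current_fn, wind, hours, dt_h=1.0, wind_factor=0.03):
    """Forward-Euler passive drift; positions in km, velocities in km/h."""
    for _ in range(int(hours / dt_h)):
        u = current_fn(positions)                      # current at each larva
        positions = positions + (u + wind_factor * wind) * dt_h
    return positions

def steady_current(p):
    # Synthetic rotating current field (stand-in for model output)
    x, y = p[:, 0], p[:, 1]
    r = 1 + np.hypot(x, y)
    return 0.2 * np.column_stack([-y / r, x / r])

larvae = np.random.default_rng(7).normal(0, 5, size=(1000, 2))  # release sites
wind = np.array([1.5, 0.5])                                     # km/h, steady
final = drift(larvae, steady_current, wind, hours=20 * 24)      # ~20-day drift
print(final.mean(axis=0), final.std(axis=0))
```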
Procedia PDF Downloads 305
2709 A Three Elements Vector Valued Structure’s Ultimate Strength-Strong Motion-Intensity Measure
Authors: A. Nicknam, N. Eftekhari, A. Mazarei, M. Ganjvar
Abstract:
This article presents an alternative collapse capacity intensity measure in three-element form, which is influenced by the spectral ordinates at periods longer than the first-mode period at near- and far-source sites. A parameter, denoted by β, is defined by which the effects of spectral ordinates up to the effective period (2T_1) on the intensity measure are taken into account. The methodology permits meeting the hazard-level target extreme event in probabilistic and deterministic forms. A MATLAB code involving OpenSees is developed to calculate the collapse capacities of 8 archetype RC structures, having 2 to 20 stories, for the regression process. The incremental dynamic analysis (IDA) method is used to calculate the structures' collapse values, accounting for element stiffness and strength deterioration. The general near-field set presented by FEMA is used in a series of nonlinear analyses. 8 linear relationships are developed for the 8 structures, leading to correlation coefficients of up to 0.93. A collapse capacity near-field prediction equation is developed taking into account the results of the regression processes obtained from the 8 structures. The proposed prediction equation is validated against a set of actual near-field records, leading to good agreement. Implementation of the proposed equation on four archetype RC structures demonstrated different collapse capacities at near-field sites compared to those of FEMA. The differences are believed to be due to accounting for spectral shape effects.
Keywords: collapse capacity, fragility analysis, spectral shape effects, IDA method
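One common way to encode the spectral-shape effect that the β parameter targets is an averaged spectral ordinate over the elongated-period range [T_1, 2T_1]. The geometric-mean form sketched below is a standard illustration of that idea, not the paper's exact three-element measure; the spectrum is synthetic:

```python
import numpy as np

def sa_avg(periods, sa, t1, n=20):
    """Geometric mean of spectral acceleration over [T1, 2*T1]."""
    ts = np.linspace(t1, 2.0 * t1, n)
    sa_interp = np.interp(ts, periods, sa)  # response spectrum lookup
    return np.exp(np.mean(np.log(sa_interp)))

# Placeholder response spectrum of one ground motion record
periods = np.linspace(0.05, 5.0, 100)
sa = 1.2 * np.exp(-periods)                 # synthetic decaying spectrum (g)
print(sa_avg(periods, sa, t1=1.0))          # averaged ordinate used as the IM
```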
Procedia PDF Downloads 239
2708 Human Immune Response to Surgery: The Surrogate Prediction of Postoperative Outcomes
Authors: Husham Bayazed
Abstract:
Immune responses following surgical trauma play a pivotal role in determining postoperative outcomes, from healing and recovery to postoperative complications. Postoperative complications, including infections and protracted recovery, occur in a significant share of the roughly 300 million surgeries performed annually worldwide. Complications cause personal suffering along with a significant economic burden on the healthcare system of any community. The accurate prediction of postoperative complications and patient-targeted interventions for their prevention remain major clinical challenges. Recent Findings: Recent studies are focusing on immune dysregulation mechanisms that occur in response to surgical trauma as a key determinant of postoperative complications. Earlier studies mainly delved into the detection of inflammatory plasma markers, which provide important clues regarding pathogenesis. However, recent single-cell technologies, such as mass cytometry or single-cell RNA sequencing, have markedly enhanced our ability to understand the immunological basis of postoperative immunological trauma complications and to identify their prognostic biological signatures. Summary: The advent of proteomic technologies has significantly advanced our ability to predict the risk of postoperative complications. Multiomic modeling of patients' immune states holds promise for the discovery of preoperative predictive biomarkers, providing patients and surgeons with information to improve surgical outcomes. However, more studies are required to accurately predict the risk of postoperative complications in individual patients.
Keywords: immune dysregulation, postoperative complications, surgical trauma, flow cytometry
Procedia PDF Downloads 87
2707 Studying the Temperature Field of Hypersonic Vehicle Structure with Aero-Thermo-Elasticity Deformation
Authors: Geng Xiangren, Liu Lei, Gui Ye-Wei, Tang Wei, Wang An-ling
Abstract:
The malfunction of the thermal protection system (TPS) caused by aerodynamic heating is a latent threat to aircraft structural safety. Accurately predicting the structural temperature field is therefore quite important for the TPS design of a hypersonic vehicle. Since Thornton's work in 1988, the coupled treatment of aerodynamic heating and heat transfer has developed rapidly. However, little attention has been paid to the influence of structural deformation on aerodynamic heating and the structural temperature field. In flight, especially long-endurance flight, the structural deformation caused by aerodynamic heating and temperature rise has a direct impact on the aerodynamic heating and the structural temperature field. Thus, the coupled interaction cannot be neglected. In this paper, based on the method of static aero-thermo-elasticity and considering the influence of aero-thermo-elastic deformation, the coupled aerodynamic heating and heat transfer results of a hypersonic vehicle wing model were calculated. The results show that, for low-curvature regions, such as the fuselage or wing center-section, structural deformation has little effect on the temperature field. However, for stagnation regions with high curvature, the coupled effect is not negligible. Thus, taking the effect of elastic deformation into account is quite important for structural temperature prediction. This work has laid a solid foundation for improving the prediction accuracy of the temperature distribution of aircraft structures and the evaluation capacity of structural performance.
Keywords: aerothermoelasticity, elastic deformation, structural temperature, multi-field coupling
Procedia PDF Downloads 341
2706 The African Notion of Moral Personhood
Authors: Meshandren Naidoo
Abstract:
Personhood is an important philosophical and ethical device that underlies many major ethical and legal issues. The concept of African personhood is often overlooked; however, given the decolonization projects occurring in Africa, it is important to consider this view. African personhood, as opposed to Western personhood, is not individualistic in nature. The latter is predominantly Kantian and based on the notion that all persons have equal moral worth due to their capacity for reason, whereas communitarianism is central to an African conception of personhood.
Keywords: African philosophy, bioethics, ethics, personhood
Procedia PDF Downloads 122
2705 Homosexuality and Culture: A Case Study Depicting the Struggles of a Married Lady
Authors: Athulya Jayakumar, M. Manjula
Abstract:
Though there has been a shift in the understanding of homosexuality away from being a sin, crime or pathology in medical and legal perspectives, the acceptance of homosexuality still remains very limited in the Indian subcontinent. The present case study concerns a 24-year-old female who has completed a diploma in polytechnic engineering and resides in the state of Kerala. She initially presented with her husband, with complaints of lack of sexual desire and non-cooperation from the index client. After a few initial sessions, the client revealed, in an individual session, her homosexual orientation, which was unknown to her family. She has had multiple short-term relationships with females and never had any heterosexual orientation/interest. During her adolescence, she wondered whether she could change herself into a male; however, she currently accepts her gender. She never wanted a heterosexual marriage but had to succumb to the pressure of her mother, as a result of a series of unexpected incidents at home, and had to agree to the marriage, also in the hope that she might become bisexual. The client was able to bond with the husband emotionally, but the multiple attempts at sexual intercourse, at the insistence of the husband, were never pleasurable and induced a sense of disgust. There has been no sexual activity for several months now. She also actively avoids any chance of warm communication with him so as to avoid the possibility of him approaching her sexually. The case study is an attempt to highlight the culture and the struggles of a homosexual individual who comes to therapy wanting to be a 'normal wife' despite having knowledge of the legal rights and scenario. There is a scarcity of Indian literature that has systematically investigated issues related to homosexuality. Data on prevalence, emotional problems faced and clinical services available are sparse, though such data are crucial for increasing the understanding of sexual behaviour, orientation and the difficulties faced in India.
Keywords: case study, culture, cognitive behavior therapy, female homosexuality
Procedia PDF Downloads 346