Search results for: sound propagation models
4624 A Survey of Domain Name System Tunneling Attacks: Detection and Prevention
Authors: Lawrence Williams
Abstract:
As the mechanism that converts domain names to Internet Protocol (IP) addresses, the Domain Name System (DNS) is an essential part of internet usage. It was not designed with security in mind and is subject to attack. DNS attacks have become more frequent and sophisticated, and detecting and preventing them is increasingly important for the modern network. DNS tunneling attacks are one such type, used primarily for distributed denial-of-service (DDoS) attacks and data exfiltration. This survey discusses different techniques to detect and prevent DNS tunneling attacks, covering the methods, models, experiments, and data for each technique. A proposal about feasibility is made, and future research on these topics is suggested.
Keywords: DNS, tunneling, exfiltration, botnet
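A minimal illustration of one detection technique commonly discussed in this literature: payload-carrying DNS queries tend to have long, high-entropy labels. The following Python sketch applies that heuristic; the thresholds and example queries are invented for demonstration and are not drawn from the surveyed papers.

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits per character of a label; tunnelled payloads look near-random."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values()) if n else 0.0

def looks_like_tunnel(query: str, max_label_len: int = 40, entropy_cutoff: float = 3.8) -> bool:
    """Flag a query whose leftmost label is unusually long or high-entropy (assumed cutoffs)."""
    label = query.split(".")[0]
    return len(label) > max_label_len or shannon_entropy(label) > entropy_cutoff

queries = [
    "www.example.com",                                                # benign
    "a91f3c0d2e4b5a6f7c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f.evil.example",  # exfiltration-style payload
]
for q in queries:
    print(q, "->", "suspicious" if looks_like_tunnel(q) else "ok")
```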
Procedia PDF Downloads 75
4623 High Culture or Low Culture: The Propagation and Popularization of the Classic of Poetry in Modern China
Authors: Fang Tang
Abstract:
A major Confucian masterpiece and the earliest known poetry anthology (composed approximately 1046-771 BCE), The Classic of Poetry reflects the different cultures of ancient China. It is regarded as a Chinese classic and one of the world’s most significant written works, an essential part of our global cultural heritage. This paper explores how this ancient Chinese classic was transformed into part of popular culture, found in folk songs circulated in Fangxian county, a mountainous location in Hubei province in central mainland China. It is the hometown of one of the best-known authors of The Classic of Poetry, Yin Jifu. Local villagers process, refine, and recreate these poems into popular folk songs, which have been handed down from generation to generation. The folk songs based on The Classic of Poetry vividly reflect local customs, lifestyles, and various cultural activities. After thousands of years of singing these traditional songs, the region has become an important area for maintaining part of China’s cultural heritage; here, the original high culture has been converted into a popular culture absorbed into people’s daily life. Based on a year’s field research and many interviews with local singers, this paper explores the ways in which locals have transformed the contents of The Classic of Poetry. It examines how these popular folk songs have become part of a much-treasured cultural heritage, illustrating the transformation of traditional high culture into popular culture. The paper argues that the modern adaptations of the traditional poems of The Classic of Poetry combine both oral and written cultural heritage and reflect the interaction between ancient Chinese official literature and folk literature. The paper also explores the reasons why the folk songs of The Classic of Poetry are so popular in the area, including the influence of the author Yin Jifu, the impact of ancient diasporic culture moving from the political centre to remote rural areas, and the interactions between local cultures (known as Chu culture) and Chinese mainstream cultural policies.
Keywords: high/low culture, The Classic of Poetry, the functions of media, cultural policy
Procedia PDF Downloads 104
4622 Ownership and Shareholder Schemes Effects on Airport Corporate Strategy in Europe
Authors: Dimitrios Dimitriou, Maria Sartzetaki
Abstract:
In the early days of civil aviation, airports were wholly state-owned companies under the control of national authorities or regional governmental bodies. Since then the picture has changed completely: airport privatisation and airport business commercialisation have become key success factors in stimulating air transport demand, generating revenues and attracting investors, linked to the reliability and resilience of the air transport system. Nowadays, airport corporate strategy deals with policies and actions that essentially affect business plans, financial targets and the economic footprint in the regional economy airports serve. Exploring airport corporate strategy is therefore essential to support decisions in business planning, management efficiency, sustainable development and investment attractiveness on the one hand, and to define policies towards traffic development, revenue generation, capacity expansion, cost efficiency and corporate social responsibility on the other. This paper explores key outputs of airport corporate strategy for different ownership schemes. Airport corporations are grouped into three major schemes: (a) public, in which the public airport operator acts as part of the government administration or as a corporatised public operator; (b) mixed, in which the majority of the shares and the corporate strategy are driven by either the private or the public sector; and (c) private, in which the airport strategy is driven by the key aspects of globalisation and liberalisation of the aviation sector. Through a systemic approach, the key drivers of corporate strategy for modern airport business structures are defined. The key objectives are to identify the main strategic opportunities and challenges and to assess the corporate goals and risks towards sustainable business development for each scheme. The analysis is based on an extensive cross-sectional dataset for a sample of busy European airports, providing results on corporate strategy priorities, risks and business models. The conclusions highlight key messages to authorities, institutes and professionals on airport corporate strategy trends and directions.
Keywords: airport corporate strategy, airport ownership, airports business models, corporate risks
Procedia PDF Downloads 304
4621 Exploration of Hydrocarbon Unconventional Accumulations in the Argillaceous Formation of the Autochthonous Miocene Succession in the Carpathian Foredeep
Authors: Wojciech Górecki, Anna Sowiżdżał, Grzegorz Machowski, Tomasz Maćkowski, Bartosz Papiernik, Michał Stefaniuk
Abstract:
The article presents results of a project that aims to evaluate the possibilities of effective development and exploitation of natural gas from the argillaceous series of the Autochthonous Miocene in the Carpathian Foredeep. To achieve this objective, the research team developed a unique methodology of processing and interpretation, based on world trends but adjusted to the data, local variations and petroleum characteristics of the area. In order to determine the zones in which maximum volumes of hydrocarbons might have been generated and preserved as shale gas reservoirs, and to identify the most favourable well sites where the largest gas accumulations are anticipated, a number of tasks were accomplished. The evaluation of petrophysical properties and hydrocarbon saturation of the Miocene complex is based on laboratory measurements as well as interpretation of well logs and archival data. The studies apply mercury porosimetry (MICP), micro-CT and nuclear magnetic resonance imaging (using the Rock Core Analyzer). For a prospective location (e.g. the central part of the Carpathian Foredeep, the Brzesko-Wojnicz area), reprocessing and reinterpretation of detailed seismic survey data with the use of integrated geophysical investigations has been carried out. Quantitative, structural and parametric models for selected areas of the Carpathian Foredeep are constructed on the basis of integrated, detailed 3D computer models. The modelling is carried out with Schlumberger’s Petrel software. Finally, prospective zones are spatially contoured in the form of a regional 3D grid, which will be the framework for generation modelling and comprehensive parametric mapping, allowing spatial identification of the most prospective zones of unconventional gas accumulation in the Carpathian Foredeep. Preliminary results indicate a potentially prospective area for the occurrence of unconventional gas accumulations in the Polish part of the Carpathian Foredeep.
Keywords: autochthonous Miocene, Carpathian Foredeep, Poland, shale gas
Procedia PDF Downloads 228
4620 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood
Authors: Randa Alharbi, Vladislav Vyshemirsky
Abstract:
Systems biology is an important field of science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their functions, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining its essential components, and representing an appropriate law for the interactions between those components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model: it describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet inference in such a complex system is challenging, as it requires evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation (ABC) is a common approach for tackling inference which relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues of each method, taking their computational time into account. We demonstrate likelihood-free inference by analysing a model of the Repressilator using both methods. A detailed investigation is performed to quantify the difference between the methods in terms of efficiency and computational cost.
Keywords: Approximate Bayesian computation (ABC), Continuous-Time Markov Chains, Sequential Monte Carlo, Particle Markov chain Monte Carlo (PMCMC)
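To make the ABC idea concrete, here is a minimal Python sketch: a Gillespie simulator for a simple birth-death CTMC stands in for the (much larger) Repressilator model, and rejection ABC draws birth rates from a uniform prior, keeping those whose simulated outcome lies within a tolerance of the observation. All rates, priors, and tolerances are illustrative assumptions, not values from the paper.

```python
import random

def gillespie_birth_death(k, gamma, t_end, n0=0, seed=None):
    """Exact stochastic simulation of a birth-death CTMC: 0 -> X at rate k, X -> 0 at rate gamma*n."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    while True:
        total = k + gamma * n
        t += rng.expovariate(total)      # exponential waiting time to the next event
        if t > t_end:
            return n                     # molecule count at the observation time
        if rng.random() < k / total:
            n += 1                       # birth event
        else:
            n -= 1                       # death event

# ABC rejection: keep proposed birth rates whose simulation is close to the observation.
gamma_true, t_end = 0.1, 50.0
observed = gillespie_birth_death(k=2.0, gamma=gamma_true, t_end=t_end, seed=1)

accepted = []
rng = random.Random(2)
while len(accepted) < 200:
    k_prop = rng.uniform(0.0, 5.0)       # uniform prior on the birth rate (assumed)
    sim = gillespie_birth_death(k_prop, gamma_true, t_end)
    if abs(sim - observed) <= 2:         # tolerance on the summary statistic (assumed)
        accepted.append(k_prop)

print("approximate posterior mean of k:", sum(accepted) / len(accepted))
```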
Procedia PDF Downloads 202
4619 Digital Game Fostering Spatial Abilities for Children with Special Needs
Authors: Pedro Barros, Ana Breda, Eugenio Rocha, M. Isabel Santos
Abstract:
As visual and spatial awareness develops, children’s apprehension of the concepts of direction, (relative) distance and (relative) location materializes. Here we present the inclusive educational digital game ORIESPA, under development by the Thematic Line Geometrix for children aged between 6 and 10 years old, which aims to improve their visual and spatial awareness. Visual-spatial abilities are of crucial importance for success in many everyday tasks. Unavoidable in the technological age we live in, they are essential in many fields of study, for instance mathematics. The game, set in a 2D/3D environment, focuses on tasks and challenges in the following categories: (1) static orientation of the subject and object, requiring an understanding of the notions up-down, left-right, front-back, higher-lower and nearer-farther; (2) interpretation of perspectives of three-dimensional objects, requiring an understanding of 2D and 3D representations of three-dimensional objects; and (3) orientation of the subject in real space, requiring the reading and interpretation of itineraries. In ORIESPA, the simpler tasks are based on a quadrangular grid, where the front-back and left-right directions and rotations of 90º, 180º and 270º are the main requirements. The more complex ones take place on a cubic grid, adding up and down movements. At the first levels, the game mechanics for reading and interpreting maps (from point A to point B) are based on map routes following a given set of instructions. At higher levels, the player must produce a list of instructions taking the game character to the desired destination while avoiding obstacles, as in the sketch below. Being an inclusive game, it allows the user to interact through the mouse (point and click with a single button), the keyboard (a small set of well-recognized keys) or a Kinect device (using simple gesture moves). Character control requires acting on buttons corresponding to movements in 2D and 3D environments. Buttons and instructions are also complemented with text, sound and sign language.
Keywords: digital game, inclusion, itinerary, spatial ability
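The itinerary mechanic described above (executing a list of movement instructions on a grid) can be sketched as follows; the instruction names and grid conventions are illustrative, not taken from the ORIESPA implementation.

```python
HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # up, right, down, left

def run_itinerary(instructions, start=(0, 0), heading=0, obstacles=frozenset()):
    """Execute forward/left/right instructions on a quadrangular grid, failing on obstacles."""
    x, y = start
    for step in instructions:
        if step == "left":
            heading = (heading - 1) % 4        # a 90-degree counter-clockwise rotation
        elif step == "right":
            heading = (heading + 1) % 4        # a 90-degree clockwise rotation
        elif step == "forward":
            dx, dy = HEADINGS[heading]
            if (x + dx, y + dy) in obstacles:
                raise ValueError(f"blocked at {(x + dx, y + dy)}")
            x, y = x + dx, y + dy
    return (x, y)

print(run_itinerary(["forward", "right", "forward", "forward"]))  # -> (2, 1)
```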
Procedia PDF Downloads 180
4618 The Collaboration between Resident and Non-resident Patent Applicants as a Strategy to Accelerate Technological Advance in Developing Nations
Authors: Hugo Rodríguez
Abstract:
Migrations of researchers, scientists, and inventors are a widespread phenomenon in modern times. In some cases, migrants stay linked to research groups in their countries of origin, whether out of their own conviction or because of government policies. We examine different linear models of technological development (using the Ordinary Least Squares (OLS) technique) in eight selected countries and find that collaborations between resident and non-resident patent applicants correlate with different levels of performance of technological policies in three different scenarios. The reinforcement of that link must therefore be considered a powerful tool for technological development.
Keywords: development, collaboration, patents, technology
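A minimal sketch of the kind of OLS fit described above, regressing a country's patent output on the share of resident/non-resident co-applications; the data and variable choice are hypothetical, for illustration only.

```python
import numpy as np

# Hypothetical data for one country: annual share of resident/non-resident
# co-applications vs. annual resident patent filings (invented numbers).
collab_share = np.array([0.05, 0.08, 0.11, 0.15, 0.18, 0.22, 0.27, 0.30])
patents      = np.array([410, 445, 480, 540, 575, 640, 700, 735], dtype=float)

X = np.column_stack([np.ones_like(collab_share), collab_share])  # intercept + slope
beta, *_ = np.linalg.lstsq(X, patents, rcond=None)               # OLS estimate
residuals = patents - X @ beta
print(f"intercept={beta[0]:.1f}, slope={beta[1]:.1f}, RSS={residuals @ residuals:.1f}")
```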
Procedia PDF Downloads 127
4617 Study on the Model Predicting Post-Construction Settlement of Soft Ground
Authors: Pingshan Chen, Zhiliang Dong
Abstract:
In order to estimate post-construction settlement more objectively, a power-polynomial model is proposed which can reflect the trend of settlement development based on observed settlement data. The model was demonstrated on an actual embankment case history during prediction. Compared with three other prediction models, the power-polynomial model estimates post-construction settlement more accurately and with simpler calculation.
Keywords: prediction, model, post-construction settlement, soft ground
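The abstract does not spell out the exact functional form of the power-polynomial model, but the general idea of fitting a power-law trend to observed settlement and extrapolating it can be sketched as follows; the data and the pure power form are illustrative assumptions.

```python
import numpy as np

# Observed settlement (mm) at times t (days) after construction; invented values.
t = np.array([30, 60, 90, 120, 180, 240, 300], dtype=float)
s = np.array([12.0, 19.5, 25.0, 29.8, 37.5, 43.6, 48.9])

# Power-law component fitted in log space: s = a * t**b  =>  log s = log a + b log t
b, log_a = np.polyfit(np.log(t), np.log(s), 1)
a = np.exp(log_a)

t_future = 540.0
print(f"s(t) ~ {a:.2f} * t^{b:.2f}; extrapolated settlement at {t_future:.0f} d: "
      f"{a * t_future**b:.1f} mm")
```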
Procedia PDF Downloads 425
4616 Conceptual Model for Logistics Information System
Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves
Abstract:
Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that facilitate decision-making based on a global perspective of the system studied has become essential. The article shows how a concept-based model makes it possible to organize and represent reality in an appropriate way, presenting accurate and timely information, features that make this kind of model an ideal component to support an information system, recognizing that such information is relevant for establishing the particularities that allow better performance in the evaluated sector.
Keywords: system, information, conceptual model, logistics
Procedia PDF Downloads 496
4615 Electromagnetic Tuned Mass Damper Approach for Regenerative Suspension
Authors: S. Kopylov, C. Z. Bo
Abstract:
This study explores the possibility of energy recovery through the suppression of vibrations. The article describes the design of an electromagnetic dynamic damper. The magnetic part of the device performs the function of a tuned mass damper, thereby providing both energy regeneration and damping properties to the protected mass. Following the theory of the tuned mass damper, the equations of the mathematical models were obtained. Then, for the given properties of the current system, the amplitude-frequency response was investigated. On this basis, the main ideas and methods for further research were defined.
Keywords: electromagnetic damper, oscillations with two degrees of freedom, regeneration systems, tuned mass damper
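A minimal numeric sketch of the two-degree-of-freedom analysis mentioned above: the steady-state amplitude of the primary mass is obtained by solving the coupled harmonic equations at each excitation frequency. All parameter values are illustrative, not the authors' design values, and the electromagnetic coupling is folded into the absorber damping c2 for simplicity.

```python
import numpy as np

# Primary mass m1 (spring k1, damping c1) with an attached absorber m2 (k2, c2).
m1, k1, c1 = 10.0, 4.0e4, 20.0
m2 = 0.05 * m1                  # 5 % auxiliary mass (assumed)
k2 = m2 * (k1 / m1)             # absorber tuned near the primary natural frequency
c2 = 8.0                        # absorber damping, standing in for the EM damper
F0 = 1.0                        # harmonic force amplitude on the primary mass

def primary_amplitude(w):
    """Solve the 2x2 complex steady-state equations for |X1| at frequency w (rad/s)."""
    A = np.array([[-w**2 * m1 + 1j * w * (c1 + c2) + k1 + k2, -(1j * w * c2 + k2)],
                  [-(1j * w * c2 + k2), -w**2 * m2 + 1j * w * c2 + k2]])
    x = np.linalg.solve(A, np.array([F0, 0.0]))
    return abs(x[0])

freqs = np.linspace(1.0, 150.0, 1000)
amps = [primary_amplitude(w) for w in freqs]
print(f"peak |X1| = {max(amps):.3e} m at w = {freqs[int(np.argmax(amps))]:.1f} rad/s")
```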
Procedia PDF Downloads 207
4614 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
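A minimal sketch of the k-mer representation step described above: each sequence is mapped to a fixed-order vector of k-mer counts that a downstream classifier can consume. The toy sequences and labels are invented; the paper works with whole MTB genomes and k up to 10, where the 4^10-dimensional feature space drives the computing costs discussed.

```python
from itertools import product

def kmer_index(k, alphabet="ACGT"):
    """Fixed ordering of all 4**k possible k-mers so every sequence shares features."""
    return {"".join(p): i for i, p in enumerate(product(alphabet, repeat=k))}

def kmer_vector(seq, k, index):
    """Count occurrences of each k-mer in seq over overlapping windows."""
    vec = [0] * len(index)
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:            # skip windows containing non-ACGT symbols
            vec[j] += 1
    return vec

k = 3
index = kmer_index(k)
seqs = ["ACGTACGTGG", "TTTACGTTTA", "GGGGCCCCAA", "ACACACACAC"]
labels = [1, 1, 0, 0]                # hypothetical resistant / susceptible phenotypes
X = [kmer_vector(s, k, index) for s in seqs]
print(len(index), "features;", sum(X[0]), "k-mers counted in the first sequence")
```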
Procedia PDF Downloads 167
4612 An Investigative Study into Good Governance in the Non-Profit Sector in South Africa: A Systems Approach Perspective
Authors: Frederick M. Dumisani Xaba, Nokuthula G. Khanyile
Abstract:
There is a growing demand in the developing world for greater accountability, transparency and ethical conduct based on sound governance principles. Funders, donors and sponsors increasingly demand more transparency, better value for money and adherence to good governance standards. The drive towards improved governance measures is largely influenced by the need to ‘plug the leaks’, deal with malfeasance, engender greater levels of accountability and good governance, and ultimately attract further funding or investment. This is the case with non-profit organizations (NPOs) in South Africa in general, and in the province of KwaZulu-Natal in particular. The paper draws on good governance theory, stakeholder theory and systems thinking to critically examine the requirements for good governance in the NPO sector from a theoretical and legislative standpoint, and to look systematically at the contours of governance currently found among NPOs. It does this through a rigorous examination of vignettes of governance cases among selected NPOs based in KwaZulu-Natal. The study used qualitative and quantitative research methodologies, including document analysis, literature review, semi-structured interviews, focus groups and statistical analysis of various primary and secondary sources. It found some cases of good governance but also frightening levels of poor governance. There was exponential growth in NPOs registered during the period under review; equally, there was an increase in cases of non-compliance with good governance practices. NPOs operate in an increasingly complex environment, with contestation for influence and access to resources, and stakeholder management is poorly conceptualized and executed. Recognizing that the NPO sector operates in an environment characterized by complexity, constant change, unpredictability, contestation, diversity and the divergent views of different stakeholders, there is a need to apply legislative and systems-thinking approaches to strengthen governance so it can withstand this turbulence, through a capacity development model that recognizes these contextual and environmental challenges.
Keywords: good governance, non-profit organizations, stakeholder theory, systems theory
Procedia PDF Downloads 122
4611 Simplifying Writing Composition to Assist Students in Rural Areas: An Experimental Study for the Comparison of Guided and Unguided Instruction
Authors: Neha Toppo
Abstract:
Methods and strategies of teaching instruction strongly influence students’ learning. In second language teaching, a number of ways and methods have been suggested by different scholars and researchers over time. The present article deals with the role of teaching instruction in developing students’ compositional ability in writing. It focuses on secondary-level students in rural areas, whose exposure to English is limited and who face challenges even in simple compositions. Students up to high school struggle with writing formal letters, applications, essays, paragraphs, etc. They face problems in note making and in writing examination answers in their own words, depending fully on rote learning. It is difficult for them to give language to their own ideas. Teaching writing composition deserves special attention, as writing is an integral part of language learning and students at this level are expected to have sound compositional ability, useful in numerous domains. An effective method of instruction could help students learn self-expression, correct selection of vocabulary and grammar, contextual writing, and the composition of formal and informal writing. This is not limited to school but continues to be important in various other fields, such as newspapers and magazines, official work, legislative work, material writing, academic writing, personal writing, etc. The study is based on an experimental method which hypothesizes that guided instruction will be more effective in teaching writing composition than the usual instruction, in which students are left to compose on their own without any help. In the test, students of one section were asked to write an essay on a given topic without guidance, while students of another section were asked to write the same essay with the assistance of guided instruction, in which they were provided with some vocabulary and sentence structures. This process was repeated in a few more schools to obtain generalizable data. The study shows the difference in students’ performance under the two types of instruction, guided and unguided. It concludes that the writing skill of the students is quite poor, but that with the help of guided instruction they perform better. The students need better teaching instruction to develop their writing skills.
Keywords: composition, essay, guided instruction, writing skill
Procedia PDF Downloads 279
4610 Revolutionizing Legal Drafting: Leveraging Artificial Intelligence for Efficient Legal Work
Authors: Shreya Poddar
Abstract:
Legal drafting and revising are recognized as highly demanding tasks for legal professionals. This paper introduces an approach to automate and refine these processes through the use of advanced Artificial Intelligence (AI). The method employs Large Language Models (LLMs), with a specific focus on 'Chain of Thoughts' (CoT) prompting and knowledge injection via prompt engineering. This approach differs from conventional methods that depend on comprehensive training or fine-tuning of models with extensive legal knowledge bases, which are often expensive and time-consuming. The proposed method incorporates knowledge injection directly into prompts, thereby enabling the AI to generate more accurate and contextually appropriate legal texts. This substantially decreases the necessity for thorough model training while preserving high accuracy and relevance in drafting. Additionally, the concept of guardrails is introduced: predefined parameters or rules established within the AI system to ensure that the generated content adheres to legal standards and ethical guidelines. The practical implications of this method for legal work are considerable. It has the potential to markedly lessen the time lawyers allocate to document drafting and revision, freeing them to concentrate on more intricate and strategic facets of legal work. Furthermore, this method makes high-quality legal drafting more accessible, potentially reducing costs and expanding the availability of legal services. The paper elucidates the methodology, providing specific examples and case studies to demonstrate the effectiveness of 'Chain of Thoughts' and knowledge injection in legal drafting. The potential challenges and limitations of this approach are also discussed, along with future prospects and enhancements that could further advance legal work. The impact of this research on the legal industry is substantial: the adoption of AI-driven methods by legal professionals can lead to enhanced efficiency, precision, and consistency in legal drafting, thereby altering the landscape of legal work. This research adds to the expanding field of AI in law, introducing a method that could significantly alter the nature of legal drafting and practice.
Keywords: AI-driven legal drafting, legal automation, future of legal work, large language models
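A minimal sketch of the prompting pattern the paper describes: drafting knowledge and guardrails are injected into the prompt itself, and a Chain-of-Thoughts instruction asks the model to reason step by step before drafting. The snippets, guardrails, and task are placeholder text, not the paper's materials, and no specific LLM vendor API is assumed.

```python
# Placeholder knowledge base entries; a real system would retrieve these from
# a curated legal drafting corpus.
KNOWLEDGE_SNIPPETS = [
    "Governing-law clauses should name a single jurisdiction unambiguously.",
    "Indemnification clauses must state the indemnifying party, scope, and exclusions.",
]

# Guardrails: predefined rules constraining what the model may generate.
GUARDRAILS = [
    "Do not invent statutes, case citations, or parties not given in the facts.",
    "Flag any clause that conflicts with the stated facts instead of drafting it.",
]

def build_prompt(task: str, facts: str) -> str:
    """Assemble a CoT prompt with injected knowledge and guardrails."""
    knowledge = "\n".join(f"- {s}" for s in KNOWLEDGE_SNIPPETS)
    rules = "\n".join(f"- {r}" for r in GUARDRAILS)
    return (
        f"You are assisting with legal drafting.\n"
        f"Relevant drafting knowledge:\n{knowledge}\n"
        f"Guardrails:\n{rules}\n"
        f"Task: {task}\nFacts: {facts}\n"
        "First think through the required clauses step by step, "
        "then output only the final draft."
    )

print(build_prompt("Draft an indemnification clause.",
                   "Supplier indemnifies Buyer for third-party IP claims."))
```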
Procedia PDF Downloads 64
4609 The Relationship between Moral Hazard, Corporate Governance, and Earnings Forecast Quality in the Tehran Stock Exchange
Authors: Fatemeh Rouhi, Hadi Nassiri
Abstract:
Earnings forecasts are a key element in economic decisions, but conflicts of interest in financial reporting, complexity, and lack of direct access to information have given rise to information asymmetry between individuals within the organization and external investors and creditors. This asymmetry creates adverse selection and moral hazard in investors’ decisions and makes it difficult for users to assess the underlying data directly. In this regard, the role of trustees in corporate governance disclosure is crystallized: it includes controls and procedures to ensure that management does not act in its own interests but moves in the direction of maximizing shareholder and company value. Given the importance of earnings forecasts in the capital market and the need to identify the factors influencing them, this study attempts to establish the relationship between moral hazard and corporate governance on the one hand and the earnings forecast quality of companies operating in the capital market on the other. Drawing on the theoretical basis of the research, two main hypotheses and several sub-hypotheses are presented, examined on the basis of available models using the panel-data method; conclusions are drawn at the 95% confidence level according to the significance of the model and of each independent variable. In examining the models, the Chow test was first used to decide between the panel-data and pooled methods; the Hausman test was then applied to choose between random effects and fixed effects. The findings show that because most of the moral hazard variables are positively associated with earnings forecast quality, the earnings forecast quality of companies listed on the Tehran Stock Exchange increases as moral hazard increases. Among the corporate governance variables, board independence has a significant relationship with earnings forecast accuracy and earnings forecast bias, but the relationship between board size and earnings forecast quality is not statistically significant.
Keywords: corporate governance, earnings forecast quality, moral hazard, financial sciences
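For readers unfamiliar with the specification tests mentioned above, the Hausman statistic can be computed directly from the fixed- and random-effects estimates; the sketch below uses invented coefficient vectors and covariance matrices purely to show the mechanics.

```python
import numpy as np

def hausman(b_fe, b_re, cov_fe, cov_re):
    """Hausman statistic H = (b_FE - b_RE)' [V_FE - V_RE]^{-1} (b_FE - b_RE),
    asymptotically chi-squared with len(b) degrees of freedom under H0 (RE is consistent)."""
    diff = b_fe - b_re
    return float(diff @ np.linalg.inv(cov_fe - cov_re) @ diff)

# Illustrative estimates, not the paper's results.
b_fe = np.array([0.42, -0.15])
b_re = np.array([0.38, -0.11])
cov_fe = np.array([[0.010, 0.001], [0.001, 0.008]])
cov_re = np.array([[0.006, 0.000], [0.000, 0.005]])

H = hausman(b_fe, b_re, cov_fe, cov_re)
print(f"Hausman H = {H:.2f} (compare with the chi-squared critical value, 5.99 at 5% for 2 df)")
```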
Procedia PDF Downloads 322
4608 Modelling the Effect of Alcohol Consumption on the Accelerating and Braking Behaviour of Drivers
Authors: Ankit Kumar Yadav, Nagendra R. Velaga
Abstract:
Driving under the influence of alcohol impairs driving performance and increases crash risk worldwide. The present study investigated the effect of different Blood Alcohol Concentrations (BAC) on the accelerating and braking behaviour of drivers with the help of driving simulator experiments. Eighty-two licensed Indian drivers drove in a rural road environment designed in the driving simulator at BAC levels of 0.00%, 0.03%, 0.05%, and 0.08% respectively. Driving performance was analysed with the help of vehicle control performance indicators such as the mean acceleration and mean brake pedal force of the participants. Preliminary analysis reported an increase in mean acceleration and mean brake pedal force with increasing BAC levels. Generalized linear mixed models were developed to quantify the effect of the different alcohol levels and explanatory variables such as driver age, gender and other driver characteristics on the driving performance indicators. Alcohol use was a significant factor affecting the accelerating and braking performance of the drivers. The acceleration model indicated that the mean acceleration of the drivers increased by 0.013 m/s², 0.026 m/s² and 0.027 m/s² for the BAC levels of 0.03%, 0.05% and 0.08% respectively. The brake pedal force model reported that the mean brake pedal force of the drivers increased by 1.09 N, 1.32 N and 1.44 N for the BAC levels of 0.03%, 0.05% and 0.08% respectively. Age was a significant factor in both models: a one-year increase in driver age resulted in a 0.2% reduction in mean acceleration and a 19% reduction in mean brake pedal force. This suggests that driving experience could compensate for the negative effects of alcohol to some extent. Female drivers were found to accelerate more slowly and brake harder than male drivers, which confirmed that female drivers are more conscious of their safety while driving. It was also observed that drivers who exercised regularly had better control of the accelerator pedal than non-regular exercisers when driving under the influence. The findings of the present study reveal that drivers tend to be more aggressive and impulsive under the influence of alcohol, which deteriorates their driving performance. A drunk driving state can be differentiated from a sober one by observing the accelerating and braking behaviour of the drivers. The conclusions may provide a reference for countermeasures against drinking and driving and contribute to traffic safety.
Keywords: alcohol, acceleration, braking behaviour, driving simulator
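A sketch of how such a mixed model can be specified, using statsmodels with a random intercept per driver; the simulated dataset and coefficients are illustrative stand-ins for the study's simulator data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated repeated-measures data: each driver provides one mean acceleration
# per BAC level; effect sizes and noise are invented for illustration.
rng = np.random.default_rng(0)
drivers = np.repeat(np.arange(20), 4)
bac = np.tile([0.00, 0.03, 0.05, 0.08], 20)
age = np.repeat(rng.integers(20, 50, 20), 4)
accel = 0.9 + 3.0 * bac - 0.002 * age + rng.normal(0, 0.05, 80)

df = pd.DataFrame({"driver": drivers, "bac": bac, "age": age, "accel": accel})

# Random intercept per driver captures between-driver variability.
model = smf.mixedlm("accel ~ bac + age", df, groups=df["driver"])
print(model.fit().summary())
```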
Procedia PDF Downloads 146
4607 Four-Electron Auger Process for Hollow Ions
Authors: Shahin A. Abdel-Naby, James P. Colgan, Michael S. Pindzola
Abstract:
A time-dependent close-coupling method is developed to calculate total, double, and triple autoionization rates for hollow atomic ions of four-electron systems. This work was motivated by recent observations of the four-electron Auger process in near-K-edge photoionization of C+ ions. The time-dependent close-coupled equations are solved using lattice techniques to obtain a discrete representation of the radial wave functions and all operators on a four-dimensional grid with uniform spacing. Initial excited states are obtained by relaxation of the Schrödinger equation in imaginary time using a Schmidt orthogonalization method involving interior subshells. The radial wave function grids are partitioned over the cores of a massively parallel computer, which is essential due to the large memory required to store the coupled wave functions and the long run times needed to reach convergence of the ionization process. Total, double, and triple autoionization rates are obtained by propagating the time-dependent close-coupled equations in real time, using integration over bound and continuum single-particle states. These states are generated by matrix diagonalization of one-electron Hamiltonians. The total autoionization rate for each L excited state is found to be slightly above the single autoionization rate for the excited configuration obtained using configuration-average distorted-wave theory. As expected, we find the double and triple autoionization rates to be much smaller than the total autoionization rates. Future work may extend this approach to study electron-impact triple ionization of atoms and ions. The work was supported in part by grants from the American University of Sharjah and the US Department of Energy. Computational work was carried out at the National Energy Research Scientific Computing Center (NERSC) in Berkeley, California, USA.
Keywords: hollow atoms, autoionization, Auger rates, time-dependent close-coupling method
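The imaginary-time relaxation used above to prepare initial states can be illustrated on a far simpler problem: propagating exp(-H tau) damps the excited components, so an arbitrary trial wave function relaxes to the ground state. The sketch below does this for a 1D harmonic oscillator on a uniform grid (hbar = m = omega = 1); grid and step sizes are illustrative, and the real calculation is four-dimensional with Schmidt orthogonalization against interior subshells.

```python
import numpy as np

n, L = 400, 20.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
V = 0.5 * x**2                              # harmonic potential
psi = np.exp(-((x - 1.0) ** 2))             # arbitrary starting guess
dtau = 0.001                                # below the explicit stability limit dx**2

for _ in range(20000):
    lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
    psi += dtau * (0.5 * lap - V * psi)     # d(psi)/d(tau) = -H psi
    psi /= np.sqrt(np.sum(psi**2) * dx)     # renormalise every step

lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
E = np.sum(psi * (-0.5 * lap + V * psi)) * dx
print(f"relaxed energy ~ {E:.4f} (exact ground state: 0.5)")
```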
Procedia PDF Downloads 153
4606 Performance of Reinforced Concrete Wall with Opening Using Analytical Model
Authors: Alaa Morsy, Youssef Ibrahim
Abstract:
Earthquakes are among the most catastrophic events, causing enormous harm to property and human lives. As part of a safe building configuration, reinforced concrete walls are provided in structures to decrease horizontal displacements under seismic load. Shear walls are additionally used to resist the horizontal loads that may be induced by wind. Reinforced concrete walls in residential buildings may have openings required for windows in exterior walls or doors in interior walls, or openings of other shapes for architectural purposes. The size, position, and location of openings may vary from an engineering perspective. Shear walls can suffer damage around the corners of doors and windows because of stress concentrations under vertical or horizontal loads. Openings reduce shear wall capacity and may have an unfavourable effect on the stiffness of the reinforced concrete wall and on the seismic response of structures. The finite element method, here using the software package ANSYS ver. 12, has become an essential approach for analyzing civil engineering problems numerically. Various models with different parameters can now be built in a short time using ANSYS instead of studying them experimentally, which consumes a lot of time and money. A finite element modeling approach has been conducted to study the effect of opening shape, size, and position in RC walls of different thicknesses under axial and lateral static loads. The proposed finite element approach has been verified against an experimental programme conducted by other researchers and validated with their variables. A very good correlation has been observed between the model and the experimental results, including load capacity, failure mode, and lateral displacement. A parametric study is applied to investigate the effect of opening size, shape, and position on different reinforced concrete wall thicknesses. The results may be useful for improving existing design models and for application in practice, as the approach satisfies both architectural and structural requirements.
Keywords: ANSYS, concrete walls, openings, out-of-plane behavior, seismic, shear wall
Procedia PDF Downloads 167
4605 Capacitance Models of AlGaN/GaN High Electron Mobility Transistors
Authors: A. Douara, N. Kermas, B. Djellouli
Abstract:
In this study, we report calculations of the gate capacitance of AlGaN/GaN HEMTs with the nextnano device simulation software. We use a physical gate capacitance model for III-V FETs that incorporates the quantum capacitance and the centroid capacitance of the channel. The simulations explore various device structures with different values of barrier thickness and channel thickness. A detailed understanding of the impact of gate capacitance in HEMTs will allow us to determine its role in future 10 nm physical-gate-length nodes.
Keywords: gate capacitance, AlGaN/GaN, HEMTs, quantum capacitance, centroid capacitance
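The model described above combines the geometric barrier capacitance with the quantum and centroid capacitances in series (per unit area). A minimal sketch, with illustrative rather than simulated values:

```python
# All capacitances per unit area; the numbers below are assumed for illustration,
# not outputs of the nextnano simulations.
EPS0 = 8.854e-12                 # vacuum permittivity, F/m
eps_barrier = 9.5 * EPS0         # approximate AlGaN permittivity (assumed)
t_barrier = 20e-9                # barrier thickness, m (assumed)

c_barrier = eps_barrier / t_barrier   # geometric barrier capacitance, F/m^2
c_quantum = 2.0e-2                    # quantum capacitance, F/m^2 (assumed)
c_centroid = 5.0e-2                   # centroid capacitance, F/m^2 (assumed)

# Series combination: 1/C_gate = 1/C_barrier + 1/C_quantum + 1/C_centroid
c_gate = 1.0 / (1.0 / c_barrier + 1.0 / c_quantum + 1.0 / c_centroid)
print(f"C_barrier = {c_barrier:.3e} F/m^2, C_gate = {c_gate:.3e} F/m^2")
```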
Procedia PDF Downloads 396
4604 Evaluation of a Staffing to Workload Tool in a Multispecialty Clinic Setting
Authors: Kristin Thooft
Abstract:
Increasing pressure to manage healthcare costs has shifted care towards ambulatory settings and is driving a focus on cost transparency. Few nurse staffing-to-workload models have been developed for ambulatory settings, and fewer still for multi-specialty clinics. Of the existing models, few have been evaluated against outcomes to understand their impact. This evaluation took place after the AWARD model for nurse staffing to workload was implemented in a multi-specialty clinic at a regional healthcare system in the Midwest. The multi-specialty clinic houses 26 medical and surgical specialty practices. The AWARD model was implemented in two specialty practices in October 2020. Donabedian’s Structure-Process-Outcome (SPO) model was used to evaluate outcomes based on changes to the structure and processes of care provided. The AWARD model defined and quantified the processes and recommended changes in the structure of day-to-day nurse staffing. Cost of care per patient visit, total visits, and total nurse-performed visits were used as structural and process measures influencing the outcomes of cost of care and access to care. Independent t-tests were used to compare the difference in variables pre- and post-implementation. The SPO model was useful as an evaluation tool, providing a simple framework understood by a diverse care team. No statistically significant changes in cost of care, total visits, or nurse visits were observed, but there were differences: cost of care increased and access to care decreased. Two weeks into the post-implementation period, the multi-specialty clinic paused all non-critical patient visits due to a second surge of the COVID-19 pandemic, and clinic nursing staff were re-allocated to support the inpatient areas. This negatively impacted the Nurse Manager’s ability to fully utilize the AWARD model to plan daily staffing. The SPO framework could be used for ongoing assessment of nurse staffing performance, and additional variables could be measured to give a complete picture of the impact of nurse staffing. Going forward, there must be a continued focus on the outcomes of care and the value of nursing.
Keywords: ambulatory, clinic, evaluation, outcomes, staffing, staffing model, staffing to workload
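For concreteness, the independent t-test used in the evaluation can be run as follows; the pre/post cost samples are invented, not the clinic's data.

```python
from scipy import stats

# Illustrative cost-per-visit samples before and after implementation.
pre_cost  = [112.4, 108.9, 115.2, 110.7, 109.3, 113.8]
post_cost = [117.1, 114.6, 118.9, 116.2, 115.4, 119.3]

t_stat, p_value = stats.ttest_ind(pre_cost, post_cost)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```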
Procedia PDF Downloads 173
4603 Some Considerations about the Theory of Spatial-Motor Thinking Applied to a Traditional Fife Band in Brazil
Authors: Murilo G. Mendes
Abstract:
This text presents part of the results of a Ph.D. thesis that used John Baily’s theory and method, and their ethnographic application, in the context of the fife flutes of the Banda Cabaçal dos Irmãos Aniceto in the state of Ceará, in northeast Brazil. John Baily is a British ethnomusicologist dedicated to studying the relationships between music, musical gesture, and embodied cognition. His methodology became a useful tool to highlight historical and social aspects present in the group’s instrumental music. Remaining indigenous and illiterate, these musicians played and transmitted their music from generation to generation, for almost two hundred years, without any nomenclature or systematization of the fingerings performed on the flute. In other words, their music, free from any theorization, is learned, felt, perceived, and processed directly through hearing and through the relationship between the instrument’s motor patterns and the resulting sound. For this reason, Baily’s assumptions became fundamental to the analysis. As the author’s methodology recommends, lessons were held with the natives, providing technical musical learning and some important concepts. Transcriptions and analyses of musical aspects were then made from patterns of movement on the instrument, incorporated through repetition and/or through the intrinsic facility of the instrument. As a result, it emerged how the group reconciled its indigenous origins with the demands of the public authorities and the interests of the local financial elite from the mid-twentieth century onwards. The article is structured around the cultural context of the group, where local historical and social factors influence its social and musical practices. The methodological conceptions of John Baily are then presented and, finally, their application to the music of the Irmãos Aniceto. The conclusion points to the good results of identifying, through this methodology and analysis, convergences between discourse, historical-social factors, and musical text. Still, questions are raised about its application in other contexts.
Keywords: Banda Cabaçal dos Irmãos Aniceto, John Baily, pífano, spatial-motor thinking
Procedia PDF Downloads 135
4602 Detecting COVID-19 Fake News Using Deep Learning Techniques
Authors: Anjali A. Prasad
Abstract:
Nowadays, social media plays an important role in spreading misinformation and fake news. This study analyzes fake news related to the COVID-19 pandemic spread on social media. The paper aims at evaluating and comparing different approaches used to mitigate this issue, including popular deep learning approaches such as CNN, RNN, LSTM, and the BERT algorithm, for classification. To evaluate the models’ performance, we used accuracy, precision, recall, and F1-score as the evaluation metrics. Finally, we compare which of the four algorithms shows the best results.
Keywords: BERT, CNN, LSTM, RNN
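The four evaluation metrics named above can be computed as follows; the labels and predictions are an invented hold-out fold, not results from the study.

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Illustrative predictions from one classifier on a held-out fold (1 = fake news).
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

print(f"accuracy  = {accuracy_score(y_true, y_pred):.2f}")
print(f"precision = {precision_score(y_true, y_pred):.2f}")
print(f"recall    = {recall_score(y_true, y_pred):.2f}")
print(f"F1-score  = {f1_score(y_true, y_pred):.2f}")
```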
Procedia PDF Downloads 205
4601 Next Generation Radiation Risk Assessment and Prediction Tools Generation Applying AI-Machine (Deep) Learning Algorithms
Authors: Selim M. Khan
Abstract:
Indoor air quality is strongly influenced by the presence of radioactive radon (222Rn) gas. Exposure to high 222Rn concentrations is unequivocally linked to DNA damage and lung cancer, and is a worsening issue in North American and European built environments, having increased over time within newer housing stocks as a function of as yet unclear variables. Indoor radon concentration can be influenced by a wide range of environmental, structural, and behavioral factors. As some of these factors are quantitative while others are qualitative, no single statistical model can determine indoor radon levels precisely while simultaneously considering all these variables across a complex and highly diverse dataset. The ability of AI machine (deep) learning to simultaneously analyze multiple quantitative and qualitative features makes it suitable for predicting radon with a high degree of precision. Using Canadian and Swedish long-term indoor radon exposure data, we use artificial deep neural network models with random weights and polynomial statistical models in MATLAB to assess and predict the radon health risk to humans as a function of geospatial, behavioral, and built-environment metrics. Our initial artificial neural network model with random weights, run with sigmoid activation, tested different combinations of variables and showed the highest prediction accuracy (>96%) within a reasonable number of iterations. Here, we present details of these emerging methods and discuss their strengths and weaknesses compared with the traditional artificial neural network and statistical methods commonly used to predict indoor air quality in different countries. We propose an artificial deep neural network with random weights as a highly effective method for assessing and predicting indoor radon.
Keywords: radon, radiation protection, lung cancer, AI machine deep learning, risk assessment, risk prediction, Europe, North America
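A minimal sketch of a network with random, untrained hidden weights and sigmoid activation, in which only the output layer is fitted by least squares, consistent with the "random weights" idea described above. The authors work in MATLAB; this Python version, its architecture sizes, and its data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_features, n_hidden = 200, 6, 50

# Stand-ins for geospatial/behavioural metrics and the radon level to predict.
X = rng.normal(size=(n_samples, n_features))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.1, n_samples)

W = rng.normal(size=(n_features, n_hidden))     # random, fixed hidden weights
b = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))          # sigmoid hidden activations

beta, *_ = np.linalg.lstsq(H, y, rcond=None)    # fit the output weights only
pred = H @ beta
print(f"training R^2 = {1 - np.var(y - pred) / np.var(y):.3f}")
```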
Procedia PDF Downloads 96
4600 Finite Element Analysis of Human Tarsals, Metatarsals and Phalanges for Predicting Probable Location of Fractures
Authors: Irfan Anjum Manarvi, Fawzi Aljassir
Abstract:
Human bones have long been a keen area of research in the field of biomechanical engineering. Medical professionals, as well as engineering academics and researchers, have investigated various bones using medical, mechanical, and materials approaches to build the available body of knowledge. Their major focus has been to establish the properties of these bones and ultimately to develop processes and tools either to prevent fracture or to repair damage. The literature shows that mechanical researchers have conducted a variety of tests for hardness, deformation, and strain field measurement to arrive at their findings. However, they considered the accuracy of these results insufficient due to various limitations of tools and test equipment and difficulties in the availability of human bones, and proposed further studies to first overcome the inaccuracies in measurement methods, testing machines, and experimental errors and then carry out experimental or theoretical studies. Finite element analysis is a technique which was developed for the aerospace industry due to the complexity of designs and materials, but over time it has found applications in many other industries thanks to its accuracy and its flexibility in the selection of materials and the types of loading that can be theoretically applied to an object under study. In the past few decades, the field of biomechanical engineering has also begun to see its applicability. However, the work done on tarsals, metatarsals, and phalanges using this technique is very limited; the present research therefore focuses on using it to analyse these critical bones of the human body. The technique requires a three-dimensional geometric computer model of the object to be analyzed. In the present research, a 3D laser scanner was used to obtain accurate geometric scans of individual tarsals, metatarsals, and phalanges from a typical human foot to create these computer geometric models. These were then imported into finite element analysis software, and a refinement process was carried out prior to analysis to ensure the computer models were true representatives of the actual bones. This was followed by the analysis of each bone individually. A number of constraints and load conditions were applied to observe the stress and strain distributions in these bones under compressive and tensile loads or their combination. Results were collected for deformations along various axes, and stress and strain distributions were observed to identify critical locations where fracture could occur. A comparative analysis of the failure properties of all three types of bones was carried out to establish which of them could fail earlier, which is presented in this research. The results of this investigation could be used for further experimental studies by academics and researchers, as well as industrial engineers, for the development of foot protection devices or tools for surgical operations and recovery treatment of these bones. Researchers could build on these models to carry out analysis of a complete human foot through finite element analysis under various loading conditions such as walking, marching, running, and landing after a jump.
Keywords: tarsals, metatarsals, phalanges, 3D scanning, finite element analysis
Procedia PDF Downloads 329
4599 Life Cycle Assessment of Biogas Energy Production from a Small-Scale Wastewater Treatment Plant in Central Mexico
Authors: Joel Bonales, Venecia Solorzano, Carlos Garcia
Abstract:
A great percentage of the wastewater generated in developing countries receives no treatment, which leads to numerous environmental impacts. In response to this, a paradigm change has been proposed in the current wastewater treatment model, from one based on large-scale plants towards one based on small and medium-scale plants. Nevertheless, small-scale wastewater treatment (SS-WWTP) with novel technologies such as anaerobic digesters, as well as the utilization of derivative co-products such as biogas, still presents diverse environmental impacts which must be assessed. This study consists of a Life Cycle Assessment (LCA) performed on an SS-WWTP that treats wastewater from a small commercial block in the city of Morelia, Mexico. The treatment performed in the SS-WWTP relies on anaerobic and aerobic digesters with a daily capacity of 5,040 L. Two different scenarios were analyzed: the current plant conditions, and a hypothetical in situ energy use of the biogas obtained. Furthermore, two different allocation criteria were applied: full impact allocation to the system’s main product (treated water), and substitution credits for replacing Mexican grid electricity (biogas) and clean water pumping (treated water). The results showed that the analyzed plant had larger impacts per volume of wastewater treated than reported in the literature, which may imply that the plant is currently operating inefficiently. The evaluated impacts were concentrated in the aerobic digestion and electricity generation phases due to the plant’s particular configuration. Additional findings show that the allocation criteria applied are crucial for the interpretation of impacts, and that the energy use of the biogas obtained in this plant can help mitigate the associated climate change impacts. It is concluded that SS-WWTP is an environmentally sound alternative for wastewater treatment from a systemic perspective. However, such studies must be careful in the selection of the allocation criteria and replaced products, since these factors greatly influence the results of the assessment.
Keywords: biogas, life cycle assessment, small scale treatment, wastewater treatment
Procedia PDF Downloads 124
4598 To What Extent Do Physical Activity and Standard of Competition Affect Quantitative Ultrasound (QUS) Measurements of Bone in Accordance with Muscular Strength and Anthropometrics in British Young Males?
Authors: Joseph Shanks, Matthew Taylor, Foong Kiew Ooi, Chee Keong Chen
Abstract:
Introduction: Evidence of the relationship between bone, muscle, and standard of competition among young British people is limited in the literature. The current literature recognises the independent and synergistic effects of fat-free mass and fat mass as stimuli for osteogenesis. This study assessed, on a cross-sectional basis, the extent to which physical activity (PA) and standard of competition (CS) influence quantitative ultrasound (QUS) measurements of bone, accounting for muscular strength and anthropometrics, in young British males. Methods: Pre-screening grouped 66 males aged 18-25 years into controls (n=33) and district-level athletes (DLAs) (n=33), as well as into low (n=21), moderate (n=23), and high (n=22) physical activity categories (PACs). All participants underwent QUS measurements of bone at four sites (dominant distal radius (DR), dominant mid-shaft tibia (DT), non-dominant distal radius (NR), and non-dominant mid-shaft tibia (NT)), isokinetic strength tests (dominant and non-dominant knee flexion and extension), and anthropometric measurements. Results: There were no significant differences between any of the groups in QUS measurements of bone at any site with regard to PACs or CS. Significantly higher isokinetic strength values were observed in DLAs than in controls (p < 0.05), and in high than in low PACs (p < 0.05), at 60°/s for both concentric and eccentric measurements. No differences in subcutaneous fat thickness were found between the groups (CS or PACs). Percentages of body fat were significantly higher (p < 0.05) in low than in high PAC and CS groups. There were significant positive relationships between non-dominant radial speed of sound and fat-free mass at both the DR (r=0.383, p=0.001) and NR (r=0.319, p=0.009) sites across all participants. Conclusion: The present study’s findings indicate that muscular strength and body fat are closely related to physical activity level and standard of competition. However, bone health status, as reflected by QUS measurements of bone, is not related to physical activity level or standard of competition in young British males.
Keywords: bone, muscular strength, physical activity, standard of competition
Procedia PDF Downloads 514
4597 Tracking the Mind's Mouth: Use of Smart Technology for Effective Teaching of Speaking to Pupils with Developmental Co-ordination Disorder
Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Ayah Al Yaari, Ayman Al Yaari, Montaha Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Fatehi Eissa
Abstract:
Developmental co-ordination disorder (DCD, also known as dyspraxia) causes a child to speak less well than expected in social conversations. We propose that smart speaking technology can help improve the sound production mechanism at both the phonetic and phonological levels, leading to better articulation of utterances. The participants were twelve privately schooled beginner pupils aged between 6 and 12 years old and diagnosed with DCD (apraxia), divided into two groups: an experimental group (n=6) and a control group (the apraxic control group) (n=6). A total of fifty typically developing and achieving (TD) pupils also participated as a second control population and were pre-assigned to the two groups (27 pupils with the treatment group and 23 with the apraxic control group). Weekly quizzes were given to all participants for four continuous months, and the results were analyzed by psychoneurolinguists and a statistician. Although both groups were taught by the same speech-language therapist (SLT), the treatment group, along with its TD subgroup, was taught a full-time speaking course with sociolinguistic themes covering both phonetic and phonological properties. The course lasted a whole semester, during which smart speaking aids were dominant, while the apraxic control group and its TD subgroup were taught without them. Compared with the apraxic control group and its TD subgroup, the results show clear changes in the speaking behavioural mechanism of the DCD experimental group and its TD subgroup. Improvement can be seen in the scores: the zero marks disappeared in the fourth week (the end of the first month of treatment), good marks (5+/10) appeared from the eighth week, and by week 15 of treatment some participants scored full marks. This study concludes in support of the primacy of smart educational technology for speaking purposes, and also shows that such aids can expand the range of academic performance across differential categories. Further research is required to evaluate the current demonizing of smart educational aids and to weigh more reasonably the specific relationship that speaking aids can offer to other language skills, as well as their limitations.
Keywords: smart educational technology, speaking aids, pupils with DCD, apraxia
Procedia PDF Downloads 50
4596 The Spatial and Temporal Distribution of Ambient Benzene, Toluene, Ethylbenzene and Xylene Concentrations at an International Airport in South Africa
Authors: Ryan S. Johnson, Raeesa Moolla
Abstract:
Airports are known air pollution hotspots due to the variety of fuel-driven activities that take place within them. As such, people working at airports are particularly vulnerable to exposure to hazardous air pollutants, including hundreds of aromatic hydrocarbons and, more specifically, a group of compounds known as BTEX (viz. benzene, toluene, ethyl-benzene and xylenes). These compounds have been identified as harmful to human and environmental health. Through the use of passive and active sampling methods, the spatial and temporal variability of benzene, toluene, ethyl-benzene and xylene concentrations within the international airport was investigated. Two sampling campaigns were conducted. In order to quantify the temporal variability of concentrations within the airport, an active sampling strategy using the Synspec Spectras Gas Chromatography 955 instrument was employed. Furthermore, a passive sampling campaign using Radiello passive samplers was used to quantify the spatial variability of these compounds. In addition, since meteorological factors are known to affect the dispersal and dilution of pollution, a Davis Pro-Weather 2 station was utilised to measure in situ weather parameters (viz. wind speed, wind direction and temperature). Results indicated that toluene varied on a daily temporal scale considerably more than the other compounds. Toluene further exhibited a strong correlation with the meteorological parameters, indicating that toluene was affected by these parameters to a greater degree than the other pollutants. The passive sampling campaign revealed BTEXtotal concentrations ranging between 12.95 and 124.04 µg m-3. From the results obtained it is clear that benzene, toluene, ethyl-benzene and xylene concentrations are heterogeneously dispersed within the airport. Due to the slow wind speeds recorded over the passive sampling campaign (1.13 m s-1), the hotspots were located close to the main concentration sources. The most significant hotspot was located over the main apron of the airport. It is recommended that further, extensive investigations into the seasonality of hazardous air pollutants at the airport be undertaken in order for sound conclusions to be drawn about the temporal and spatial distribution of benzene, toluene, ethyl-benzene and xylene concentrations within the airport.
Keywords: airport, air pollution hotspot, BTEX concentrations, meteorology
Procedia PDF Downloads 204
4595 The EU Omnipotence Paradox: Inclusive Cultural Policies and Effects of Exclusion
Authors: Emmanuel Pedler, Elena Raevskikh, Maxime Jaffré
Abstract:
Can the cultural geography of European cities be durably managed by European policies? To answer this question, two hypotheses can be proposed: (1) either European cultural policies are able to erase cultural inequalities between territories through the creation of new areas of cultural attractiveness in each beneficiary neighbourhood, city or country; or (2) each European region, historically rooted in a number of endogenous socio-historical, political or demographic factors, is not receptive to exogenous political influences, so that the cultural attractiveness of a territory is difficult to measure and to affect through top-down policies in the long term. How do these two logics - European and local - interact and contribute to the emergence of a valued, popular sense of a common European cultural identity? Does this constant interaction between historical backgrounds and new political concepts encourage a positive identification with the European project? European cultural policy programmes, such as the ECC (European Capital of Culture), seek to develop new forms of civic cohesion through inclusive and participative cultural events. The cultural assets of a city elected ECC are mobilized to attract a wide range of new audiences, including populations poorly integrated into local cultural life and consequently distant from pre-existing cultural offerings. In the current context of increasingly heterogeneous individual perceptions of Europe, the ECC programme aims to promote cultural forms and institutions that should accelerate both territorial and cross-border European cohesion. The new cultural consumption pattern is conceived to stimulate integration and mobility, but also to create a legitimate, transnational ideal European citizen type. Our comparative research confronts contrasting cases of European Capitals of Culture from the south and the north of Europe: cities recently involved in the ECC political mechanism and cities that were elected ECC in the past, multi-centred cultural models versus highly centralized ones. We aim to explore the impacts of European policies on urban cultural geography, but also to understand the current obstacles to their efficient implementation.
Keywords: urbanism, cultural policies, cultural institutions, European cultural capitals, heritage industries, exclusion effects
Procedia PDF Downloads 261