Search results for: data analyses
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27120

22140 Bayesian Inference of Physicochemical Quality Elements of Tropical Lagoon Nokoué (Benin)

Authors: Hounyèmè Romuald, Maxime Logez, Mama Daouda, Argillier Christine

Abstract:

In view of the severe degradation of aquatic ecosystems, it is urgent to set up monitoring systems that can best report on the effects of the stresses these ecosystems undergo. This is particularly true in developing countries, where specific, relevant quality standards and funding for monitoring programs are lacking. The objective of this study was to make a relevant and objective choice of physicochemical parameters informative of the main stressors occurring on African lakes and to identify their alteration thresholds. Based on statistical analyses of the relationship between several driving forces and the physicochemical parameters of the Nokoué lagoon, relevant physicochemical parameters were selected for its monitoring. An innovative method based on Bayesian statistical modeling was used. Eleven physicochemical parameters were selected for their response to at least one stressor, and their threshold quality standards were established: Total Phosphorus (<4.5 mg/L), Orthophosphates (<0.2 mg/L), Nitrates (<0.5 mg/L), TKN (<1.85 mg/L), Dry Organic Matter (<5 mg/L), Dissolved Oxygen (>4 mg/L), BOD (<11.6 mg/L), Salinity (7.6), Water Temperature (<28.7 °C), pH (>6.2), and Transparency (>0.9 m). According to the System for the Evaluation of Coastal Water Quality, these thresholds correspond to "good to medium" suitability classes, except for total phosphorus. One of the original features of this study is the use of the bounds of the credibility interval of the fixed-effect coefficients as local alteration standards for characterizing the physicochemical status of this anthropized African ecosystem.
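The thresholding approach above rests on reading off the bounds of a Bayesian credibility interval. As a minimal sketch (not the authors' model), assuming posterior samples of a fixed-effect coefficient are already available, the interval bounds can be computed from percentiles; all values below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical posterior samples for a fixed-effect coefficient
# (e.g., the effect of a stressor on dissolved oxygen, in mg/L).
posterior = rng.normal(loc=4.0, scale=0.5, size=10_000)

# 95% credible interval: the central range containing 95% of posterior mass.
# Its bounds are the candidate "local alteration standards".
lower, upper = np.percentile(posterior, [2.5, 97.5])
print(f"95% credible interval: [{lower:.2f}, {upper:.2f}]")
```

In a real application the samples would come from an MCMC fit of the hierarchical model, not from a synthetic normal distribution.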

Keywords: driving forces, alteration thresholds, acadjas, monitoring, modeling, human activities

Procedia PDF Downloads 88
22139 Inter-Cell-Interference Mitigation Scheme in Wireless Communication System

Authors: Jae-Hyun Ro, Yong-Jun Kim, Eui-Hak Lee, Hyoung-Kyu Song

Abstract:

Mobile communication has developed very rapidly since its appearance. Although the mobile communication market has grown quickly, many mobile users are not offered a good quality of service (QoS) because of the increasing amount of data traffic. Recently, the femtocell has become a hot topic in mobile communication because it can relieve data traffic and offer better QoS to mobile users. However, deploying femtocells within existing macrocell coverage is not simple due to inter-cell interference (ICI) with the existing macrocell. Thus, this paper proposes a femtocell scheme that reduces the influence of ICI so that femtocells can be deployed easily.

Keywords: CDD, femtocell, interference, macrocell, OFDM

Procedia PDF Downloads 499
22138 Security Analysis and Implementation of Achterbahn-128 for Images Encryption

Authors: Aissa Belmeguenai, Oulaya Berrak, Khaled Mansouri

Abstract:

In this work, an efficient implementation and a security evaluation of the Achterbahn-128 keystream generator for image encryption and decryption are presented. The simulated project is implemented in MATLAB 7.5. First, two different original images are used to validate the proposed design. The developed program transforms the original image data into a digital image file. Finally, the program is applied to encrypt and decrypt the image data. Several tests, including visual tests and a security evaluation, are performed to demonstrate the design's performance.
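The abstract does not detail the cipher internals, but the core of any keystream-based image cipher is a byte-wise XOR of the image data with the generator's output, so that applying the same operation twice decrypts. A minimal sketch with a random stand-in keystream (not Achterbahn-128 itself):

```python
import numpy as np

def xor_cipher(image: np.ndarray, keystream: np.ndarray) -> np.ndarray:
    """XOR image bytes with a keystream; applying it twice decrypts."""
    return np.bitwise_xor(image, keystream)

rng = np.random.default_rng(42)
image = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)      # stand-in image
keystream = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)  # stand-in keystream

encrypted = xor_cipher(image, keystream)
decrypted = xor_cipher(encrypted, keystream)
assert np.array_equal(decrypted, image)  # round-trip recovers the original
```

In the actual design, the keystream would be produced by the Achterbahn-128 generator from a secret key rather than by a pseudo-random number generator.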

Keywords: Achterbahn-128, keystream generator, stream cipher, image encryption, security analysis

Procedia PDF Downloads 310
22137 Computational Intelligence and Machine Learning for Urban Drainage Infrastructure Asset Management

Authors: Thewodros K. Geberemariam

Abstract:

The rapid physical expansion of urbanization, coupled with aging infrastructure, presents unique decision-making and management challenges for many big-city municipalities. Cities must therefore upgrade and maintain their existing, aging urban drainage infrastructure to keep up with demand. Given the overall contribution of assets to municipal revenue and the importance of infrastructure to the success of a livable city, many municipalities are looking for a robust and smart urban drainage infrastructure asset management solution that combines management, financial, engineering, and technical practices. Such robust decision-making relies on sound, complete, current, and relevant data that enables asset valuation, impairment testing, lifecycle modeling, and forecasting across multiple asset portfolios. In this paper, predictive computational intelligence (CI) and multi-class machine learning (ML), coupled with online, offline, and historical record data collected from an array of multi-parameter sensors, are used to extract operational and non-conforming patterns hidden in structured and unstructured data, producing actionable insight into the current and future states of the network. The paper aims to improve the strategic decision-making process by identifying all possible alternatives, evaluating the risk of each, and choosing the alternative most likely to attain the required goal cost-effectively, using historical and near-real-time data for urban drainage assets that have not previously benefited from advances in computational intelligence and machine learning.

Keywords: computational intelligence, machine learning, urban drainage infrastructure, classification, prediction, asset management

Procedia PDF Downloads 148
22136 Classification of Forest Types Using Remote Sensing and Self-Organizing Maps

Authors: Wanderson Goncalves e Goncalves, José Alberto Silva de Sá

Abstract:

Human actions threaten the balance and conservation of the Amazon forest; environmental monitoring services therefore play an important role in the preservation and maintenance of this environment. This study classified forest types using data from a forest inventory provided by the 'Florestal e da Biodiversidade do Estado do Pará' (IDEFLOR-BIO), located between the municipalities of Santarém, Juruti and Aveiro, in the state of Pará, Brazil, covering an area of approximately 600,000 hectares; bands 3, 4 and 5 of a TM-Landsat satellite image; and self-organizing maps. The information from the satellite images was extracted using QGIS 2.8.1 Wien and used as a database for training the neural network. The midpoints of each forest inventory sample were linked to the images, and the digital numbers of the corresponding pixels were extracted, composing the database that fed the training and testing of the classifier. The neural network was trained to classify two forest types in the Mamuru Arapiuns glebes of Pará State: Rain Forest of Lowland Emerging Canopy (Dbe) and Rain Forest of Lowland Emerging Canopy plus Open with palm trees (Dbe + Abp). The training set contained 400 examples (200 per class) and the test set 100 examples (50 per class), for a total of 500 examples. The classifier was built in Orange Data Mining 2.7 and evaluated in terms of confusion-matrix indicators. The results were considered satisfactory, with a global accuracy of 89%, a Kappa coefficient of 0.78, and an F1 score of 0.88. The classifier's efficiency was also evaluated with a ROC plot (receiver operating characteristics), yielding results close to ideal and demonstrating the potential of this methodology to support ecosystem services, particularly in anthropogenic areas of the Amazon.
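All three reported indicators can be derived directly from a 2-class confusion matrix. As an illustration, the matrix below is hypothetical but chosen to be consistent with the reported 89% accuracy and 0.78 Kappa on 100 test examples:

```python
import numpy as np

# Hypothetical confusion matrix (rows: true class, cols: predicted class),
# 50 test examples per class as in the study.
cm = np.array([[45, 5],
               [6, 44]])

total = cm.sum()
accuracy = np.trace(cm) / total  # fraction of correct predictions

# Cohen's kappa: observed agreement corrected for chance agreement.
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2
kappa = (accuracy - expected) / (1 - expected)

# F1 score for the first class (Dbe): harmonic mean of precision and recall.
tp, fn = cm[0, 0], cm[0, 1]
fp = cm[1, 0]
f1 = 2 * tp / (2 * tp + fp + fn)
print(accuracy, round(kappa, 2), round(f1, 2))  # → 0.89 0.78 0.89
```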

Keywords: artificial neural network, computational intelligence, pattern recognition, unsupervised learning

Procedia PDF Downloads 358
22135 Climate Change: A Critical Analysis on the Relationship between Science and Policy

Authors: Paraskevi Liosatou

Abstract:

Climate change is of global concern, amplified by the fact that, by its nature, it cannot be spatially limited. This makes intergovernmental decision-making procedures necessary. At the intergovernmental level, institutions such as the United Nations Framework Convention on Climate Change and the Intergovernmental Panel on Climate Change develop efforts, methods, and practices to plan and suggest climate mitigation and adaptation measures. These measures are based on specific scientific findings and methods, making the strong connection between science and policy clear. In particular, these scientific recommendations offer a series of practices, methods, and choices that mitigate the problem by indirectly addressing the causes and factors amplifying climate change. Moreover, the modern production and economic context does not take into consideration the social, political, environmental and spatial dimensions of the problem. This work studies the decision-making process at the international and European levels. In this context, it considers the policy tools implemented by various intergovernmental organizations. The methodology is based mainly on a critical study of the standards and processes underlying the connections and cooperation between science and policy, while also considering the skeptical debates that have developed. The findings focus on the links between science and policy developed by the institutional and scientific mechanisms concerned with climate change mitigation. The work also analyses the dimensions and factors of the science-policy framework, thereby pointing out the causes that sustain skepticism in current scientific circles.

Keywords: climate change, climate change mitigation, climate change skepticism, IPCC

Procedia PDF Downloads 134
22134 Clinical Validation of an Automated Natural Language Processing Algorithm for Finding COVID-19 Symptoms and Complications in Patient Notes

Authors: Karolina Wieczorek, Sophie Wiliams

Abstract:

Introduction: Patient data is often collected in Electronic Health Record (EHR) systems for purposes such as providing care and reporting. This information can be re-used to validate data models in clinical trials or in epidemiological studies. Manual validation of automated tools is vital to pick up errors in processing and to provide confidence in the output. Mentioning a disease in a discharge letter does not necessarily mean that a patient suffers from it; letters often discuss a diagnostic process, different tests, or whether a patient has a certain disease. The COVID-19 dataset in this study was built with natural language processing (NLP), an automated algorithm that extracts information related to COVID-19 symptoms, complications, and medications prescribed within the hospital. Free-text clinical patient notes are rich sources of information containing patient data not captured in a structured form, hence the use of named entity recognition (NER) to capture additional information. Methods: Patient data (discharge summary letters) were exported and screened by the algorithm to pick up relevant terms related to COVID-19. A list of 124 Systematized Nomenclature of Medicine (SNOMED) Clinical Terms with corresponding IDs was provided in Excel. Two independent medical student researchers were given this SNOMED dictionary to refer to when screening the notes. They worked on two separate datasets, "A" and "B", respectively. Notes were screened to check whether the correct term had been picked up by the algorithm and to ensure that negated terms were not. Results: Implementation in the hospital began on March 31, 2020, and the first EHR-derived extract was generated for use in an audit study on June 04, 2020. 
The dataset has contributed to large, priority clinical trials (including the International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC), via bulk upload to REDCap research databases) and to local research and audit studies. Successful sharing of EHR-extracted datasets requires communicating their provenance and quality, including completeness and accuracy. The validation of the algorithm yielded a precision of 0.907, a recall of 0.416, and an F-score of 0.570. The percentage enhancement from NLP-extracted terms over regular data extraction alone was low (0.3%) for relatively well-documented data such as previous medical history, but higher for complications (16.6%), presenting illness (29.53%), chronic procedures (30.3%), and acute procedures (45.1%). Conclusions: This automated NLP algorithm is shown to be useful in facilitating patient data analysis and has the potential to be used in larger-scale clinical trials, for example to assess potential study exclusion criteria for participants in vaccine development.
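As a sanity check, the reported F-score follows directly from the reported precision and recall, since F1 is their harmonic mean:

```python
precision = 0.907  # reported precision of the NLP algorithm
recall = 0.416     # reported recall

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # → 0.57, matching the reported F-score of 0.570
```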

Keywords: automated, algorithm, NLP, COVID-19

Procedia PDF Downloads 97
22133 Solubility of Water in CO2 Mixtures at Pipeline Operation Conditions

Authors: Mohammad Ahmad, Sander Gersen, Erwin Wilbers

Abstract:

Carbon capture, transport, and underground storage have become a major solution for reducing CO2 emissions from power plants and other large CO2 sources. A large part of this captured CO2 stream is transported at high-pressure, dense-phase conditions and stored in offshore underground depleted oil and gas fields. CO2 is also transported in offshore pipelines for use in enhanced oil and gas recovery. The captured CO2 stream with impurities may contain water, which causes severe corrosion problems and flow assurance failures and may damage valves and instrumentation. Thus, free water formation must be strictly prevented. The purpose of this work is to study the solubility of water in pure CO2 and in CO2 mixtures at realistic pipeline pressures (90-150 bar) and operating temperatures (5-35°C). A setup was constructed to generate experimental data. The results show that the solubility of water in CO2 mixtures increases with temperature and/or pressure, and that water solubility in CO2 drops in the presence of impurities. The data generated were then used to assess the capabilities of two mixture models: the GERG-2008 model and the EOS-CG model. By generating these solubility data, this study helps determine the maximum allowable water content in CO2 pipelines.

Keywords: carbon capture and storage, water solubility, equations of state, fluids engineering

Procedia PDF Downloads 293
22132 Climate Trends, Variability, and Impacts of El Niño-Southern Oscillation on Rainfall Amount in Ethiopia

Authors: Zerihun Yohannes Amare, Belayneh Birku Geremew, Nigatu Melise Kebede, Sisaynew Getahun Amera

Abstract:

In Ethiopia, agricultural production is predominantly rainfed. The El Niño-Southern Oscillation (ENSO) is a driver of climate variability that affects the country's agricultural production system. This paper studies trends and variability of rainfall and the impacts of ENSO on rainfall amount. The study was carried out in Ethiopia's Western Amhara National Regional State, which features a variety of the seasons that characterize the nation. Monthly rainfall data were collected from fifteen meteorological stations in Western Amhara. Selected El Niño and La Niña years from 1986 to 2015 were also extracted from the National Oceanic and Atmospheric Administration (NOAA). After the data quality was checked, the monthly rainfall data of the selected stations were arranged in a Microsoft Excel spreadsheet and analyzed using XLSTAT software. The coefficient of variation and the Mann-Kendall non-parametric statistical test were employed to analyze trends and variability of rainfall and temperature. The long-term annual rainfall record indicated a statistically insignificant increasing trend from 1986 to 2015. Rainfall variability was low (coefficient of variation, CV = 8.6%); moreover, the mean monthly rainfall of Western Amhara decreased during El Niño years and increased during La Niña years over the 30-year period, especially in the rainy season (JJAS). These findings will be useful for suggesting possible adaptation strategies and for the efficient use of resources during planning and implementation.
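The Mann-Kendall test used above is built on the S statistic: the sum of the signs of all pairwise later-minus-earlier differences in the series, where S > 0 suggests an increasing trend and S < 0 a decreasing one. A minimal sketch on a hypothetical rainfall series (the station data themselves are not reproduced here):

```python
import numpy as np

def mann_kendall_s(series) -> int:
    """Mann-Kendall S statistic: sum of signs of all pairwise
    later-minus-earlier differences in the series."""
    x = np.asarray(series, dtype=float)
    s = 0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return int(s)

# Hypothetical annual rainfall (mm) with a weak upward drift plus noise.
rng = np.random.default_rng(1)
years = np.arange(1986, 2016)
rainfall = 1200 + 0.5 * (years - 1986) + rng.normal(0, 40, size=years.size)
print(mann_kendall_s(rainfall))
```

A full test would compare S (via its normalized Z statistic) against a significance threshold, which is what tools like XLSTAT report.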

Keywords: rainfall, Mann-Kendall test, El Niño, La Niña, Western Amhara, Ethiopia

Procedia PDF Downloads 86
22131 Phosphorus Uptake of Triticale (Triticosecale Wittmack) Genotypes at Different Growth Stages

Authors: Imren Kutlu, Nurdilek Gulmezoglu

Abstract:

Triticale (Triticosecale Wittmack) is a man-made crop developed by crossing wheat (Triticum L.) and rye (Secale cereale L.). Triticale has until now been used mostly for animal feed; however, it can be consumed by humans in the form of biscuits, cookies, and unleavened bread. Moreover, one reason for the development of triticale is that it is more efficient than wheat cultivars in nutrient-deficient soil. After nitrogen, phosphorus (P) is the most heavily used fertilizer in crop production, because P is strongly fixed when applied to the soil. The aim of the present study was to evaluate the P uptake of winter triticale genotypes under different P fertilizer rates at different growth stages. The experiment was conducted in Eskisehir, Central Anatolia, Turkey. Treatments consisted of five triticale lines and one triticale cultivar (Samursortu) with four rates of P fertilization (0, 30, 60 and 120 kg P2O5 ha⁻¹). Phosphorus uptake of the triticale genotypes at tillering and heading, as well as in grain and straw at harvest, and the yields of grain and straw were determined. The results showed that a P rate of 60 kg/ha and the TCL-25 genotype produced the highest yields of straw and grain at harvest. Phosphorus uptake was highest at the tillering stage and decreased towards harvest. Phosphorus uptake at all growth stages increased as P rates rose, with the application of 120 kg/ha P₂O₅ giving the highest uptake. Phosphorus uptake differed among genotypes. Regression analyses indicated that P uptake at the tillering stage had the strongest effect on grain yield. These results provide useful information to triticale growers about suitable phosphorus fertilization for both forage and food use.

Keywords: grain yield, growth stage, phosphorus fertilization, phosphorus uptake, triticale

Procedia PDF Downloads 142
22130 Modification Encryption Time and Permutation in Advanced Encryption Standard Algorithm

Authors: Dalal N. Hammod, Ekhlas K. Gbashi

Abstract:

Today, cryptography is used in many applications to achieve high security in data transmission and real-time communications. AES has long enjoyed global acceptance and is used to secure sensitive data in various industries, but it suffers from slow processing and long data-transfer times. This paper suggests a method to enhance the Advanced Encryption Standard (AES) algorithm based on time and permutation. The suggested method (MAES) modifies SubByte and ShiftRows in the encryption part and InvSubByte and InvShiftRows in the decryption part. After implementing and testing the proposal, the modified AES achieved good results, accomplishing communication with high performance in terms of randomness, encryption time, storage space, and avalanche effect. The proposed method produces ciphertext with good randomness, passing the NIST statistical tests against attacks; MAES also reduced encryption time by 10% relative to the original AES and is therefore faster. The proposed method likewise showed good results in memory utilization, with a value of 54.36 for MAES versus 66.23 for the original AES. Finally, the avalanche effect used to measure the diffusion property is 52.08% for the modified AES and 51.82% for the original AES.
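The avalanche figures quoted (around 52%) measure the fraction of output bits that flip under a small input change; an ideal cipher flips about half. A hedged sketch of the metric itself, applied to two illustrative blocks rather than actual AES output:

```python
def avalanche(a: bytes, b: bytes) -> float:
    """Fraction of differing bits between two equal-length byte strings."""
    assert len(a) == len(b)
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (8 * len(a))

# Two illustrative 128-bit blocks constructed so exactly half the bits differ,
# mimicking the ~52% avalanche reported for the modified cipher.
block1 = bytes(range(16))
block2 = bytes(x ^ 0x5A for x in block1)  # 0x5A flips 4 of 8 bits per byte
print(f"{avalanche(block1, block2):.1%}")  # → 50.0%
```

In the paper's setting, `block1` and `block2` would be ciphertexts of plaintexts differing in a single bit, averaged over many trials.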

Keywords: modified AES, randomness test, encryption time, avalanche effects

Procedia PDF Downloads 240
22129 Analysis of Secondary School Students' Perceptions about Information Technologies through a Word Association Test

Authors: Fetah Eren, Ismail Sahin, Ismail Celik, Ahmet Oguz Akturk

Abstract:

The aim of this study is to discover secondary school students' perceptions of information technologies and the connections between concepts in their cognitive structures. A word association test consisting of six concepts related to information technologies was used to collect data from 244 secondary school students. Concept maps presenting the students' cognitive structures were drawn with the help of frequency data, and the data were analyzed and interpreted according to the connections obtained from these maps. Of the given concepts, students associated most with computer, Internet, and communication, and least with computer-assisted education and information technologies. These results show that the concepts Internet, communication, and computer are an important part of students' cognitive structures. In addition, students most often answered computer, phone, game, Internet, and Facebook as key concepts, which shows that students regard information technologies as a means of entertainment and free-time activity, not as a means of education.
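Frequency data of this kind is straightforward to tabulate before drawing concept maps. A small sketch with hypothetical responses (not the study's data):

```python
from collections import Counter

# Hypothetical student responses to one stimulus word, e.g. "Internet".
responses = ["computer", "phone", "game", "computer", "Facebook",
             "communication", "computer", "phone"]

# Count how often each association occurs; these counts become the
# edge weights of the concept map.
freq = Counter(responses)
print(freq.most_common(2))  # → [('computer', 3), ('phone', 2)]
```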

Keywords: word association test, cognitive structure, information technology, secondary school

Procedia PDF Downloads 409
22128 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (a small contact tracing probability, or a small probability of detecting index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well, and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.
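The estimation is non-trivial precisely because infectee counts are unobserved. As a toy contrast, if offspring counts were fully observed, the tracing probability would reduce to a simple binomial MLE; the sketch below simulates that easier setting with assumed parameters, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(7)
true_p = 0.6  # assumed tracing probability, for simulation only

# Toy offspring (degree) distribution: infectees per index case.
offspring = rng.poisson(2.5, size=500)
# Each infectee is independently traced with probability true_p.
detected = rng.binomial(offspring, true_p)

# Binomial MLE of the tracing probability when offspring counts are observed:
# total detected contacts over total contacts.
p_hat = detected.sum() / offspring.sum()
print(round(p_hat, 2))
```

The paper's estimator must instead work from detectee counts alone, which is why the degree distribution and R0 enter the likelihood.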

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 75
22127 Big Data Analysis on the Development of Jinan’s Consumption Centers under the Influence of E-Commerce

Authors: Hang Wang, Xiaoming Gao

Abstract:

The rapid development of e-commerce has significantly transformed consumer behavior and urban consumption patterns worldwide. This study explores the impact of e-commerce on the development and spatial distribution of consumption centers, with a particular focus on Jinan City, China. Traditionally, urban consumption centers are defined by physical commercial spaces, such as shopping malls and markets. However, the rise of e-commerce has introduced a shift towards virtual consumption hubs, with a corresponding impact on physical retail locations. Utilizing Gaode POI (Point of Interest) data, this research aims to provide a comprehensive analysis of the spatial distribution of consumption centers in Jinan, comparing e-commerce-driven virtual consumption hubs with traditional physical consumption centers. The study methodology involves gathering and analyzing POI data, focusing on logistics distribution for e-commerce activities and mobile charging point locations to represent offline consumption behavior. A spatial clustering technique is applied to examine the concentration of commercial activities and to identify emerging trends in consumption patterns. The findings reveal a clear differentiation between e-commerce and physical consumption centers in Jinan. E-commerce activities are dispersed across a wider geographic area, correlating closely with residential zones and logistics centers, while traditional consumption hubs remain concentrated around historical and commercial areas such as Honglou and the old city center. Additionally, the research identifies an ongoing transition within Jinan’s consumption landscape, with online and offline retail coexisting, though at different spatial and functional levels. This study contributes to urban planning by providing insights into how e-commerce is reshaping consumption behaviors and spatial structures in cities like Jinan. 
By leveraging big data analytics, the research offers a valuable tool for urban designers and planners to adapt to the evolving demands of digital commerce and to optimize the spatial layout of city infrastructure to better serve the needs of modern consumers.
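One simple way to approximate the hotspot identification described above is grid-based density counting of POI coordinates; the sketch below uses synthetic points standing in for Gaode POI data, with one dense commercial cluster against a dispersed background:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical POI coordinates on a normalized city extent [0, 1] x [0, 1].
pois = np.vstack([
    rng.normal([0.3, 0.3], 0.03, size=(200, 2)),  # dense commercial cluster
    rng.uniform(0, 1, size=(100, 2)),             # dispersed logistics points
])

# Count POIs per cell of a 10x10 grid; high-count cells mark
# candidate consumption centers.
counts, _, _ = np.histogram2d(pois[:, 0], pois[:, 1],
                              bins=10, range=[[0, 1], [0, 1]])
hotspot = np.unravel_index(counts.argmax(), counts.shape)
print(hotspot, int(counts.max()))
```

Real analyses typically refine this with spatial clustering (e.g., density-based methods) and kernel density estimation rather than a fixed grid.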

Keywords: big data, consumption centers, e-commerce, urban planning, Jinan

Procedia PDF Downloads 10
22126 The Potential Threat of Cyberterrorism to the National Security: Theoretical Framework

Authors: Abdulrahman S. Alqahtani

Abstract:

The revolution in computing and networks could revolutionise terrorism in the same way it has brought about changes in other aspects of life. The modern technological era has confronted countries with a new set of security challenges. Many states and potential adversaries have capacity in cyberspace, enabling them to carry out cyber-attacks in the future. Some are currently conducting surveillance, gathering and analysing technical information, and mapping the networks, nodes, and infrastructure of opponents, all of which may be exploited in future conflicts. This poster presents the results of a quantitative study (survey) testing the validity of the proposed theoretical framework for cyber-terrorist threats. The framework supports an in-depth understanding of these new digital terrorist threats and may also serve as a practical guide for managers and technicians in critical infrastructure to understand and assess the threats they face. It might also be the foundation for building a national strategy to counter cyberterrorism. The paper first provides basic information about the data; to purify the data, reliability analysis, exploratory factor analysis, and confirmatory factor analysis (CFA) were performed. Structural Equation Modelling (SEM) was then utilised to test the final model of the theory and to assess the overall goodness-of-fit between the proposed model and the collected data set.

Keywords: cyberterrorism, critical infrastructure, national security, theoretical framework, terrorism

Procedia PDF Downloads 396
22125 Encryption and Decryption of Nucleic Acid Using Deoxyribonucleic Acid Algorithm

Authors: Iftikhar A. Tayubi, Aabdulrahman Alsubhi, Abdullah Althrwi

Abstract:

The deoxyribonucleic acid (DNA) text provides structural biologists with a single source of high-quality cryptography for DNA sequences. We provide an intuitive, well-organized, and user-friendly web interface that allows users to encrypt and decrypt DNA sequence text, using an algorithm to secure the encryption and decryption of the sequence. The utility of this DNA sequence text is that it offers a user-friendly interface for users to encrypt, decrypt, and store information about DNA sequences. The interfaces created in this project will satisfy the demands of the scientific community by providing full encryption of DNA sequences through the website. We adopted a methodology using C# and ASP.NET for programming, which is smart and secure. DNA sequence text is a useful means of encrypting large quantities of data efficiently, and users can navigate between encodings and stored text depending on their field of interest. The algorithm classification allows a user to protect the DNA sequence from change, whether an alteration or an error occurred during data transfer, and checks the integrity of the DNA sequence data during access.

Keywords: algorithm, ASP.NET, DNA, encrypt, decrypt

Procedia PDF Downloads 226
22124 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics

Authors: L. Freeborn

Abstract:

Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners’ ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners’ ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.

Keywords: neuroimaging studies, research design, second language acquisition, task validity

Procedia PDF Downloads 134
22123 The Benefits of Using Hijab Syar'i against Female Sexual Abuse

Authors: Catur Sigit Hartanto, Anggraeni Anisa Wara Rahmayanti

Abstract:

Objective: This research aims to assess the benefits of using the hijab syar'i against female sexual abuse. Method: This research uses a quantitative study. The population is students at Semarang State University who wear the hijab syar'i; a sample of 30 female students was selected using the method of conformity, and data were collected with a questionnaire. The data were examined with descriptive analysis. Result: Using the hijab syar'i provides benefits in preventing and minimizing female sexual abuse. Limitation: Respondents were limited to only 30 people.

Keywords: hijab syar’i, female, sexual abuse, student of Semarang State University

Procedia PDF Downloads 280
22122 Observation and Study of Landslides Affecting the Tangier-Oued R'mel Motorway Segment

Authors: S. Houssaini, L. Bahi

Abstract:

The motorway segment between Tangier and Oued R'mel has experienced, since the beginning of construction works, significant instability and landslides linked to a number of geological, hydrogeological, and geothermal factors affecting the different formations. The landslides observed are not fully understood, despite many studies conducted on this segment. This study aims to produce new methods that better explain the phenomena behind the landslides, taking into account the geotechnical and geothermal contexts. The analysis builds on previous studies and geotechnical data collected in the field. The final body of data will be processed with the Plaxis software for a better, customizable view of the landslide problems in the area, which will help find solutions and stabilize land in the area.

Keywords: landslides, modeling, risk, stabilization

Procedia PDF Downloads 193
22121 Estimation of Train Operation Using an Exponential Smoothing Method

Authors: Taiyo Matsumura, Kuninori Takahashi, Takashi Ono

Abstract:

The purpose of this research is to improve the convenience of waiting for trains at level crossings and stations and to prevent accidents resulting from forcible entry into level crossings, by providing level crossing users and passengers with information that tells them when the next train will pass through or arrive. For this paper, we proposed methods for estimating operation by means of an average value method, variable response smoothing method, and exponential smoothing method, on the basis of open data, which has low accuracy, but for which performance schedules are distributed in real time. We then examined the accuracy of the estimations. The results showed that the application of an exponential smoothing method is valid.
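
As a minimal, hypothetical sketch of the exponential smoothing approach described above (the smoothing constant and the delay values are illustrative, not taken from the study):

```python
# Illustrative sketch: estimating the next train's passage delay with
# simple exponential smoothing over noisy open-data observations.
# alpha and the sample delays are assumptions, not the paper's values.

def exponential_smoothing(observations, alpha=0.3):
    """Return the smoothed estimate after processing all observations."""
    estimate = observations[0]
    for x in observations[1:]:
        # each new observation pulls the estimate toward it by a factor alpha
        estimate = alpha * x + (1 - alpha) * estimate
    return estimate

# Observed delays (minutes) of recent trains at a level crossing
delays = [2.0, 3.5, 2.5, 4.0, 3.0]
print(round(exponential_smoothing(delays), 3))  # recent observations weigh most
```

A smaller alpha damps the noise of low-accuracy open data more strongly, at the cost of responding more slowly to genuine schedule changes.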

Keywords: exponential smoothing method, open data, operation estimation, train schedule

Procedia PDF Downloads 385
22120 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society

Authors: Irene Yi

Abstract:

Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only to find that, over the course of human history, derogatory and sexist adjectives are used significantly more frequently to describe females than to describe males in history and literature. This is extremely problematic, both as training data and as an outcome of natural language processing. As machines take on more responsibilities, it is crucial to ensure that they do not carry forward historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. The results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and reflect gender equality. Further, this paper deals with non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. It also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French and Spanish not only have rigidly gendered grammar rules but also come from historically patriarchal societies. The progression of society goes hand in hand not only with its language but also with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.
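
As an illustration of the kind of corpus analysis described above, the sketch below counts how often candidate adjectives co-occur with female versus male referent words. The word lists, window size, and toy corpus are all hypothetical, not the study's data or algorithm:

```python
# Hedged sketch: co-occurrence counting of descriptors near gendered
# referent words within a small context window. All word lists are
# illustrative assumptions.
from collections import Counter

FEMALE = {"she", "her", "woman", "girl"}
MALE = {"he", "him", "man", "boy"}
ADJECTIVES = {"hysterical", "shrill", "brilliant", "bossy"}

def cooccurrence_counts(tokens, window=3):
    counts = {"female": Counter(), "male": Counter()}
    for i, tok in enumerate(tokens):
        if tok in ADJECTIVES:
            # look a few tokens to either side for a gendered referent
            context = tokens[max(0, i - window): i + window + 1]
            if FEMALE & set(context):
                counts["female"][tok] += 1
            if MALE & set(context):
                counts["male"][tok] += 1
    return counts

corpus = "she was hysterical and bossy while he was brilliant".split()
result = cooccurrence_counts(corpus)
print(result["female"], result["male"])
```

Scaled up to millions of books, skewed ratios in tables like these are the quantitative evidence of the frequency asymmetry the paper describes.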

Keywords: computational analysis, gendered grammar, misogynistic language, neural networks

Procedia PDF Downloads 116
22119 Regression Analysis in Estimating Stream-Flow and the Effect of Hierarchical Clustering Analysis: A Case Study in Euphrates-Tigris Basin

Authors: Goksel Ezgi Guzey, Bihrat Onoz

Abstract:

The scarcity of streamflow gauging stations and the increasing effects of global warming make designing water management systems very difficult. This study is a significant contribution to assessing regional regression models for estimating streamflow. Simulated meteorological data were related to the observed streamflow data from 1971 to 2020 for 33 stream gauging stations of the Euphrates-Tigris Basin. Ordinary least squares regression was used to predict flow for 2020-2100 with the simulated meteorological data. The CORDEX-EURO and CORDEX-MENA domains were used with 0.11 and 0.22 grids, respectively, to estimate climate conditions under certain climate scenarios. Twelve meteorological variables simulated by two regional climate models, RCA4 and RegCM4, were used as independent variables in the ordinary least squares regression, with the observed streamflow as the dependent variable. The variability of streamflow was then modeled with five to six meteorological variables and watershed characteristics such as area and elevation. Following the regression analysis of 31 stream gauging stations' data, the stations were subjected to a hierarchical clustering analysis, which grouped them into two clusters in terms of their hydrometeorological properties. Two streamflow equations were found for the two clusters of stream gauging stations for every domain and every regional climate model, which increased the efficiency of streamflow estimation by 10-15% for all the models. This study underlines the importance of the homogeneity of a region in estimating streamflow, not only in terms of geographical location but also in terms of the meteorological characteristics of that region.
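
The regression step can be sketched in miniature. The closed-form simple OLS below relates one hypothetical simulated meteorological variable to observed streamflow at a single station; all values are illustrative, and the study itself used twelve variables:

```python
# Illustrative sketch (not the study's model): simple ordinary least
# squares relating a simulated meteorological driver to observed
# streamflow at one gauging station. All numbers are made up.

def ols_fit(x, y):
    """Closed-form simple OLS: returns (slope, intercept)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

precip = [50, 80, 120, 160, 200]          # mm/month, simulated (assumed)
flow = [12.0, 18.5, 27.0, 35.5, 44.0]     # m3/s, observed (assumed)

slope, intercept = ols_fit(precip, flow)
# Predict streamflow for a projected precipitation value
print(round(slope * 140 + intercept, 2))
```

Fitting one such equation per cluster of hydrometeorologically similar stations, rather than one per station, is what the clustering step above enables.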

Keywords: hydrology, streamflow estimation, climate change, hydrologic modeling, HBV, hydropower

Procedia PDF Downloads 124
22118 Behavioral Response of Bee Farmers to Climate Change in South East, Nigeria

Authors: Jude A. Mbanasor, Chigozirim N. Onwusiribe

Abstract:

Climate change is no longer an illusion but a reality. In recent years, the Nigerian climate has changed, as shown by shifting patterns of rainfall and sunshine, increasing levels of carbon and nitrous oxide emissions, and deforestation. This study analyzed the behavioural response of beekeepers to variations in the climate and the adaptation techniques developed in response. Beekeeping is a viable economic activity for the alleviation of poverty, as its products (honey, wax, pollen, propolis, royal jelly, venom, queens, bees, and their larvae) are all marketable. The study adopted a multistage sampling technique to select 120 beekeepers from the five states of Southeast Nigeria. Well-structured questionnaires and focus group discussions were used to collect the required data. Statistical tools such as principal component analysis, data envelopment models, graphs, and charts were used for the data analysis. Changing patterns of rainfall and sunshine, together with the increasing rate of deforestation, had a negative effect on the habitat of the bees. The beekeepers have adopted Kenyan top-bar and Langstroth hives, and they establish the hives on fallow farmland close to cultivated communal farms with more flowering crops.

Keywords: climate, farmer, response, smart

Procedia PDF Downloads 128
22117 Disaster Resilience Analysis of Atlanta Interstate Highway System within the Perimeter

Authors: Mengmeng Liu, J. David Frost

Abstract:

The interstate highway system within the Atlanta Perimeter plays an important role in residents' daily lives. The serious impact of the Atlanta I-85 collapse implies that the transportation system in the region lacks a cohesive and comprehensive transportation plan. Therefore, a disaster resilience analysis of the transportation system is necessary. Resilience is a system's capability to persist, or to maintain transportation services, when exposed to changes or shocks. This paper analyzed the resilience of the whole transportation system within the Perimeter and examined how removing interstates within the Perimeter would affect it. The data used in the paper are the Atlanta transportation networks and LEHD Origin-Destination Employment Statistics data. First, we calculated the traffic flow on each road section based on LEHD data, assuming each trip travels along the shortest-travel-time path. Second, we calculated the resilience measures, namely the flow-based connectivity and centrality of the transportation network, and examined how they change when each section of interstate is removed from the current transportation system. Finally, we derived the resilience function curve of the interstates and identified the most resilient interstate section. The results show that the framework for calculating resilience is effective and can provide useful information for transportation planning and sustainability analysis of transportation infrastructure.
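
The removal experiment can be sketched with a toy network. The pair-connectivity metric below is an illustrative stand-in for the paper's flow-weighted connectivity and centrality measures, and the graph is hypothetical:

```python
# Simplified sketch of the link-removal experiment: measure how the
# connectivity of a toy road network degrades when each link is removed.
from collections import deque
from itertools import combinations

def connected_pairs(nodes, edges):
    """Count node pairs that remain mutually reachable (undirected BFS)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    count = 0
    for s, t in combinations(nodes, 2):
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        if t in seen:
            count += 1
    return count

# Ring A-B-C-D with a spur to E: the spur link is the least resilient
nodes = ["A", "B", "C", "D", "E"]
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("D", "E")]
baseline = connected_pairs(nodes, edges)
for cut in edges:
    remaining = [e for e in edges if e != cut]
    print(cut, connected_pairs(nodes, remaining) / baseline)
```

Links whose removal leaves the ratio at 1.0 have redundant alternatives; the spur link, with no alternative path, drops connectivity sharply, mirroring how a single-point failure like the I-85 section degrades the real network.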

Keywords: connectivity, interstate highway system, network analysis, resilience analysis

Procedia PDF Downloads 254
22116 Analyzing Migration Patterns Using Public Disorder Event Data

Authors: Marie E. Docken

Abstract:

At some point in the lifecycle of a country, patterns of political and social unrest of varying degrees are observed. Events involving public disorder or civil disobedience may produce effects spanning a wide spectrum of outcomes, depending on the level of unrest. Many previous studies, primarily theoretical in nature, have attempted to measure public disorder, asking why or how it occurs in society by examining causal factors or underlying issues in the social or political position of a population. The main objective in doing so is to understand how these activities evolve, or to gain some predictive capability for the events. In contrast, this research fuses analytics and social studies to provide more knowledge of the intensity of public disorder and civil disobedience in populations. With a greater understanding of the magnitude of these events, we may learn how they relate to extreme actions such as mass migration or violence. Upon establishing a model for measuring civil unrest based upon empirical data, a case study on various Latin American countries is performed. Interpretations of historical events are combined with analytical results to provide insights regarding the magnitude and effects of social and political activism.

Keywords: public disorder, civil disobedience, Latin America, metrics, data analysis

Procedia PDF Downloads 142
22115 AI as a Tool Hindering Digital Education

Authors: Justyna Żywiołek, Marek Matulewski

Abstract:

The article presents the results of a survey conducted among students from various European countries. The aim of the study was to understand how artificial intelligence (AI) affects educational processes in a digital environment. The survey covered a wide range of topics, including students' understanding and use of AI, its impact on motivation and engagement, interaction and support issues, accessibility and equity, and data security and privacy concerns. Most respondents admitted having difficulties comprehending the advanced functions of AI in educational tools. Many students believe that excessive use of AI in education can decrease their motivation for self-study and active participation in classes. Additionally, students reported that interaction with AI-based tools is often less satisfying compared to direct contact with teachers. Furthermore, the survey highlighted inequalities in access to advanced AI tools, which can widen the educational gap between students from different economic backgrounds. Students also expressed concerns about the security and privacy of their personal data collected and processed by AI systems. The findings suggest that while AI has the potential to support digital education, significant challenges need to be addressed to make these tools more effective and acceptable for students. Recommendations include increasing training for students and teachers on using AI, providing more interactive and engaging forms of education, and implementing stricter regulations on data protection.

Keywords: AI, digital education, education tools, motivation and engagement

Procedia PDF Downloads 22
22114 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Unlike traditional objective evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without other devices (such as a computer). The original identity of a simple screenshot taken from a social network site must therefore be questioned. When the police search and seize digital information, a common practice is to print out the digital data obtained and ask the parties present to sign the printouts, without taking the original digital data back. Beyond the issue of original identity, this way of obtaining evidence may have two further consequences. First, it invites the allegation that the police tampered with or falsified evidence to frame the suspect. Second, hidden information is not easily discovered: the core evidence associated with a crime may not appear in the contents of files. By examining the original file, data related to it, such as the original producer, creation time, modification date, and even GPS location, can be revealed from hidden information. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task in ruling on social media evidence. This article first introduces forensic software, such as EnCase, TCT, and FTK, and analyzes their capacity to prove identity with other digital data. Turning back to the court, the second part of this article discusses the legal standard for authentication of social media evidence and the application of such forensic software in the courtroom. In conclusion, this article offers a rethinking: what kind of authenticity does this rule of evidence pursue? Does the legal system automatically operate the transcription of scientific knowledge? Or, furthermore, does it want to better render justice, not only under scientific fact but through multivariate debate?
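
One practice such forensic software relies on for proving that a copy is identical to the seized original is cryptographic hashing. The following is a minimal sketch with hypothetical post contents, not a description of EnCase, TCT, or FTK internals:

```python
# Minimal sketch: record a cryptographic hash of the original data at
# seizure, so any later copy (or a printout's claimed source) can be
# checked against it. The byte strings are hypothetical.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"post: meeting at 5pm"      # bytes as seized
later_copy = b"post: meeting at 5pm"    # an exact duplicate
tampered = b"post: meeting at 6pm"      # one character altered

seizure_hash = sha256_of(original)
print(sha256_of(later_copy) == seizure_hash)   # identical content matches
print(sha256_of(tampered) == seizure_hash)     # any alteration is detected
```

A printout, by contrast, carries no such verifiable link back to the seized bytes, which is the evidentiary gap the article describes.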

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 286
22113 Adsorption of Paracetamol Using Activated Carbon of Dende and Babassu Coconut Mesocarp

Authors: R. C. Ferreira, H. H. C. De Lima, A. A. Cândido, O. M. Couto Junior, P. A. Arroyo, K. Q De Carvalho, G. F. Gauze, M. A. S. D. Barros

Abstract:

Removal of the widely used drug paracetamol from water was investigated using activated carbon derived from dende coconut mesocarp and babassu coconut mesocarp. Kinetic and equilibrium data were obtained at different pH values. Babassu activated carbon showed higher efficiency due to its acidity and higher microporosity. The pseudo-second-order model fitted the kinetic results better, and the equilibrium data may be represented by the Langmuir equation. Lower solution pH provided better removal efficiency, as the carbonyl groups may be attracted to the positively charged carbon surface.
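
The Langmuir fit mentioned above can be sketched with its standard linearised form, Ce/qe = Ce/qmax + 1/(K*qmax). The equilibrium data below are synthetic, generated from an assumed qmax and K, not the paper's measurements:

```python
# Hedged sketch: fitting the Langmuir isotherm via its linearised form
# Ce/qe = Ce/qmax + 1/(K*qmax). The data are synthetic, not measured.

def langmuir_fit(ce, qe):
    """Linear regression of Ce/qe on Ce; returns (qmax, K)."""
    x = ce
    y = [c / q for c, q in zip(ce, qe)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    intercept = my - slope * mx
    qmax = 1.0 / slope            # slope of the linear form is 1/qmax
    k = slope / intercept         # intercept is 1/(K*qmax)
    return qmax, k

# Synthetic equilibrium data from assumed qmax = 50 mg/g, K = 0.2 L/mg
ce = [1.0, 5.0, 10.0, 20.0, 40.0]                  # mg/L
qe = [50 * 0.2 * c / (1 + 0.2 * c) for c in ce]    # mg/g
qmax, k = langmuir_fit(ce, qe)
print(round(qmax, 2), round(k, 3))
```

Since the synthetic data follow the model exactly, the fit recovers the assumed qmax and K; with real adsorption data the residuals of this regression indicate how well the Langmuir model represents the isotherm.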

Keywords: adsorption, activated carbon, babassu, dende

Procedia PDF Downloads 368
22112 Knowledge and Eating Behavior of Teenage Pregnancy

Authors: Udomporn Yingpaisuk, Premwadee Karuhadej

Abstract:

The purpose of this research was to study the eating behavior of pregnant teenagers and its relationship to their knowledge of nutrition during pregnancy. The 100 samples were derived by simple random sampling of pregnant teenagers in Bangkae District. A questionnaire with a reliability of 0.8 was used to collect the data, which were analyzed in SPSS for Windows using multiple regression; percentages, means, and the relationship between nutrition knowledge and eating behavior were obtained. The results revealed that knowledge of nutrition averaged 4.07, that the eating behavior mentioned most was refraining from alcohol and caffeine (82%), and that knowledge of nutrition explained 54% of eating behavior at a statistically significant level of 0.001.

Keywords: teenage pregnancy, knowledge of eating, eating behavior, alcohol, caffeine

Procedia PDF Downloads 354
22111 Quantification of Magnetic Resonance Elastography for Tissue Shear Modulus Using U-Net Trained with Finite-Difference Time-Domain Simulation

Authors: Jiaying Zhang, Xin Mu, Chang Ni, Jeff L. Zhang

Abstract:

Magnetic resonance elastography (MRE) non-invasively assesses tissue elastic properties, such as shear modulus, by measuring tissue displacement in response to mechanical waves. The estimated metrics of tissue elasticity or stiffness have been shown to be valuable for monitoring the physiologic or pathophysiologic status of tissue, such as a tumor or fatty liver. To quantify tissue shear modulus from MRE-acquired displacements (essentially an inverse problem), multiple approaches have been proposed, including Local Frequency Estimation (LFE) and Direct Inversion (DI). However, one common problem with these methods is that the estimates are severely noise-sensitive, due to either the inverse-problem nature or noise propagation in the pixel-by-pixel process. With the advent of deep learning (DL) and its promise in solving inverse problems, a few groups in the field of MRE have explored the feasibility of using DL methods for quantifying shear modulus from MRE data. Most of these groups chose to use real MRE data for DL model training and to cut training images into smaller patches, which enriches the feature characteristics of the training data but inevitably increases computation time and results in outcomes with patched patterns. In this study, simulated wave images generated by Finite-Difference Time-Domain (FDTD) simulation are used for network training, and a U-Net is used to extract features from each training image without cutting it into patches. The use of simulated data for model training offers the flexibility of customizing training datasets to match specific applications. The proposed method aims to estimate tissue shear modulus from MRE data with high robustness to noise and high model-training efficiency. Specifically, a set of 3000 maps of shear modulus (with a range of 1 kPa to 15 kPa) containing randomly positioned objects were simulated, and their corresponding wave images were generated.
The two types of data were fed into the training of a U-Net model as its output and input, respectively. For an independently simulated set of 1000 images, the performance of the proposed method against DI and LFE was compared by the relative errors (root mean square error or RMSE divided by averaged shear modulus) between the true shear modulus map and the estimated ones. The results showed that the estimated shear modulus by the proposed method achieved a relative error of 4.91%±0.66%, substantially lower than 78.20%±1.11% by LFE. Using simulated data, the proposed method significantly outperformed LFE and DI in resilience to increasing noise levels and in resolving fine changes of shear modulus. The feasibility of the proposed method was also tested on MRE data acquired from phantoms and from human calf muscles, resulting in maps of shear modulus with low noise. In future work, the method’s performance on phantom and its repeatability on human data will be tested in a more quantitative manner. In conclusion, the proposed method showed much promise in quantifying tissue shear modulus from MRE with high robustness and efficiency.
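
The evaluation metric reported above, RMSE between the true and estimated shear-modulus maps divided by the averaged true shear modulus, can be sketched as follows; the toy 2x2 maps are illustrative:

```python
# Minimal sketch of the relative-error metric: RMSE between true and
# estimated shear-modulus maps, normalized by the mean true modulus.
# The 2x2 "maps" are illustrative stand-ins for full images.
import math

def relative_error(true_map, est_map):
    flat_true = [v for row in true_map for v in row]
    flat_est = [v for row in est_map for v in row]
    rmse = math.sqrt(
        sum((t - e) ** 2 for t, e in zip(flat_true, flat_est)) / len(flat_true)
    )
    return rmse / (sum(flat_true) / len(flat_true))

true_map = [[4.0, 4.0], [8.0, 8.0]]   # kPa, ground-truth moduli
est_map = [[4.2, 3.9], [7.8, 8.1]]    # kPa, estimated moduli
print(round(relative_error(true_map, est_map), 4))
```

Expressed this way, the 4.91% figure for the proposed method and the 78.20% figure for LFE are directly comparable regardless of the absolute modulus range in each map.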

Keywords: deep learning, magnetic resonance elastography, magnetic resonance imaging, shear modulus estimation

Procedia PDF Downloads 59