Search results for: computer assisted learning language
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11943

1473 Effect of Fill Material Density under Structures on Ground Motion Characteristics Due to Earthquake

Authors: Ahmed T. Farid, Khaled Z. Soliman

Abstract:

Due to limited land availability and its excessive cost, backfilling has become a necessary part of many projects; it is also used to level uneven ground or to raise site elevations, especially near coastal regions. Backfill materials placed under the foundations of structures should therefore be investigated for their effect on ground motion characteristics, particularly in regions subjected to earthquakes. In this research, a 60-meter-thick sandy fill placed above a fixed 240-meter layer of natural clayey soil, underlain by a rock formation, was used to predict the modified ground motion characteristics at the foundation level. The effects of three different compaction states of the fill material on the recorded earthquake response, i.e., peak ground acceleration, time history, and spectral acceleration values, are compared. The three densities of compacted fill used in the study were very loose, medium dense, and very dense sand deposits. The SHAKE computer program was used to perform the analysis, with strong earthquake records of Peak Ground Acceleration (PGA) 0.35 g. It was found that greater compaction of the fill layer significantly attenuates the earthquake ground motion at the surface of the fill, near the foundation level. It is recommended that fill material characteristics be considered in the design of foundations subjected to seismic motion. Future studies should analyze different fill and natural soil deposits under different seismic conditions.
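
As a rough illustration of the ground-motion measures compared above, the sketch below computes PGA and one pseudo-spectral-acceleration ordinate from an acceleration history. It is only a single-degree-of-freedom Newmark-beta integrator, not a substitute for SHAKE's equivalent-linear site-response analysis; the record, period, and damping values are illustrative assumptions.

```python
import math

def sdof_sa(ag, dt, period, zeta=0.05):
    """Pseudo-spectral acceleration of a damped SDOF oscillator under a
    ground-acceleration history `ag`, via the Newmark-beta (average
    acceleration) method. Returns Sa = wn^2 * max|u| in the units of `ag`."""
    wn = 2.0 * math.pi / period
    c, k = 2.0 * zeta * wn, wn ** 2          # unit mass assumed
    beta, gamma = 0.25, 0.5
    u, v = 0.0, 0.0
    a = -ag[0] - c * v - k * u               # u'' + c u' + k u = -ag
    keff = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
    umax = 0.0
    for p in ag[1:]:
        peff = (-p
                + (u / (beta * dt ** 2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
                + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                       + dt * (0.5 * gamma / beta - 1.0) * a))
        unew = peff / keff
        vnew = (gamma / (beta * dt) * (unew - u)
                + (1.0 - gamma / beta) * v
                + dt * (1.0 - 0.5 * gamma / beta) * a)
        a = (unew - u) / (beta * dt ** 2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        u, v = unew, vnew
        umax = max(umax, abs(u))
    return wn ** 2 * umax

def pga(ag):
    """Peak ground acceleration of a record."""
    return max(abs(x) for x in ag)
```

For a harmonic record at the oscillator's own period, Sa comes out near the familiar resonant amplification 1/(2*zeta) times the PGA.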

Keywords: acceleration, backfill, earthquake, soil, PGA

Procedia PDF Downloads 380
1472 Reducing Later Life Loneliness: A Systematic Literature Review of Loneliness Interventions

Authors: Dhruv Sharma, Lynne Blair, Stephen Clune

Abstract:

Later life loneliness is a social issue that is increasing alongside an upward global population trend. As a society, one way that we have responded to this social challenge is through developing non-pharmacological interventions such as befriending services, activity clubs, meet-ups, etc. Through a systematic literature review, this paper suggests that there is currently an underrepresentation of radical innovation, and an underutilization of digital technologies, in developing loneliness interventions for older adults. This paper examines intervention studies published in the English language in peer-reviewed journals between January 2005 and December 2014 across four electronic databases. In addition to academic databases, interventions found in grey literature, in the form of websites, blogs, and Twitter, were also included in the overall review. This approach yielded 129 interventions for the study, and the systematic approach minimized any bias dictating the selection of interventions. A coding strategy based on a pattern-analysis approach was devised to compare and contrast the loneliness interventions. Firstly, interventions were categorized on the basis of their objective, to identify whether they were preventative, supportive, or remedial in nature. Secondly, depending on their scope, they were categorized as one-to-one, community-based, or group-based. It was also ascertained whether interventions represented an improvement, an incremental innovation, a major advance, or a radical departure in comparison to the most basic form of a loneliness intervention. Finally, interventions were also assessed by the extent to which they utilized digital technologies. Individual visualizations representing the four levels of coding were created for each intervention, followed by an aggregated visual to facilitate analysis.
To keep the inquiry within scope and present a coherent view of the findings, the analysis was primarily concerned with the level of innovation and the use of digital technologies. The analysis highlights a weak but positive correlation between the level of innovation and the use of digital technologies in designing and deploying loneliness interventions, and it emphasizes how certain existing interventions could be tweaked, for example to enable their migration from incremental to radical innovation. It also points out the value of including grey literature, especially from Twitter, in systematic literature reviews, to get a contemporary view of the latest work in the area under investigation.

Keywords: ageing, loneliness, innovation, digital

Procedia PDF Downloads 122
1471 Production of Composite Materials by Mixing Chromium-Rich Ash and Soda-Lime Glass Powder: Mechanical Properties and Microstructure

Authors: Savvas Varitis, Panagiotis Kavouras, George Vourlias, Eleni Pavlidou, Theodoros Karakostas, Philomela Komninou

Abstract:

A chromium-loaded ash originating from the incineration of tannery sludge under anoxic conditions was mixed with low-grade soda-lime glass powder from commercial glass bottles. The relative weight proportions of ash to glass powder tested were 30/70, 40/60, and 50/50. The solid mixtures, formed into green-state compacts, were sintered at temperatures from 800 °C up to 1200 °C. The resulting products were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray spectrometry (EDXS), and micro-indentation. These methods were employed to characterize the phases, microstructure, and hardness of the produced materials. Thermal treatment at 800 °C and 1000 °C produced opaque ceramic products composed of a variety of chromium-containing and chromium-free crystalline phases, while thermal treatment at 1200 °C gave rise to composite products in which only chromium-containing crystalline phases were detected. The hardness results suggest that specific products are strong candidates for structural applications. Acknowledgement: This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program “Education and Lifelong Learning” of the National Strategic Reference Framework (NSRF), Research Funding Program THALES “WasteVal”: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation.

Keywords: chromium-rich tannery residues, glass-ceramic materials, mechanical properties, microstructure

Procedia PDF Downloads 342
1470 Application of GPRS in Water Quality Monitoring System

Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan

Abstract:

Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and send them to a laboratory for analysis; however, this can no longer meet the demands of water quality monitoring today. A set of automatic measurement and reporting systems for water quality has therefore been developed. In this project, water quality parameters collected by a multi-parameter probe are transmitted to a data processing and monitoring center through the GPRS wireless communication network of a mobile operator. The multi-parameter sensor is placed directly above the water level, while the monitoring center consists of a GPRS module and a micro-controller that monitor the data, which can be inspected at any instant. At the pollution control board, staff monitor the water quality sensor data on a computer using Visual Basic software. The system collects, transmits, and processes water quality parameters automatically, so production efficiency and economic benefit are greatly improved. GPRS technology performs well in complex environments where water quality would otherwise go unmonitored, and is specifically applicable to automatic data collection and transmission from field water-analysis equipment to the monitoring center.
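
The transmission step described above could be prototyped with a small parser for probe readings before they are forwarded over GPRS. The frame layout and checksum below are purely hypothetical, since the abstract does not specify the probe's protocol; they simply illustrate validating a multi-parameter reading at the monitoring center.

```python
def parse_probe_frame(frame):
    """Parse one reading frame from a multi-parameter probe.

    The frame layout here is an assumption for illustration:
        $WQ,<pH>,<turbidity_NTU>,<temp_C>,<do_mgL>*<checksum>
    with an NMEA-style XOR checksum (two hex digits) computed over the
    text between '$' and '*'."""
    if not frame.startswith('$') or '*' not in frame:
        raise ValueError('malformed frame')
    body, chk = frame[1:].rsplit('*', 1)
    calc = 0
    for ch in body:                      # XOR every byte of the payload
        calc ^= ord(ch)
    if calc != int(chk, 16):
        raise ValueError('checksum mismatch')
    tag, ph, turb, temp, do = body.split(',')
    assert tag == 'WQ'
    return {'pH': float(ph), 'turbidity': float(turb),
            'temperature': float(temp), 'dissolved_oxygen': float(do)}
```

A rejected checksum would prompt the micro-controller to request a retransmission rather than forward a corrupted reading.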

Keywords: multiparameter sensor, GPRS, Visual Basic software, RS232

Procedia PDF Downloads 412
1469 Risk of Fatal and Non-Fatal Coronary Heart Disease and Stroke Events among Adult Patients with Hypertension: Basic Markov Model Inputs for Evaluating Cost-Effectiveness of Hypertension Treatment: Systematic Review of Cohort Studies

Authors: Mende Mensa Sorato, Majid Davari, Abbas Kebriaeezadeh, Nizal Sarrafzadegan, Tamiru Shibru, Behzad Fatemi

Abstract:

Markov models, such as the cardiovascular disease (CVD) policy model, are used in simulation-based evaluation of the cost-effectiveness of hypertension treatment. Stroke, angina, myocardial infarction (MI), cardiac arrest, and all-cause mortality are included in this model. Hypertension is a risk factor for a number of vascular and cardiac complications and CVD outcomes. Objective: This systematic review was conducted to evaluate the comprehensiveness of this model across different regions globally. Methods: We searched articles written in the English language from PubMed/Medline, Ovid/Medline, Embase, Scopus, Web of Science, and Google Scholar with a systematic search query. Results: Thirteen cohort studies involving a total of 2,165,770 participants (1,666,554 hypertensive adults and 499,226 adults with treatment-resistant hypertension) were included in this review. Hypertension is clearly associated with coronary heart disease (CHD) and stroke mortality, unstable angina, stable angina, MI, heart failure (HF), sudden cardiac death, transient ischemic attack, ischemic stroke, subarachnoid hemorrhage, intracranial hemorrhage, peripheral arterial disease (PAD), and abdominal aortic aneurysm (AAA). The association between HF and hypertension varies across regions. Treatment-resistant hypertension is associated with a higher relative risk of major cardiovascular events and all-cause mortality than non-resistant hypertension; however, it is not included in the previous CVD policy model. Conclusion: The existing CVD policy model can be used in most regions for evaluating the cost-effectiveness of hypertension treatment. However, hypertension is strongly associated with HF in Latin America, the Caribbean, Eastern Europe, and Sub-Saharan Africa, so it is important to include HF in the CVD policy model when evaluating the cost-effectiveness of hypertension treatment in these regions.
We do not suggest including PAD and AAA in the CVD policy model for evaluating the cost-effectiveness of hypertension treatment, due to a lack of sufficient evidence. Researchers should account for treatment-resistant hypertension either by including it in the basic model or when setting the model assumptions.
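
A minimal cohort-style Markov model of the kind discussed can be sketched as follows. All transition probabilities and utility weights here are made-up placeholders for illustration, not inputs derived from the reviewed cohort studies; a real model would also add the states and events listed above.

```python
# Hypothetical annual transition probabilities, for illustration only.
STATES = ['well', 'chd', 'stroke', 'dead']
P = {
    'well':   {'well': 0.95, 'chd': 0.02, 'stroke': 0.01, 'dead': 0.02},
    'chd':    {'well': 0.00, 'chd': 0.85, 'stroke': 0.02, 'dead': 0.13},
    'stroke': {'well': 0.00, 'chd': 0.02, 'stroke': 0.80, 'dead': 0.18},
    'dead':   {'well': 0.00, 'chd': 0.00, 'stroke': 0.00, 'dead': 1.00},
}
# Hypothetical health-state utilities (quality weights per year lived).
UTILITY = {'well': 1.0, 'chd': 0.75, 'stroke': 0.60, 'dead': 0.0}

def run_cohort(cycles, discount=0.03):
    """Propagate a cohort that starts fully 'well' through annual cycles,
    accumulating discounted quality-adjusted life years (QALYs)."""
    dist = {s: (1.0 if s == 'well' else 0.0) for s in STATES}
    qalys = 0.0
    for t in range(cycles):
        qalys += sum(dist[s] * UTILITY[s] for s in STATES) / (1 + discount) ** t
        # One Markov transition: redistribute the cohort across states.
        dist = {s2: sum(dist[s1] * P[s1][s2] for s1 in STATES) for s2 in STATES}
    return dist, qalys
```

Comparing the accumulated QALYs (and costs) of a treated versus untreated cohort is the basis of the cost-effectiveness ratio such models produce.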

Keywords: cardiovascular disease policy model, cost-effectiveness analysis, hypertension, systematic review, twelve major cardiovascular events

Procedia PDF Downloads 71
1468 Analysis of Senior Secondary II Students' Performance/Approaches Exhibited in Solving Circle Geometry

Authors: Mukhtari Hussaini Muhammad, Abba Adamu

Abstract:

The paper will examine the approaches and solutions offered by Senior Secondary School II students (Demonstration Secondary School, Azare, Bauchi State, Northern Nigeria, a predominantly Hausa/Fulani area) in solving exercises related to the circle theorem: the angle that an arc of a circle subtends at the center is twice that which it subtends at any point on the remaining part of the circumference. The students will be divided into two groups by assigning them the numbers 1, 2; 1, 2; 1, 2, so that all 1s form Group I and all 2s form Group II. Group I will be the control group, taught by the traditional method: the researcher will revise the concept of the circle, state the theorem, prove it, and then solve examples. Group II, the experimental group, will also revise the concept of the circle, but the students will then be asked to draw different circles, mark arcs, construct the angle at the center and the angle at the circumference, measure the angles constructed, and explain what they can infer or deduce from the measurements; lastly, examples will be solved. On the next contact day, both groups will solve classroom exercises related to the theorem. The solutions will be marked and the scores compared and analysed using relevant statistical tools. Group II is expected to perform better, because the technique followed during instruction is more learner-centered: exploiting the talents of individual learners by listening to their views and asking them how they arrived at a solution genuinely improves learning and understanding.
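
The circle theorem underlying the exercises can be verified numerically, which mirrors the experimental group's draw-measure-infer activity. The points below are chosen purely for illustration: A and B span a 100° arc on the unit circle, and P sits on the remaining (major) arc.

```python
import math

def angle_at(p, a, b):
    """Angle APB at vertex p subtended by points a and b, in degrees."""
    ax, ay = a[0] - p[0], a[1] - p[1]
    bx, by = b[0] - p[0], b[1] - p[1]
    cosv = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosv))))

def on_circle(deg, r=1.0):
    """Point on a circle of radius r at the given angle in degrees."""
    t = math.radians(deg)
    return (r * math.cos(t), r * math.sin(t))

# Arc AB subtends 100 degrees at the centre O = (0, 0).
A, B, O = on_circle(0), on_circle(100), (0.0, 0.0)
central = angle_at(O, A, B)                   # angle at the centre
inscribed = angle_at(on_circle(200), A, B)    # angle at P on the major arc
```

The measured central angle comes out as exactly twice the inscribed angle, as the theorem states.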

Keywords: circle theorem, control group, experimental group, traditional method

Procedia PDF Downloads 192
1467 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems

Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos

Abstract:

Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is a mandatory requirement, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8, and Windows 10), Apple's Mac, and Linux, are assessed for their discovered vulnerabilities and the associated risk. Each discovered and reported vulnerability has an exploitability score assigned in the CVSS scoring of the National Vulnerability Database. In this study, the risk from vulnerabilities in each of the five operating systems is compared, using risk indexes developed on the basis of a Markov model to evaluate the risk of each vulnerability. The statistical methodology and underlying mathematical approach are described. Initially, parametric procedures were conducted and measured; however, violations of some statistical assumptions were observed, so the need for non-parametric approaches was recognized. A total of 6,838 recorded vulnerabilities were considered in the analysis. According to the risk associated with all the vulnerabilities considered, it was found that there is a statistically significant difference among average risk levels for some operating systems, indicating that, by our method and given its assumptions and limitations, some operating systems have been more risk-prone than others. Relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
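
When parametric assumptions are violated, as reported above, a rank-based test such as Kruskal-Wallis is a standard non-parametric choice for comparing a risk index across several OS groups. The abstract does not name the exact test used, so the from-scratch sketch of the H statistic below is illustrative only.

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic (with tie correction) for comparing the
    distribution of a risk index across several independent groups."""
    data = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(data)
    ranks = [0.0] * n
    i = 0
    while i < n:                       # assign average ranks over ties
        j = i
        while j < n and data[j][0] == data[i][0]:
            j += 1
        avg = (i + j + 1) / 2.0        # mean of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    rank_sums = [0.0] * len(groups)
    for (x, gi), r in zip(data, ranks):
        rank_sums[gi] += r
    h = (12.0 / (n * (n + 1))
         * sum(rs ** 2 / len(g) for rs, g in zip(rank_sums, groups))
         - 3 * (n + 1))
    ties = 0
    i = 0
    while i < n:                       # tie-correction term
        j = i
        while j < n and data[j][0] == data[i][0]:
            j += 1
        ties += (j - i) ** 3 - (j - i)
        i = j
    return h / (1 - ties / float(n ** 3 - n)) if ties else h
```

A large H relative to a chi-squared threshold with (groups - 1) degrees of freedom indicates that at least one OS's risk distribution differs.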

Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system

Procedia PDF Downloads 183
1466 Finding the Longest Common Subsequence in Normal DNA and Disease Affected Human DNA Using Self Organizing Map

Authors: G. Tamilpavai, C. Vishnuppriya

Abstract:

Bioinformatics is an active research area that combines biological questions with computer science methods. The longest common subsequence (LCSS) is one of the major challenges in various bioinformatics applications: its computation plays a vital role in biomedicine and is an essential task in DNA sequence analysis in genetics, including a wide range of disease-diagnosis steps. The objective of the proposed system is to find the longest common subsequence present in normal and disease-affected human DNA sequences using a Self-Organizing Map (SOM) and LCSS. The human DNA sequences are collected from the National Center for Biotechnology Information (NCBI) database. Initially, each DNA sequence is separated into k-mers using a k-mer separation rule, and mean and median values are calculated for each k-mer. These values are fed as input to the Self-Organizing Map for clustering. The resulting clusters are then given to the Longest Common Subsequence (LCSS) algorithm, which finds the common subsequences present in every cluster, returning n×(n−1)/2 subsequences per cluster, where n is the number of k-mers in that cluster. Experimental outcomes of the proposed system produce the possible longest common subsequences of normal and disease-affected DNA data; thus, the system can serve as a useful aid for finding disease-causing sequences. Finally, a performance analysis is carried out for different DNA sequences, and the obtained values show that the LCSS is retrieved in a shorter time than with the existing system.
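
The two core string operations in the pipeline, k-mer separation and longest-common-subsequence extraction, can be sketched directly. This is the generic textbook dynamic-programming LCS applied to two sequences, not the paper's clustered variant.

```python
def kmers(seq, k):
    """Overlapping k-mers of a DNA sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def lcs(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming longest common
    subsequence; returns one optimal subsequence as a string."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    # Backtrack from dp[m][n] to recover one optimal subsequence.
    out = []
    i, j = m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i -= 1
            j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(out))
```

Running `lcs` over every pair of k-mers in a cluster yields the n×(n−1)/2 subsequences the abstract mentions.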

Keywords: clustering, k-mers, longest common subsequence, SOM

Procedia PDF Downloads 267
1465 Sustainability of Vernacular Architecture in Zegalli Houses in Northern Iran with Emphasis on Their Seismic Behavior

Authors: Mona Zaryoun, Mahmood Hosseini, Seyed Mohammad Hassan Khalkhali, Haniyeh Okhovat

Abstract:

Zegalli houses in Guilan province, northern Iran, are a type of vernacular house whose foundation, skeleton, and walls are all made of wood. These were the only houses that survived the major Manjil-Rudbar earthquake of 1990, which had a magnitude of 7.2. In view of this, some researchers began studying the foundations used in these houses, which benefit from rocking behavior; the relatively light weight of the houses has also helped them withstand seismic excitation. This paper first presents a brief description of Zegalli houses and their architectural features, with emphasis on their foundation. Next, the foundation of one of these houses is modeled as a sample, using a computer program developed in the MATLAB environment, and a series of time history analyses (THA) are carried out with the horizontal and vertical accelerograms of a set of selected site-compatible earthquakes to investigate the behavior of this type of house under earthquakes. Based on the numerical results of the THA, it can be said that even with no sliding at the foundation timbers, the rocking that occurs at various levels of the foundation alone significantly reduces the seismic response of the house, which results in stability under earthquakes with a peak ground acceleration of around 0.35 g. Zegalli houses can therefore be considered sustainable Iranian vernacular architecture, and it is recommended that their use, architecture, and structural merits be reconsidered by architects as well as civil and structural engineers.

Keywords: MATLAB software, rocking behavior, time history analysis, Zegalli houses

Procedia PDF Downloads 288
1464 Utilizing Topic Modelling for Assessing mHealth Apps’ Risks to Users’ Health before and during the COVID-19 Pandemic

Authors: Pedro Augusto Da Silva E Souza Miranda, Niloofar Jalali, Shweta Mistry

Abstract:

BACKGROUND: Software developers use automated solutions to scrape users’ reviews and extract meaningful knowledge, identifying problems (e.g., bugs, compatibility issues) and possible enhancements (e.g., users’ requests) for their applications. However, most of these solutions do not consider the health risks to users. Recent works have shed light on the importance of including health-risk considerations in the development cycle of mHealth apps to prevent harm to their users. PROBLEM: The COVID-19 pandemic in Canada (and the world) is currently forcing physical distancing on the general population. This new lifestyle has made the use of mHealth applications more essential than ever, with a projected market forecast of 332 billion dollars by 2025. However, this surge in mHealth usage comes with possible risks to users’ health due to mHealth app problems (e.g., a wrong insulin dosage indication caused by a UI error). OBJECTIVE: This work aims to raise awareness among mHealth developers of the importance of considering risks to users’ health within their development lifecycle, and to help mHealth developers with a proof-of-concept (POC) solution to understand, process, and identify possible health risks to users of mHealth apps based on users’ reviews. METHODS: We conducted a mixed-methods study. We developed a crawler to mine the negative reviews submitted by Google Play store users for two sample mHealth apps (My Fitness, Medisafe). For each mHealth app, the reviews were divided into two groups: before COVID-19 (submission date before 15 Feb 2019) and during COVID-19 (submission date from 16 Feb 2019 until Dec 2020).
For each period, a Latent Dirichlet Allocation (LDA) topic model was used to identify clusters of reviews covering similar topics. The topics before and during COVID-19 were then compared, and significant differences in the frequency and severity of similar topics were identified. RESULTS: We successfully scraped, filtered, processed, and identified health-related topics in both qualitative and quantitative approaches. The results demonstrate the similarity between topics before and during COVID-19.
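
The LDA step can be illustrated with a minimal collapsed Gibbs sampler over documents of token ids. Real analyses would typically use a library implementation (e.g. gensim or scikit-learn); the hyperparameters here are illustrative assumptions, not the study's settings.

```python
import random

def lda_gibbs(docs, n_topics, vocab_size, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for LDA. `docs` is a list of
    token-id lists. Returns per-document topic counts and the
    topic-word count matrix."""
    rng = random.Random(seed)
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]   # topic assignments
    ndk = [[0] * n_topics for _ in docs]                       # doc-topic counts
    nkw = [[0] * vocab_size for _ in range(n_topics)]          # topic-word counts
    nk = [0] * n_topics                                        # tokens per topic
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            t = z[di][wi]
            ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    for _ in range(iters):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                t = z[di][wi]                                  # remove current assignment
                ndk[di][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                # Collapsed conditional p(topic | everything else).
                weights = [(ndk[di][k] + alpha) * (nkw[k][w] + beta)
                           / (nk[k] + vocab_size * beta) for k in range(n_topics)]
                t = rng.choices(range(n_topics), weights=weights)[0]
                z[di][wi] = t
                ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return ndk, nkw
```

After sampling, each review's dominant topic in `ndk` gives the cluster label, and `nkw` gives the words that characterize each topic.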

Keywords: natural language processing (NLP), topic modeling, mHealth, COVID-19, software engineering, telemedicine, health risks

Procedia PDF Downloads 130
1463 The Social Area Disclosure to Reduce Conflicts between Community and the State: A Case of Mahakan Fortress, Bangkok

Authors: Saowapa Phaithayawat

Abstract:

The purposes of this study are 1) to examine the more than 20-year attempt of the Mahakan fort community to negotiate with the Bangkok Metropolitan Administration (BMA) to remain in their residential area, which belongs to the state, and 2) to apply a new social and cultural dimension to relations between the state and the community as an alternative for local participation in keeping the residential area. This is qualitative research, and the findings reveal that the community claims its ancestors’ right, as owners of this piece of land, for over 200 years. The community therefore requested to take part in the preservation of the land, culture, and local knowledge, and in managing the area as a learning resource on the cultural road in Rattanakosin Island. However, the BMA imposed the law requiring relocation of communities in Rattanakosin Island. The enforcement failed to relocate the community but severely damaged the physical structure of the area, including the overall deterioration of the cultural road renovated in 1982 for the 200th-anniversary celebration of Bangkok. The state’s enforcement required the community to move and the landscape to be improved according to the capital city plan, but it resulted in unending conflicts between the community and the state, with no clear solution. At the same time, the community has spent a long time opposing the state’s action and preparing itself by administering the community behind Mahakan fortress through a community administrative committee, under the guidance of an external organization, registering all community members and providing funds for community administration. Meanwhile, the state has lacked continuity of enforcement due to political problems and the BMA’s administrative problems.
It is therefore suggested that an alternative solution to this problem lies in negotiation between the state and the community, with the purpose of collaboration between the two to develop the area under the protective law of each side.

Keywords: Pom-Mahakan community, reduction of conflicts, social area disclosure, residential area

Procedia PDF Downloads 314
1462 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity

Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink

Abstract:

The kidney is a major target for the toxic effects of drugs, industrial and environmental chemicals, and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models have not solved this problem. Validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans, and the first predictive platforms based on renal cells derived from human pluripotent stem cells. In order to further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employs automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. A total of 129 image-based phenotypic features were analyzed with respect to their predictive performance, using 44 compounds with different chemical structures that included drugs, environmental and industrial chemicals, and herbal and fungal compounds. The nephrotoxicity of these compounds in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans: test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different proximal tubular cell (PTC) toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.
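
The classification-and-scoring stage can be caricatured with a toy feature classifier and the balanced-accuracy metric quoted above. The nearest-centroid rule is a stand-in for illustration, not the platform's actual machine-learning method, and the feature vectors are invented.

```python
def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recalls, the metric quoted for the platform."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, y in enumerate(y_true) if y == c]
        recalls.append(sum(y_pred[i] == c for i in idx) / float(len(idx)))
    return sum(recalls) / len(recalls)

class NearestCentroid:
    """Toy classifier: assign a phenotypic feature vector (e.g. chromatin
    or cytoskeletal descriptors) to the class with the closest centroid."""
    def fit(self, X, y):
        self.centroids = {}
        for c in set(y):
            rows = [x for x, yi in zip(X, y) if yi == c]
            self.centroids[c] = [sum(col) / len(rows) for col in zip(*rows)]
        return self

    def predict(self, X):
        def d2(a, b):
            return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return [min(self.centroids, key=lambda c: d2(x, self.centroids[c]))
                for x in X]
```

Balanced accuracy is preferred over plain accuracy here because toxic and non-toxic compounds are rarely represented in equal numbers.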

Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction

Procedia PDF Downloads 313
1461 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities

Authors: Nazli Hardy

Abstract:

A multidisciplinary application of computer science and interactive, database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem with pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around connecting everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; thus the IoT represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing, and storage of the data. Thus, while the benefits of the applications appear boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to evolve from the login-password tuple, to an Identity and Access Management (IAM) model, to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols appropriate for the IoT ecosystem.

Keywords: Internet of Things (IoT), authentication, protocols, survey

Procedia PDF Downloads 299
1460 The Impact of ISO 9001 Certification on Brazilian Firms’ Performance: Insights from Multiple Case Studies

Authors: Matheus Borges Carneiro, Fabiane Leticia Lizarelli, José Carlos De Toledo

Abstract:

The evolution of quality management by companies was strongly enabled by, among other things, ISO 9001 certification, which is considered a crucial requirement by several customers. Likewise, performance measurement provides useful insights for companies to identify how their decision-making process is reflected in their improvement. One of the most widely used performance measurement models is the balanced scorecard (BSC), which addresses a firm’s performance from four perspectives: financial, internal process, customer satisfaction, and learning and growth. Studies relating ISO 9001 to business performance have mostly adopted a quantitative approach to identify the standard’s causal effect on a firm’s performance; however, verifying how this influence occurs requires an in-depth, qualitative analysis. This paper therefore aims to verify the impact of ISO 9001:2015 on Brazilian firms’ performance from the balanced scorecard perspective. Nine certified companies located in the Southeast region of Brazil were studied through a multiple case study approach. The study identified a positive impact of ISO 9001 on the firms’ overall performance, and four critical success factors (CSFs) were found to be relevant to the linkage between ISO 9001 and firm performance: employee involvement, top management, process management, and customer focus. Due to the COVID-19 pandemic, interviews were limited to each company’s quality management specialist, and the sample was limited since several companies were closed during the period of the study. This study presents an in-depth analysis of how ISO 9001 certification and firms’ performance are related in a developing country.

Keywords: balanced scorecard, Brazilian firms’ performance, critical success factors, ISO 9001 certification, performance measurement

Procedia PDF Downloads 198
1459 'I'm in a Very Safe Place': Webcam Sex Workers in Aotearoa, New Zealand and Their Perceptions of Danger and Risk

Authors: Madeline V. Henry

Abstract:

Sex work is a contested subject in academia. Many authors now argue that the practice should be recognized as a legitimate and rationally chosen form of labor, and that decriminalization is necessary to ensure the safety of sex workers and reduce their stigmatization. However, a prevailing argument remains that the work is inherently violent and oppressive and that all sex workers are directly or indirectly coerced into participating in the industry. This argument has been complicated by the recent proliferation of computer-mediated technologies that allow people to conduct sex work without the need to be physically co-present with customers or pimps. One example of this is the practice of ‘camming’, wherein ‘webcam models’ stream themselves stripping and/or performing autoerotic stimulation in an online chat-room for payment. In this presentation, interviews with eight ‘camgirls’ (aged 22-34) will be discussed. Their talk has been analyzed using Foucauldian discourse analysis, focusing on common discursive threads in relation to the work and their subjectivities. It was found that the participants demonstrated appreciation for the lack of physical danger they were in, but emphasized the unique and significant dangers of online-based sex work (their images and videos being recorded and shared without their consent, for example). Participants also argued that their largest concerns were based around stigma, which they claimed remained prevalent despite the decriminalized legal model in Aotearoa/New Zealand (which has been in place for over 14 years). Overall, this project seeks to challenge commonplace academic approaches to sex work, adding further research to support sex workers’ rights and highlighting new issues to consider in a digital environment.

Keywords: camming, sex work, stigma, risk

Procedia PDF Downloads 155
1458 Fraud in the Higher Educational Institutions in Assam, India: Issues and Challenges

Authors: Kalidas Sarma

Abstract:

Fraud is a social problem that changes with social change, and it has both regional and global impacts. The introduction of the private domain in higher education alongside public institutions has led to the commercialization of higher education, which has encouraged an unprecedented mushrooming of private institutions and, in turn, fraudulent activities in the higher educational institutions of Assam, India. Fraud has been noticed in in-service promotions and in fake entry qualifications, with teachers at different levels of the workplace using fake master's, Master of Philosophy, and Doctor of Philosophy degree certificates. The aim of the study is to identify grey areas in the maintenance of quality in higher educational institutions in Assam and to draw the contours for planning and implementation. The study is based on both primary and secondary data, collected through questionnaires and through information sought under the Right to Information Act, 2005. Assam has 301 undergraduate and postgraduate colleges distributed across 27 administrative districts, with about 11,000 college teachers. A total of 421 college teachers from the 14 respondent colleges were taken for analysis. The collected data were analyzed using the 'Hypertext Preprocessor' (PHP) scripting language with MySQL and the Google Maps Application Programming Interface (API). Graphs were generated using the open source tool Chart.js, and spatial distribution maps were generated from the geo-references of the colleges. The results show that: (i) violations of the University Grants Commission's (UGC's) regulations for the award of M.Phil./Ph.D. degrees are clearly evident; (ii) there is a gap between the apex regulatory bodies of higher education at the national and state levels in checking fraud; (iii) mala fide 'No Objection Certificates' (NOCs) issued by the Government of Assam have played a pivotal role in the occurrence of fraudulent practices in the higher educational institutions of Assam; and (iv) violation of the verdict of the Hon'ble Supreme Court of India regarding the territorial jurisdiction of universities for the award of Ph.D. and M.Phil. degrees in distance mode/study centres is also a responsible factor in the spread of these academic frauds in Assam and other states. The challenges and the mitigation of these issues are also discussed.
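The district-wise aggregation underlying the study's graphs and maps can be sketched in Python rather than the PHP/MySQL stack the authors actually used; the district names and irregularity types below are hypothetical sample records, not data from the study:

```python
from collections import Counter

# Hypothetical survey records: (district, reported irregularity).
# In the study, responses were stored in MySQL and processed with PHP.
records = [
    ("Kamrup", "fake M.Phil. certificate"),
    ("Kamrup", "mala fide NOC"),
    ("Barpeta", "fake Ph.D. certificate"),
    ("Nagaon", "mala fide NOC"),
    ("Kamrup", "fake Ph.D. certificate"),
]

# Count reported irregularities per district, the kind of aggregate
# that would feed district-wise graphs (Chart.js) and spatial maps.
per_district = Counter(district for district, _ in records)

for district, count in per_district.most_common():
    print(district, count)
```

The per-district counts, joined with the colleges' geo-references, are what the spatial distribution maps would plot.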

Keywords: Assam, fraud, higher education, mitigation

Procedia PDF Downloads 167
1457 Effects of the Affordable Care Act On Preventive Care Disparities

Authors: Cagdas Agirdas

Abstract:

Background: The Affordable Care Act (ACA) requires non-grandfathered private insurance plans, starting with plan years on or after September 23rd, 2010, to provide certain preventive care services without any cost sharing in the form of deductibles, copayments, or co-insurance. This requirement may affect racial and ethnic disparities in preventive care, as it provides the largest copay reduction in preventive care to date. Objectives: We ask whether the ACA’s free preventive care benefits are associated with a reduction in racial and ethnic disparities in the utilization of four preventive services: cholesterol screenings, colonoscopies, mammograms, and pap smears. Methods: We use a data set of over 6,000 individuals from the 2009, 2010, and 2013 Medical Expenditure Panel Surveys (MEPS). We restrict our data set to individuals who are old enough to be eligible for each preventive service. Our difference-in-differences logistic regression model classifies privately-insured Hispanics, African Americans, and Asians as the treatment groups and 2013 as the after-policy year. Our control group consists of non-Hispanic whites on Medicaid, as this program already covered preventive care services for free or at a low cost before the ACA. Results: After controlling for income, education, marital status, preferred interview language, self-reported health status, employment, having a usual source of care, age, and gender, we find that the ACA is associated with increases of 3.6% and 3.1% in the probability that the median privately-insured Hispanic person receives a colonoscopy and a mammogram, respectively, compared to a non-Hispanic white person on Medicaid. Similarly, we find that the median privately-insured African American person’s probability of receiving these two preventive services improved by 2.3% and 2.4% compared to a non-Hispanic white person on Medicaid. We do not find any significant improvements for any racial or ethnic group for cholesterol screenings or pap smears.
Furthermore, our results do not indicate any significant changes for Asians compared to non-Hispanic whites in utilizing the four preventive services. These reductions in racial/ethnic disparities are robust to reconfigurations of time periods, previous diagnosis, and residential status. Conclusions: Early effects of the ACA’s provision of free preventive care are significant for Hispanics and African Americans. Further research is needed for the later years as more individuals became aware of these benefits.
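The difference-in-differences logic behind the model can be illustrated at the level of group means (the study itself uses a logistic regression with controls, which this sketch deliberately simplifies); all utilization rates below are invented, not actual MEPS estimates:

```python
# Difference-in-differences on group-level utilization rates.
# All numbers are illustrative, not actual MEPS estimates.

# Mammogram utilization rates (share of eligible women screened).
treat_pre, treat_post = 0.60, 0.66      # treatment group, before/after the ACA
control_pre, control_post = 0.62, 0.63  # control group (Medicaid), same years

# DiD estimate: change in the treatment group minus change in the control
# group, which nets out common time trends shared by both groups.
did = (treat_post - treat_pre) - (control_post - control_pre)
print(round(did, 3))  # prints 0.05
```

In the paper's setting the same contrast is embedded in a logistic model as a treatment-by-period interaction, with the listed covariates as controls.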

Keywords: preventive care, Affordable Care Act, cost sharing, racial disparities

Procedia PDF Downloads 153
1456 Gender Equality at Workplace in Iran - Strategies and Successes Against Systematic Bias

Authors: Leila Sadeghi

Abstract:

Gender equality is a critical concern in the workplace, particularly in Iran, where legal and social barriers contribute to significant disparities. This abstract presents a case study of Dahi Bondad Co., a company based in Tehran, Iran, that recognized the urgency of addressing the gender gap within its organization. Through a comprehensive investigation, the company identified issues related to biased recruitment, pay disparities, promotion biases, internal barriers, and everyday boundaries. This abstract highlights the strategies implemented by Dahi Bondad Co. to combat these challenges and foster gender equality. The company revised its recruitment policies, eliminated gender-specific language in job advertisements, and implemented blind resume screening to ensure equal opportunities for all applicants. Comprehensive pay equity analyses were conducted, leading to salary adjustments based on qualifications and experience to rectify pay disparities. Clear and transparent promotion criteria were established, and training programs were provided to decision-makers to raise awareness about unconscious biases. Additionally, mentorship and coaching programs were introduced to support female employees in overcoming self-limiting beliefs and imposter syndrome. At the same time, practical workshops and gamification techniques were employed to boost confidence and encourage women to step out of their comfort zones. The company also recognized the importance of dress codes and allowed optional hijab-wearing, respecting local traditions while promoting individual freedom. As a result of these strategies, Dahi Bondad Co. successfully fostered a more equitable and empowering work environment, leading to increased job satisfaction for both male and female employees within a short timeframe.
This case study serves as an example of practical approaches that human resource managers can adopt to address gender inequality in the workplace, providing valuable insights for organizations seeking to promote gender equality in similar contexts.

Keywords: gender equality, human resource strategies, legal barrier, social barrier, successful result, successful strategies, workplace in Iran

Procedia PDF Downloads 67
1455 Thermal Vacuum Chamber Test Result for CubeSat Transmitter

Authors: Fitri D. Jaswar, Tharek A. Rahman, Yasser A. Ahmad

Abstract:

CubeSats in low Earth orbit (LEO) mainly use an ultra high frequency (UHF) transmitter with fixed radio frequency (RF) output power to download the telemetry and the payload data. The transmitter consumes a large amount of electrical energy during transmission, considering the limited size of a CubeSat. A transmitter with power control capability was therefore designed to optimize the signal-to-noise ratio (SNR) and achieve efficient power consumption. In this paper, a thermal vacuum chamber (TVAC) test is performed to validate the performance of the UHF-band transmitter with power control capability. The TVAC is used to simulate the conditions the satellite experiences in the outer space environment. The TVAC test was conducted at the Laboratory of Spacecraft Environment Interaction Engineering, Kyushu Institute of Technology, Japan. The test used four thermal cycles spanning +60°C to -20°C, and the pressure inside the chamber was kept below 10⁻⁵ Pa. During the test, the UHF transmitter was integrated in a CubeSat configuration with other CubeSat subsystems such as the on-board computer (OBC), the power module, and the satellite structure. The system was validated and verified through its performance in terms of frequency stability and RF output power. The UHF-band transmitter output power was tested from 0.5 W to 2 W according to the satellite's modes of operation and power limitations. The measured frequency stability was less than 2 ppm over the tested operating temperature range. The test demonstrates that the RF output power is adjustable under thermal vacuum conditions.
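The reported stability figure uses the standard parts-per-million definition of fractional frequency deviation; the 437 MHz nominal frequency and the drift below are assumed examples, not the satellite's actual downlink values:

```python
def stability_ppm(f_measured_hz, f_nominal_hz):
    """Fractional frequency deviation in parts per million (ppm)."""
    return (f_measured_hz - f_nominal_hz) / f_nominal_hz * 1e6

# Assumed 437 MHz UHF downlink with a 600 Hz drift over a thermal cycle.
f_nominal = 437.0e6
f_measured = f_nominal + 600.0
print(round(stability_ppm(f_measured, f_nominal), 3))  # prints 1.373
```

A 600 Hz drift at 437 MHz is about 1.37 ppm, which would sit inside the 2 ppm envelope reported for the tested temperature range.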

Keywords: communication system, CubeSat, SNR, UHF transmitter

Procedia PDF Downloads 264
1454 The Impact of Artificial Intelligence on Pharmacy and Pharmacology

Authors: Mamdouh Milad Adly Morkos

Abstract:

Despite having the highest rates of mortality and morbidity in the world, low- and middle-income (LMIC) nations trail high-income nations in the number of clinical trials, the number of qualified researchers, and the amount of research information specific to their populations. Health inequities and the use of precision medicine may be hampered by a lack of local genomic data, of clinical pharmacology and pharmacometrics competence, and of training opportunities. These issues can be addressed by developing health care infrastructure, including data gathering and well-designed clinical pharmacology training in LMICs. International cooperation aimed at enhancing education and infrastructure and at promoting locally motivated clinical trials and research will be advantageous. This paper outlines various instances where clinical pharmacology knowledge could be put to use, including pharmacogenomic opportunities that could lead to better clinical guideline recommendations. Examples of how clinical pharmacology training can be successfully implemented in LMICs are also provided, including clinical pharmacology and pharmacometrics training programmes in Africa and a Tanzanian researcher's personal experience while on a training sabbatical in the United States. These training initiatives will profit from advocacy for clinical pharmacologists' employment prospects and career development pathways, which are gradually becoming acknowledged and established in LMICs. The advancement of training and research infrastructure to increase clinical pharmacologists' knowledge in LMICs would be extremely beneficial, because clinical pharmacologists have a significant role to play in global health.

Keywords: electromagnetic solar system, nano-material, nano pharmacology, pharmacovigilance, quantum theory, clinical simulation, education, pharmacology, simulation, virtual learning, low- and middle-income, clinical pharmacology, pharmacometrics, career development pathways

Procedia PDF Downloads 81
1453 Architectural Adaptation for Road Humps Detection in Adverse Light Scenario

Authors: Padmini S. Navalgund, Manasi Naik, Ujwala Patil

Abstract:

A road hump is a semi-cylindrical elevation built across the road at specific locations. A vehicle needs to maneuver over the hump at reduced speed to avoid damage and pass safely. Road humps, if identified in advance, help maintain the safety and stability of vehicles, especially in adverse visibility conditions, viz., night scenarios. We have proposed a deep learning architecture adaptation, implementing the MISH activation function and developing a new classification loss function called "Effective Focal Loss", for Indian road hump detection in adverse light scenarios. We captured images comprising marked and unmarked road humps with two different types of cameras across South India to build a heterogeneous dataset, which enabled the algorithm to train and improve detection accuracy. The images were pre-processed and annotated for two classes, viz., marked hump and unmarked hump, and the resulting dataset was used to train a single-stage object detection algorithm. We also used an algorithm to synthetically generate reduced-visibility road hump scenarios. We observed that our proposed framework effectively detected marked and unmarked humps in both clear and adverse light environments. This architectural adaptation sets up an option for early detection of Indian road humps in reduced visibility conditions, thereby enabling autonomous driving technology to handle a wider range of real-world scenarios.
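The two ingredients of the adaptation can be made concrete. Mish has the standard closed form x·tanh(softplus(x)); the loss shown below is ordinary focal loss, not the paper's "Effective Focal Loss", whose exact definition is not given in the abstract:

```python
import math

def mish(x):
    """Mish activation: x * tanh(softplus(x)), a smooth ReLU alternative."""
    return x * math.tanh(math.log1p(math.exp(x)))

def focal_loss(p, gamma=2.0):
    """Standard focal loss for a positive example with predicted probability p.
    The (1 - p)^gamma factor down-weights easy examples relative to
    plain cross-entropy, focusing training on hard detections."""
    return -((1.0 - p) ** gamma) * math.log(p)

print(round(mish(1.0), 4))   # prints 0.8651
print(focal_loss(0.9) < focal_loss(0.3))  # easy example loses weight: True
```

In a detector like the one described, such a loss would replace the classification term for the marked/unmarked hump classes; the paper's variant presumably modifies the weighting further.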

Keywords: Indian road hump, reduced visibility condition, low light condition, adverse light condition, marked hump, unmarked hump, YOLOv9

Procedia PDF Downloads 24
1452 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open source EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch-Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually perform the node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
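The message-passing step at the heart of such a GNN (implemented in the study with PyTorch-Geometric) can be illustrated with one symmetrically normalized GCN propagation on a toy graph; the three-node graph and its features below are invented, not Synthea data:

```python
import math

# Toy graph: 3 nodes (say patient, condition, prescription), edges 0-1, 1-2.
edges = [(0, 1), (1, 2)]
n = 3

# Adjacency with self-loops (A + I), as in the GCN propagation rule.
adj = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
for i, j in edges:
    adj[i][j] = adj[j][i] = 1.0

deg = [sum(row) for row in adj]

# Symmetric normalization: D^{-1/2} (A + I) D^{-1/2}.
norm = [[adj[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
        for i in range(n)]

# One propagation step mixes each node's features with its neighbors'.
features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # made-up node features
out = [[sum(norm[i][k] * features[k][c] for k in range(n)) for c in range(2)]
       for i in range(n)]
print(out)
```

Stacking such steps with learned weight matrices, and training the embeddings with an autoencoder objective, gives the node representations on which the condition classifier operates.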

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 86
1451 Infrared Thermography as an Informative Tool in Energy Audit and Software Modelling of Historic Buildings: A Case Study of the Sheffield Cathedral

Authors: Ademuyiwa Agbonyin, Stamatis Zoras, Mohammad Zandi

Abstract:

This paper investigates the extent to which building energy modelling can be informed by preliminary information provided by infrared thermography, using a thermal imaging camera in a walkthrough audit. The case-study building is Sheffield Cathedral, built in the early 1400s. Based on a qualitative report generated from the thermal images taken at the site, the regions showing significant heat loss are input into a computer model of the cathedral within the Integrated Environmental Solutions (IES) Virtual Environment software, which performs an energy simulation to determine quantitative heat losses through the building envelope. Building data such as material thermal properties and building plans were provided by the architects, Thomas Ford and Partners Ltd. The modelling revealed the portions of the building with the highest heat loss, and these aligned with those suggested by the thermal camera. Retrofit options for the building are also considered; however, they may not see implementation due to a desire to conserve the building's architectural heritage. Results show that thermal imaging in a walk-through audit serves as a useful guide for the energy modelling process. Hand calculations were also performed to serve as a 'control' for estimating losses, providing a second set of data points for comparison.
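The 'control' hand calculations mentioned above are typically based on the steady-state fabric heat loss formula Q = U·A·ΔT; the U-value, area, and temperatures below are assumed illustrative figures, not the cathedral's actual data:

```python
def fabric_heat_loss_w(u_value, area_m2, delta_t):
    """Steady-state conductive heat loss through a building element, in watts.
    Q = U * A * dT, with U in W/(m^2*K), A in m^2, dT in kelvin."""
    return u_value * area_m2 * delta_t

# Assumed values: uninsulated solid stone wall, indoor 19 C, outdoor 4 C.
q = fabric_heat_loss_w(u_value=2.1, area_m2=150.0, delta_t=15.0)
print(round(q, 1))  # prints 4725.0 (watts, for this single element)
```

Summing such terms over the envelope elements flagged by the thermal survey gives the rough whole-building figure that the IES simulation can then refine.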

Keywords: historic buildings, energy retrofit, thermal comfort, software modelling, energy modelling

Procedia PDF Downloads 170
1450 Anatomy of the Challenges, Problems and Prospects of Polytechnic Administration in North-Central Nigeria

Authors: A. O. Osabo

Abstract:

Polytechnic education is often described as the only sustainable academic institution that can propel massive industrial and technological growth and development in all sectors of the Nigerian economy. Because of its emphasis on science and technology, its practical demonstration of skills, and its pivotal role in the training of low- and high-cadre technologists and technocrats to man critical sectors of the economy, the administration of polytechnics needs to be run according to global best standards and practices in order to achieve their goals and objectives. Besides, polytechnics need to be headed by seasoned and academically sound professionals to pursue their goals and objectives as centres of technology, learning, and academic excellence. Over the years, however, polytechnics in Nigeria have suffered a myriad of administrative problems and challenges which have prevented them from achieving their basic goals and objectives. Apart from regulatory problems and challenges, some heads of polytechnics do not demonstrate the leadership and management skills needed to bring the desired innovations to the polytechnics under them. These have resulted, in most cases, in the polytechnics not performing optimally in their mandates. This paper examines the administrative problems, challenges, and prospects of polytechnic education in north-central Nigeria. Using a total of 97 questionnaires consisting of semi-structured, yes-or-no interview questions administered to staff and students of the selected polytechnics, and a descriptive statistical method of analysis, the study found that the inability of the polytechnics to meet their goals and objectives is caused by administrative and organizational problems and challenges relating to funding, accreditation, manpower, corruption, and maladministration, among others.
The paper thus suggests that the leadership of the polytechnics must rise to the demands of the time in order to deal with the administrative problems and challenges affecting them and fulfil the goals and objectives for which the schools were established.

Keywords: education, administration, polytechnic, accreditation, Nigerian

Procedia PDF Downloads 264
1449 The Relations Between Hans Kelsen’s Concept of Law and the Theory of Democracy

Authors: Monika Zalewska

Abstract:

Hans Kelsen was a versatile legal thinker whose achievements in the fields of legal theory, international law, and the theory of democracy are remarkable. All of the fields tackled by Kelsen are regarded as part of his “pure theory of law.” While the link between international law and Kelsen’s pure theory of law is apparent, the same cannot be said about the link between the theory of democracy and his pure theory of law. On the contrary, the general thinking concerning Kelsen’s thought is that it can be used to legitimize authoritarian regimes. The aim of this presentation is to address this concern by identifying the common ground between Kelsen’s pure theory of law and his theory of democracy and to show that they are compatible in a way that his pure theory of law and authoritarianism cannot be. The conceptual analysis of the purity of Kelsen’s theory and his goal of creating ideology-free legal science hints at how Kelsen’s pure theory of law and the theory of democracy are brought together. The presentation will first demonstrate that these two conceptions have common underlying values and meta-ethical convictions. Both are founded on relativism and a rational worldview, and the aim of both is peaceful co-existence. Second, it will be demonstrated that the separation of law and morality provides the maximum space for deliberation within democratic processes. The conclusion of this analysis is that striking similarities exist between Kelsen’s legal theory and his theory of democracy. These similarities are grounded in the Enlightenment tradition and its values, including rationality, a scientific worldview, tolerance, and equality. This observation supports the claim that, for Kelsen, legal positivism and the theory of democracy are not two separate theories but rather stem from the same set of values and from Kelsen’s relativistic worldview. Furthermore, three main issues determine Kelsen’s orientation toward a positivistic and democratic outlook. 
The first, which is associated with personality type, is the distinction between absolutism and relativism. The second, which is associated with the values that Kelsen favors in the social order, is peace. The third is legality, which creates the necessary condition for democracy to thrive and reveals that democracy is capable of fulfilling Kelsen’s ideal of law at its fullest. The first two categories exist in the background of Kelsen’s pure theory of law, while the latter is an inherent part of Kelsen’s concept of law. The analysis of the text concerning natural law doctrine and democracy indicates that behind the technical language of Kelsen’s pure theory of law is a strong concern with the trends that appeared after World War I. Despite his rigorous scientific mind, Kelsen was deeply humanistic. He tried to create a powerful intellectual weapon to provide strong arguments for peaceful coexistence and a rational outlook in Europe. The analysis provided by this presentation facilitates a broad theoretical, philosophical, and political understanding of Kelsen’s perspectives and, consequently, urges a strong endorsement of Kelsen’s approach to constitutional democracy.

Keywords: Hans Kelsen, democracy, legal positivism, pure theory of law

Procedia PDF Downloads 110
1448 For a Poetic Clinic: Experimentations at Risk on the Images in Performances

Authors: Juliana Bom-Tempo

Abstract:

The proposed composition occurs between images, performances, clinics, and philosophies. For this enterprise, we depart from what is not known beforehand, with a question as a compass: "would there be, in the creation, production, and execution of images in a performance, a 'when' for the event of a poetic clinic?" In light of this, in order to think a 'when' of the event of a poetic clinic, there are images in performances created, produced, and executed in partnership with the author of this text. Faced with this composition, we built four indicators to find the spatiotemporal coordinates that would spot that "when", namely: risk zones; the mobilization of signs; the figuring of the flesh; and an education of the affections. We dealt with the images in performances Crútero; Flesh; Karyogamy and the risk of abortion; Egg white; Egg-mouth; Islands, threads, words ... germs; and Egg-Mouth-Debris, taken as case studies: by engendering risk zones to promote individuations, which never actualize thoroughly, something pre-individual thus always remaining while an environment is also individuated; by mobilizing the signs territorialized by the ordinary, causing them to vary the language and the words of order dictated by the everyday into other compositions of sense, other machinations; by generating a figure of flesh, disarranging the bodies and isolating them in the production of a ground force that causes the body to leak out and undoes the functionalities of the organs; and, finally, by producing an education of the affections, placing perception in becoming and disconnecting the visible in the production of small deserts that call for the creation of a people yet to come. The performance proceeds as a problematization of the images fixed by the ordinary, producing gestures that precipitate the individuation of images in performance, strange to the configurations that gather bodies and spaces in what we call common.
Lawrence proposes to think of "people" who continually use umbrellas to protect themselves from chaos. These have the function of wrapping up the chaos in visions that create houses, forms, and stabilities; they paint a sky at the bottom of the umbrella, where people march and die; a chaos where people live and wither. To pierce the umbrella is to desire chaos; a poet sets himself as an enemy of convention, so as to have an image of chaos and a little sun that burns his skin. The images in performances presented here were thereby moving in search of the power to produce a spatiotemporal "when", putting territories in risk zones, mobilizing the signs that format the day-to-day, and opening the bodies to disorganization and to the production of an education of the affections for the event of a poetic clinic.

Keywords: experimentations, images in performances, poetic clinic, risk

Procedia PDF Downloads 114
1447 Automatic Detection and Update of Region of Interest in Vehicular Traffic Surveillance Videos

Authors: Naydelis Brito Suárez, Deni Librado Torres Román, Fernando Hermosillo Reynoso

Abstract:

Automatic detection and generation of a dynamic ROI (Region of Interest) in vehicle traffic surveillance videos from a static camera in Intelligent Transportation Systems is challenging for computer vision-based systems. The dynamic ROI, being a changing ROI, should capture any moving object located outside of a static ROI. In this work, the video is represented by a tensor model composed of a Background Tensor and a Foreground Tensor, the latter containing all moving vehicles or objects. The values of each pixel over a time interval are represented as time series, and some pixel rows were selected. This paper proposes a pixel entropy-based algorithm for automatic detection and generation of a dynamic ROI in traffic videos under the assumption of two types of theoretical pixel entropy behavior: (1) a pixel located on the road shows a high entropy value due to disturbances caused by vehicle traffic in this zone, and (2) a pixel located outside the road shows a relatively low entropy value. To study the statistical behavior of the selected pixels, detecting the entropy changes and consequently the moving objects, Shannon, Tsallis, and Approximate entropies were employed. Although Tsallis entropy achieved very good results in real time, Approximate entropy produced slightly better results at a higher computational cost.
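The two theoretical behaviors can be made concrete with Shannon and Tsallis entropy computed on discretized pixel time series; the two example series below are invented stand-ins for an on-road and an off-road pixel:

```python
import math
from collections import Counter

def shannon_entropy(series):
    """Shannon entropy (bits) of a discretized pixel intensity series."""
    n = len(series)
    probs = [c / n for c in Counter(series).values()]
    return -sum(p * math.log2(p) for p in probs)

def tsallis_entropy(series, q=2.0):
    """Tsallis entropy: (1 - sum of p^q) / (q - 1); reduces to Shannon as q -> 1."""
    n = len(series)
    probs = [c / n for c in Counter(series).values()]
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

road = [10, 200, 40, 180, 90, 220, 30, 150]  # disturbed by passing vehicles
off_road = [50, 50, 51, 50, 50, 51, 50, 50]  # nearly constant background

print(shannon_entropy(road) > shannon_entropy(off_road))  # prints True
print(tsallis_entropy(road) > tsallis_entropy(off_road))  # prints True
```

Thresholding such per-pixel entropy values over time is the kind of rule that separates road pixels (high entropy) from off-road pixels (low entropy) when building the dynamic ROI.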

Keywords: convex hull, dynamic ROI detection, pixel entropy, time series, moving objects

Procedia PDF Downloads 74
1446 A Spatial Approach to Model Mortality Rates

Authors: Yin-Yee Leong, Jack C. Yue, Hsin-Chung Wang

Abstract:

Human longevity has been experiencing its largest increase since the end of World War II, and modeling mortality rates is therefore often the focus of many studies. Among all mortality models, the Lee–Carter model is the most popular approach, since it is fairly easy to use and has good accuracy in predicting mortality rates (e.g., for Japan and the USA). However, empirical studies from several countries have shown that the age parameters of the Lee–Carter model are not constant in time. Many modifications of the Lee–Carter model have been proposed to deal with this problem, including adding an extra cohort effect and adding another period effect. In this study, we propose a spatial modification and use clusters to explain why the age parameters of the Lee–Carter model are not constant. In spatial analysis, clusters are areas with mortality rates unusually higher or lower than those of their neighbors, where the "location" of mortality rates is measured by age and time, that is, a 2-dimensional coordinate. We use a popular cluster detection method, spatial scan statistics (a local statistical test based on the likelihood ratio test), to identify locations with mortality rates that cannot be described well by the Lee–Carter model. We first use computer simulation to demonstrate that the cluster effect is a possible source of the problem of the age parameters not being constant. Next, we show that adding the cluster effect can solve the non-constancy problem. We also apply the proposed approach to mortality data from Japan, France, the USA, and Taiwan. The empirical results show that our approach has better fitting results and smaller mean absolute percentage errors than the Lee–Carter model.
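The Lee–Carter structure discussed above models the log central death rate as log m(x,t) = a_x + b_x·k_t; a minimal evaluation of that structure follows, with invented parameter values (the real a_x, b_x are estimated from data and k_t is usually fitted via singular value decomposition):

```python
import math

# Invented Lee-Carter parameters for three ages and three years.
a = {60: -4.0, 70: -3.2, 80: -2.4}      # a_x: average log mortality by age
b = {60: 0.10, 70: 0.08, 80: 0.05}      # b_x: age sensitivity to the period index
k = {2000: 2.0, 2010: 0.0, 2020: -2.0}  # k_t: declining period mortality index

def mortality_rate(age, year):
    """Central death rate under the Lee-Carter model: m = exp(a_x + b_x * k_t)."""
    return math.exp(a[age] + b[age] * k[year])

# Mortality improves over time because k_t declines and b_x > 0.
print(mortality_rate(70, 2000) > mortality_rate(70, 2020))  # prints True
```

The paper's spatial modification adds a cluster term on top of this a_x + b_x·k_t surface for (age, time) cells the base model fits poorly.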

Keywords: mortality improvement, Lee–Carter model, spatial statistics, cluster detection

Procedia PDF Downloads 171
1445 The Grammar of the Content Plane as a Style Marker in Forensic Authorship Attribution

Authors: Dayane de Almeida

Abstract:

This work aims at presenting a study that demonstrates the usability of categories of analysis from Discourse Semiotics (also known as Greimassian Semiotics) in authorship cases in forensic contexts. It is necessary to know whether the categories examined in semiotic analysis (the 'grammar' of the content plane) can distinguish authors. Thus, a study with four sets of texts from a corpus of 'not on demand' written samples (texts that differ in formality degree, purpose, addressees, themes, etc.) was performed. Each author contributed 20 texts, separated into 2 groups of 10 (Author1A, Author1B, and so on). The hypothesis was that texts from a single author are semiotically more similar to each other than texts from different authors. The assumptions and issues that led to this idea are as follows: -The features analyzed in authorship studies mostly relate to the expression plane: they are manifested on the 'surface' of texts. If language is both expression and content, content would also have to be considered for more accurate results. Style is present in both planes. -Semiotics postulates that the content plane is structured in a 'grammar' that underlies expression and that presents different levels of abstraction. This 'grammar' would be a style marker. -Sociolinguistics demonstrates intra-speaker variation: an individual employs different linguistic uses in different situations. Then, how can one determine whether someone is the author of several texts, distinct in nature (as is the case in most forensic sets), when it is known that intra-speaker variation depends on so many factors? -The idea is that the more abstract the level in the content plane, the lower the intra-speaker variation, because there is a greater chance of the author choosing the same option. If two authors recurrently choose the same options, each differently from the other, each one's choice has discriminatory power. -Size is another issue for various attribution methods.
Since most texts in real forensic settings are short, methods relying only on the expression plane tend to fail. The analysis of the content plane, as proposed by Greimassian semiotics, would be less size-dependent. -The semiotic analysis was performed using the software Corpus Tool, generating tags to allow the counting of data. Then, similarities and differences were quantitatively measured through the application of the Jaccard coefficient (a statistical measure that compares the similarities and differences between samples). The results confirmed the hypothesis and, hence, the grammatical categories of the content plane may successfully be used in questioned authorship scenarios.
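The Jaccard coefficient used to compare the tag profiles is |A ∩ B| / |A ∪ B|; the tag names below are hypothetical illustrations, not the study's actual Corpus Tool categories:

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of annotation tags:
    size of the intersection divided by size of the union."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical content-plane tags extracted from three text samples.
author1_a = {"euphoria", "conjunction", "pragmatic-program", "sanction"}
author1_b = {"euphoria", "conjunction", "manipulation", "sanction"}
author2_a = {"dysphoria", "disjunction", "manipulation", "competence"}

print(jaccard(author1_a, author1_b))  # prints 0.6 (same author, more overlap)
print(jaccard(author1_a, author2_a))  # prints 0.0 (different author)
```

Under the study's hypothesis, within-author pairs should score consistently higher than cross-author pairs, as in this toy case.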

Keywords: authorship attribution, content plane, forensic linguistics, Greimassian semiotics, intra-speaker variation, style

Procedia PDF Downloads 242
1444 Analysis of a Coupled Hydro-Sedimentological Numerical Model for the Western Tombolo of Giens

Authors: Yves Lacroix, Van Van Than, Didier Léandri, Pierre Liardet

Abstract:

The western Tombolo of the Giens peninsula in southern France, known as Almanarre beach, is subject to coastal erosion. We are using computer simulation in order to propose solutions to stop this erosion. Our aim was first to determine the main factors behind this erosion and to apply a coupled hydro-sedimentological numerical model based on observations and measurements that have been performed on the site for decades. We gathered all available information and data about waves, winds, currents, tides, bathymetry, the coastline, and sediments at the site. These were divided into two sets: one devoted to calibrating a numerical model using the Mike 21 software, the other serving as a reference for numerically comparing the present situation with what it could be if different types of underwater constructions were implemented. This paper presents the first part of the study: selecting and merging different sources into a coherent database, identifying the main erosion factors, and calibrating the coupled model against the selected reference period. Our results show that the numerical model calibrates with good fitting coefficients. They also show that winter south-western storm events, combined with depressive weather conditions, constitute a major factor of erosion, mainly due to wave impact in the northern part of the Almanarre beach. The combined impact of currents and wind is shown to be negligible.
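Calibration quality of the kind reported above is typically scored by comparing simulated and observed series; one common choice is the Nash–Sutcliffe efficiency, sketched here with invented values (the abstract does not state which fitting coefficient was used):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; values near 1 indicate good calibration."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Invented significant wave heights (m): measured vs. model output.
observed = [0.8, 1.2, 2.5, 1.9, 1.0]
simulated = [0.9, 1.1, 2.3, 2.0, 1.1]
print(round(nash_sutcliffe(observed, simulated), 3))  # prints 0.96
```

The same score can be computed per variable (waves, currents, water levels) over the reference period to judge whether the coupled model is fit for the comparison runs with underwater constructions.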

Keywords: Almanarre beach, coastal erosion, hydro-sedimentological, numerical model

Procedia PDF Downloads 376