Search results for: GCV Tool
1458 Rethinking the Air Quality Health Index: Harmonizing Health Protection and Climate Mitigation
Authors: Kimberly Tasha Jiayi Tang, Changqing Lin, Zhe Wang, Tze-Wai Wong, Md. Shakhaoat Hossain, Jian Yu, Alexis Lau
Abstract:
Hong Kong has practiced a risk-based Air Quality Health Index (AQHI) system that sums the hospitalization risks associated with short-term exposure to air pollutants. As an air pollution risk communication tool, it informs the public about current air quality, anchoring around the World Health Organization's (WHO) 2005 Air Quality Guidelines (AQGs). Given the WHO's recent update in 2021, it is essential to assess how Hong Kong's air quality risk communication can be enhanced using these updated guidelines. Hong Kong's AQHI is limited by focusing solely on short-term health risks, which could lead the public to underestimate cumulative health impacts. We therefore propose the introduction of a composite AQHI that reports both long-term and short-term health risks. Additionally, the WHO interim targets will be considered as anchor points for the various health risk categories. Furthermore, with increasing ozone levels in Hong Kong and Southern China due to improved NOx mitigation measures, balancing health protection against climate mitigation has become a challenging task. However, our findings present a promising outlook. Despite the rise in ozone levels, the combined health risks in Hong Kong and Guangdong have declined, largely due to reductions in NO2 and PM concentrations, both of which have significant health implications. By shifting from a concentration-based approach to a health risk-based system like the AQHI, our study highlights the prospect of harmonizing health protection and climate mitigation goals. This health-focused framework suggests that rigorous NOx controls can effectively serve both objectives in parallel.
Keywords: air quality management, air quality health index, health risk management, air pollution
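As an illustration of the additive risk formulation behind such an index, the sketch below sums the percentage excess hospitalization risk over pollutants and maps the total to a reporting band. The risk coefficients and band cut points are hypothetical placeholders, not the values estimated in this study.

```python
import math

# Hypothetical short-term risk coefficients (per ug/m3) -- placeholders only,
# not the coefficients fitted in this study.
BETA = {"NO2": 0.0004, "O3": 0.0005, "PM2.5": 0.0003, "SO2": 0.0001}

def percent_excess_risk(conc_ugm3: dict) -> float:
    """Additive %excess risk: sum over pollutants of (exp(beta * c) - 1) * 100."""
    return sum((math.exp(BETA[p] * c) - 1.0) * 100.0 for p, c in conc_ugm3.items())

def aqhi_band(er: float) -> str:
    """Map %excess risk to a reporting category (cut points are illustrative)."""
    for cut, band in [(1.9, "low"), (3.8, "moderate"), (7.7, "high")]:
        if er <= cut:
            return band
    return "very high / serious"

er = percent_excess_risk({"NO2": 60.0, "O3": 80.0, "PM2.5": 25.0, "SO2": 5.0})
print(f"{er:.2f}% excess risk -> {aqhi_band(er)}")
```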
Procedia PDF Downloads 71
1457 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge
Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi
Abstract:
Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place, and timely intervention will most probably reduce future maintenance costs, minimize downtime, and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible, decisions must be made at the right time, which implies using systems that can detect abnormalities at an early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructure. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements, using the method suggested in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial Neural Networks are trained, and their ability to predict structural behavior is evaluated in the light of a case study where acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is still in operation despite obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises clustering analysis and statistical hypothesis testing. These enable the interpretation of the obtained prediction errors, support conclusions about the state of the structure, and thus inform decision-making regarding its maintenance.
Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring
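A minimal sketch of this kind of model-free pipeline, on synthetic data: a network learns to predict the next acceleration sample from a window of past samples on reference (healthy) data, and a hypothesis test checks whether prediction errors on newly acquired data have shifted. The architecture, window length, and significance threshold are illustrative assumptions, not the study's settings.

```python
import numpy as np
from scipy import stats
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def windows(signal, lag=10):
    # lagged samples as inputs, next sample as target
    X = np.stack([signal[i:i + lag] for i in range(len(signal) - lag)])
    return X, signal[lag:]

# Synthetic stand-ins for reference and newly acquired acceleration records
reference = np.sin(np.linspace(0, 60 * np.pi, 4000)) + 0.05 * rng.standard_normal(4000)
current = np.sin(np.linspace(0, 60 * np.pi, 4000)) + 0.15 * rng.standard_normal(4000)

X_ref, y_ref = windows(reference)
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(X_ref, y_ref)

err_ref = np.abs(model.predict(X_ref) - y_ref)
X_new, y_new = windows(current)
err_new = np.abs(model.predict(X_new) - y_new)

# Welch's t-test: have prediction errors shifted relative to the baseline?
t, p = stats.ttest_ind(err_new, err_ref, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3g} ->",
      "anomaly flagged" if p < 0.01 else "no change detected")
```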
Procedia PDF Downloads 207
1456 An Evaluation on the Effectiveness of a 3D Printed Composite Compression Mold
Authors: Peng Hao Wang, Garam Kim, Ronald Sterkenburg
Abstract:
The applications of composite materials within the aviation industry have been increasing at a rapid pace. However, the growing applications of composite materials have also led to growing demand for tooling to support the associated manufacturing processes. Tooling and tooling maintenance represent a large portion of composite manufacturing effort and cost. Therefore, the industry's adaptability to new techniques for fabricating high-quality tools quickly and inexpensively will play a crucial role in the growing popularity of composite materials in the aviation industry. One popular tool fabrication technique currently being developed involves additive manufacturing, such as 3D printing. Although additive manufacturing and 3D printing are not entirely new concepts, the technique has been gaining popularity due to its ability to quickly fabricate components at low cost and with low material waste. In this study, a team of Purdue University School of Aviation and Transportation Technology (SATT) faculty and students investigated the effectiveness of a 3D printed composite compression mold. The mold was fabricated by 3D scanning a steel valve cover of an aircraft reciprocating engine and was then used to fabricate carbon fiber versions of the valve cover. The 3D printed composite compression mold was evaluated for its performance, durability, and dimensional stability, while the fabricated carbon fiber valve covers were evaluated for their accuracy and quality. The results and data gathered from this study will determine the effectiveness of the 3D printed composite compression mold in a mass-production environment and provide valuable information for the future understanding, improvement, and design of 3D printed composite molds.
Keywords: additive manufacturing, carbon fiber, composite tooling, molds
Procedia PDF Downloads 197
1455 Assimilating Remote Sensing Data Into Crop Models: A Global Systematic Review
Authors: Luleka Dlamini, Olivier Crespo, Jos van Dam
Abstract:
Accurately estimating crop growth and yield is pivotal for timely, sustainable agricultural management and for ensuring food security. Crop models and remote sensing can complement each other and, when combined, form a robust analysis tool to improve crop growth and yield estimation. This study thus aims to systematically evaluate how research that exclusively focuses on assimilating remote sensing (RS) data into crop models varies among countries, crops, data assimilation methods, and farming conditions. A strict search string was applied in the Scopus and Web of Science databases, and 497 potential publications were obtained. After screening for relevance with predefined inclusion/exclusion criteria, 123 publications were considered in the final review. Results indicate that over 81% of the studies were conducted in countries associated with high socio-economic and technological advancement, mainly China, the United States of America, France, Germany, and Italy. Many of these studies integrated MODIS or Landsat data into WOFOST to improve crop growth and yield estimation of staple crops at the field and regional scales. Most studies use recalibration or updating methods alongside various algorithms to assimilate remotely sensed leaf area index into crop models. However, these methods cannot account for the uncertainties in remote sensing observations and in the crop model itself. Over 85% of the studies were based on commercial and irrigated farming systems. Despite great global interest in data assimilation into crop models, limited research has been conducted in resource- and data-limited regions like Africa. We foresee great potential for such applications in those conditions and hence for facilitating and expanding the use of this approach, from which developing farming communities could benefit.
Keywords: crop models, remote sensing, data assimilation, crop yield estimation
Procedia PDF Downloads 129
1453 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models
Authors: R. Hellmuth
Abstract:
The method of factory planning has changed considerably, especially when it comes to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity, and Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: What kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for various use cases are analysed. The investigation covers point cloud models, building information models, and photogrammetry models, as well as versions of these enriched with sensor data. It is examined which digital models allow a simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to these application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model by providing information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and display of sensor data. In summary, the paper presents a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated.
Keywords: building information modeling, digital factory model, factory planning, maintenance
Procedia PDF Downloads 109
1452 Exploring Community Benefits Frameworks as a Tool for Addressing Intersections of Equity and the Green Economy in Toronto's Urban Development
Authors: Cheryl Teelucksingh
Abstract:
Toronto is in the midst of an urban development and infrastructure boom. Population growth and concerns about urban sprawl and carbon emissions have led to pressure on the municipal and provincial governments to rethink urban development. Toronto's approach to climate change mitigation and adaptation has positioned the emerging green economy as part of the solution. However, the emerging green economy may not benefit all Torontonians in terms of jobs, improved infrastructure, and enhanced quality of life. Community benefits agreements (CBAs) are comprehensive, negotiated commitments in which the founders and builders of major infrastructure projects formally agree to work with community interest groups based where the development is taking place, toward mutually beneficial environmental and labor market outcomes. When community groups are equitably represented in the process, they stand to benefit not only from the jobs created by the project itself but also from longer-term community benefits related to the quality of the completed work, including advocacy for communities' environmental needs. It is believed that green employment initiatives in Toronto should give greater consideration to best practices learned from community benefits agreements. Drawing on the findings of a funded qualitative study in Toronto (Canada), "The Green Gap: Toward Inclusivity in Toronto's Green Economy" (2013-2016), this paper examines an emergent CBA in Toronto connected to the development of a light rail transit project. Theoretical and empirical consideration is given to the research gaps around CBAs, the role of various stakeholders, and the potential for CBAs to gain traction in Toronto's urban development context. The narratives of various stakeholders across Toronto's green economy are interwoven with a discussion of the CBA model in Toronto and other jurisdictions.
Keywords: green economy in Toronto, equity, community benefits agreements, environmental justice, community sustainability
Procedia PDF Downloads 341
1451 Breeding for Hygienic Behavior in Honey Bees
Authors: Michael Eickermann, Juergen Junk
Abstract:
The Western honey bee (Apis mellifera) is threatened by a number of parasites; in particular, the devastating Varroa mite (Varroa destructor) is responsible for a high level of over-winter mortality, e.g., in Europe and the USA. While synthetic pesticides or organic acids have so far been the preferred means of controlling this parasite, breeding strategies for less susceptible honey bees are at an early stage. Hygienic behavior can be an important tool for controlling Varroa destructor. Worker bees with a high level of this behavior are able to detect infested brood in the cells under the wax lid during pupation and remove it from the hive. The processes underlying this behavior are only partly understood, but it is certain that hygienic behavior is heritable and can therefore be integrated into commercial breeding lines. In a first step, breeding lines with a high level of phenotypic hygienic behavior were identified using a bioassay for accurate assessment of this trait in a long-term national breeding program running in Luxembourg since 2015. Based on the artificial infestation of nucleus colonies with 150 phoretic Varroa destructor mites, the level of phenotypic hygienic behavior was determined by counting the number of mites in all stages twelve days after infestation. Nuclei with a high level of hygienic behavior were overwintered and used for breeding activities in the following years. Artificial insemination was used to combine different breeding lines; both Buckfast and Carnica lines were used. While Carnica lines offered only a modest increase in hygienic behavior, up to a maximum of 62.5%, Buckfast lines performed much better, with mean levels of more than 87.5%; some matings reached a level of 100%. Even at a level of 82.5%, Varroa mites are no longer able to reproduce in the colony. In a final step, nuclei with a high level of hygienic behavior were built up into full colonies and located at two places in Luxembourg to establish a drone congregation area. Local beekeepers can bring their nuclei to this location to mate their queens with drones offering a high level of hygienic behavior.
Keywords: agriculture, artificial insemination, honey bee, varroa destructor
Procedia PDF Downloads 134
1450 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research
Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez
Abstract:
Action research is a qualitative research methodology that leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have meant a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs, and organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization, applying the principles of the action research methodology; it is the result of a systematic review of the scientific literature. The design consists of seven fundamental stages, distributed across the three stages described in the action research methodology: 1) observe, 2) analyze, and 3) take action. Finally, this paper aims to offer an alternative tool to traditional information security management methodologies, intended to be applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and establish security controls before formulating IT solutions in any type of organization.
Keywords: action research, information security, information technology, methodological design, process virtualization, risk management
Procedia PDF Downloads 164
1449 An Educational Program Based on Health Belief Model to Prevent Non-alcoholic Fatty Liver Disease Among Iranian Women
Authors: Arezoo Fallahi
Abstract:
Background and purpose: Non-alcoholic fatty liver is one of the most common liver disorders and, as the most important cause of death from liver disease, has unpleasant consequences and complications. The aim of this study was to investigate the effect of an educational intervention based on the health belief model to prevent non-alcoholic fatty liver among women. Materials and Methods: This experimental study was performed among 110 women referring to comprehensive health service centers in Malayer City, west of Iran, in 2023. Using the convenience sampling method, the 110 participants were divided into experimental and control groups. The data collection tool included demographic characteristics and a questionnaire based on the health belief model. In the experimental group, three one-hour training sessions were conducted using pamphlets, lectures, and group discussions. Data were analyzed using SPSS software version 21, with correlation tests, paired t-tests, and independent t-tests. Results: The mean age of participants was 38.07±6.28 years, and most of the participants were middle-aged, married, housewives with academic education, middle-income, and overweight. After the educational intervention, the mean scores of the constructs, including perceived sensitivity (p=0.01), perceived severity (p=0.01), perceived benefits (p=0.01), cues to action (internal, p=0.01, and external, p=0.01), and perceived self-efficacy (p=0.01), were significantly higher in the experimental group than in the control group. The perceived barriers score in the experimental group decreased after the training (15.2 ± 3.9 vs. 11.2 ± 3.3, p<0.01). Conclusion: The findings of the study showed that the design and implementation of educational programs based on the constructs of the health belief model can be effective in preventing non-alcoholic fatty liver disease among women.
Keywords: health, education, belief, behaviour
Procedia PDF Downloads 50
1448 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency; the study combines quantitative and qualitative analysis. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI's predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
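As one concrete example of the kind of automation described, the sketch below flags likely duplicate master records with a simple string-similarity rule. The threshold and matching rule are illustrative assumptions; production MDM systems typically use richer blocking strategies and learned matchers.

```python
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a: str, b: str) -> float:
    # Normalized edit-based similarity in [0, 1]
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

records = {
    101: "Acme Corporation, 12 High St, Springfield",
    102: "ACME Corp., 12 High Street, Springfield",
    103: "Globex Industries, 4 Low Rd, Shelbyville",
}

# Pairwise comparison with an assumed 0.8 similarity threshold
duplicates = [
    (i, j)
    for (i, a), (j, b) in combinations(records.items(), 2)
    if similarity(a, b) >= 0.8
]
print(duplicates)  # expected to flag the two Acme variants
```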
Procedia PDF Downloads 16
1447 Static Application Security Testing Approach for Non-Standard Smart Contracts
Authors: Antonio Horta, Renato Marinho, Raimir Holanda
Abstract:
Considered an evolution of the blockchain, the Ethereum platform, besides allowing transactions of its cryptocurrency named Ether, allows the programming of decentralised applications (DApps) and smart contracts. However, this functionality has raised other types of threats, and the exploitation of smart contract vulnerabilities has caused companies to experience big losses. This research intends to figure out the number of contracts that are at risk of being drained. Through a deep investigation, more than two hundred thousand smart contracts currently available on the Ethereum platform were scanned, and it was estimated how much money is at risk. The experiment was based on a query run on Google BigQuery in July 2022, which returned 50,707,133 contracts published on the Ethereum platform. After applying the filtering criteria, the experiment got 430,584 smart contracts to download and analyse. The filtering criteria consisted of filtering out ERC20 and ERC721 contracts, contracts without transactions, and contracts without balance. Of the 430,584 smart contracts selected, only 268,103 had source code published on Etherscan; however, we discovered, using a hashing process, that there was contract duplication. Removing the duplicated contracts, the process ended up with 20,417 source codes, which were analysed using the open-source SAST tool SmartBugs with the Oyente and Securify algorithms. In the end, there was nearly $100,000 at risk of being drained from the potentially vulnerable smart contracts. It is important to note that the tools used in this study may generate false positives, which may interfere with the number of vulnerable contracts; to address this point, our next step in this research is to develop an application to test each contract in a parallel environment to verify the vulnerability. Finally, this study aims to alert users and companies to the risk of not properly creating and analysing their smart contracts before publishing them on the platform. Like any other application, smart contracts are at risk of having vulnerabilities which, in this case, may result in direct financial losses.
Keywords: blockchain, reentrancy, static application security testing, smart contracts
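The hashing step used to collapse duplicated sources can be sketched as follows; the whitespace normalization is an assumption about how near-identical copies were grouped, not a detail stated in the abstract.

```python
import hashlib

def source_hash(source: str) -> str:
    # Normalize whitespace so trivially reformatted copies hash identically
    canonical = " ".join(source.split())
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def dedupe(contracts: dict) -> dict:
    """Keep one representative source per unique hash (address -> source)."""
    seen = set()
    unique = {}
    for address, source in contracts.items():
        h = source_hash(source)
        if h not in seen:
            seen.add(h)
            unique[address] = source
    return unique

sample = {
    "0xaaa": "contract A { function f() public {} }",
    "0xbbb": "contract A {\n  function f() public {}\n}",  # reformatted copy
    "0xccc": "contract B { function g() public {} }",
}
print(len(dedupe(sample)))  # 2 -- the reformatted copy of A is collapsed
```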
Procedia PDF Downloads 86
1446 The Impact of Covid-19 on Anxiety Levels in the General Population of the United States: An Exploratory Survey
Authors: Amro Matyori, Fatimah Sherbeny, Askal Ali, Olayiwola Popoola
Abstract:
Objectives: The study evaluated the impact of COVID-19 on anxiety levels in the general population of the United States. Methods: The study used an online questionnaire adopting the Generalized Anxiety Disorder Assessment (GAD-7) instrument, a self-administered scale with seven items used as a screening tool and severity measure for generalized anxiety disorder. Participants rated the frequency of anxiety symptoms over the last two weeks on a Likert scale ranging from 0 to 3, and the item points were summed to provide the total score. Results: Thirty-two participants completed the questionnaire, among them 24 (83%) females and 5 (17%) males. The 18-24-year-old age range accounted for the most respondents. Only one of the participants tested positive for COVID-19, while 39% had a family member, friend, or colleague who tested positive for the coronavirus, and 10% had lost a family member, a close friend, or a colleague to COVID-19. Among the respondents, ten scored approximately five points on the GAD-7 scale, indicating mild anxiety; eight participants scored 10 to 14 points, placing them in the moderate anxiety category; and one individual, who scored 15 points, was categorized as having severe anxiety. Conclusions: Most of the respondents scored within the mild anxiety category during the COVID-19 pandemic, and severe anxiety was the least common among the participants. People who tested positive, or whose family members, close friends, or colleagues did, were more likely to experience anxiety, and participants who lost friends or family members were also at high risk of anxiety. Evidently, the outcomes of COVID-19 and excessive rumination about the pandemic put people under stress, which led to anxiety. Therefore, continuous assessment and monitoring of psychological outcomes during pandemics will help to establish early, well-informed interventions.
Keywords: anxiety and covid-19, covid-19 and mental health outcomes, influence of covid-19 on anxiety, population and covid-19 impact on mental health
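GAD-7 scoring is simple enough to state exactly: seven items rated 0-3 are summed, and the total (0-21) falls into the standard severity bands used above (5 mild, 10 moderate, 15 severe). A minimal sketch:

```python
def gad7(answers):
    """Score a GAD-7 questionnaire: seven items, each rated 0-3."""
    if len(answers) != 7 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("GAD-7 requires seven item scores in the range 0-3")
    total = sum(answers)
    if total >= 15:
        band = "severe"
    elif total >= 10:
        band = "moderate"
    elif total >= 5:
        band = "mild"
    else:
        band = "minimal"
    return total, band

print(gad7([1, 1, 0, 1, 1, 0, 1]))  # (5, 'mild')
```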
Procedia PDF Downloads 207
1445 Investigating Effective Factors on the Organizational Pathology of Knowledge Production in Islamic Azad University
Authors: Davoud Maleki, Neda Zamani
Abstract:
The purpose of this research was to investigate the factors affecting the organizational pathology of knowledge production in Islamic Azad University. The research is quantitative, survey-based, and applied in terms of its purpose. The statistical population included all full-time professors of the Islamic Azad Universities in the northern, southern, eastern, western, and central regions (the Islamic Azad Universities of Sari, Isfahan, Kerman, Khorramabad, and Shiraz), totalling 1,389 people. Based on the Cochran formula, 305 people were selected as the sample by random sampling. The research tool was a researcher-made questionnaire, whose validity was assessed from the professors' point of view and whose reliability, calculated with Cronbach's alpha, was 0.89. For data analysis, confirmatory factor analysis and structural equation modelling were used with SmartPLS 3 software. The findings showed that the variables of strategy, structure, and process directly, and the variable of strategy indirectly through the variables of structure and process, explained 96.8% of the pathology of knowledge production. Structure explained 49.6% and process 58.4% of the pathology of knowledge production. Of the changes in knowledge production, 38% are related to the direct effect of strategy, 39% to the effect of structure, and 32% to the direct effect of process; 70.5% of the changes in structure are related to the direct effect of strategy, 36.5% of the changes in process to the direct effect of strategy, and 46.3% of the changes in process to the direct effect of structure. According to the results, the pathology model of knowledge production in Islamic Azad University can serve as an effective model in the pathology of knowledge production and can improve the scientific level of knowledge producers.
Keywords: pathology of knowledge production, strategic issues, process issues, Islamic Azad University
Procedia PDF Downloads 16
1444 Active Part of the Burnishing Tool Effect on the Physico-Geometric Aspect of the Superficial Layer of 100C6 and 16NC6 Steels
Authors: Tarek Litim, Ouahiba Taamallah
Abstract:
Burnishing is a mechanical surface treatment that combines several beneficial effects on the two steel grades studied. Applying burnishing with a ball or a tip tool yields better roughness than turning; in addition, it consolidates the surface layers through work-hardening phenomena. The optimal effects are closely related to the treatment parameters and to the active part of the device. With a 78% improvement in roughness, burnishing can be classed as a finishing operation in the machining range; with a 44% gain in consolidation rate, the treatment is an effective process for material consolidation. These effects depend on several factors, of which V, f, P, r, and i have the most significant effects on both roughness and hardness. Ball or tip burnishing consolidates the surface layers of both 100C6 and 16NC6 steels by work hardening. For each steel grade and its mechanical treatment, the rational tensile curve was drawn, and Ludwik's law was used to better fit the work-hardening curve. For both grades, a material hardening law was established. For 100C6 steel, the results show a work-hardening coefficient and a consolidation rate of 0.513 and 44, respectively, compared to the surface layers processed by turning; for 16NC6 steel, the work-hardening coefficient is about 0.29. Hardness tests characterize the burnished depth well; the layer affected by work hardening can reach up to 0.4 mm. Simulation of the tests is of great importance to provide details at the local scale of the material. Conventional tensile curves provide a satisfactory indication of the toughness of the 100C6 and 16NC6 materials, and a simulation of the tensile curves revealed good agreement between the experimental and simulated results for both steels.
Keywords: 100C6 steel, 16NC6 steel, burnishing, work hardening, roughness, hardness
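Ludwik's law models the rational (true) stress-strain curve as sigma = sigma0 + K * eps^n, where n is the work-hardening coefficient reported above (0.513 and about 0.29 for the two grades). The sketch below fits the law to synthetic data; the stress values are invented for illustration, not measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def ludwik(eps, sigma0, K, n):
    # Ludwik hardening law: true stress as a function of true plastic strain
    return sigma0 + K * eps ** n

# Synthetic stress-strain points generated with a 100C6-like n of 0.513
eps = np.linspace(0.005, 0.15, 30)
rng = np.random.default_rng(1)
sigma = ludwik(eps, 350.0, 900.0, 0.513) + rng.normal(0.0, 5.0, eps.size)

(sigma0, K, n), _ = curve_fit(ludwik, eps, sigma, p0=(300.0, 800.0, 0.3))
print(f"sigma0 = {sigma0:.0f} MPa, K = {K:.0f} MPa, n = {n:.3f}")  # n near 0.513
```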
Procedia PDF Downloads 167
1443 The Effectiveness of Teaching Emotional Intelligence on Reducing Marital Conflicts and Marital Adjustment in Married Students of Tehran University
Authors: Elham Jafari
Abstract:
The aim of this study was to evaluate the effectiveness of emotional intelligence training in reducing marital conflict and improving marital adjustment in married students of the University of Tehran. In terms of purpose, this research is applied; in terms of data collection, it is a semi-experimental, pre-test/post-test design with a control group and a follow-up test. The statistical population consisted of all married students of the University of Tehran, from whom 30 married students were selected by convenience sampling; 15 were randomly assigned to the experimental group and 15 to the control group. Data were collected through field and library methods; the field instruments were two questionnaires, one on marital conflict and one on marital adjustment. To analyze the collected data, the demographic characteristics of the sample were first described at the descriptive level using statistical indicators in SPSS software. In inferential statistics, analysis of covariance was used. The results showed that the effect of the independent variable, emotional intelligence, on the reduction of marital conflicts is statistically significant; it can be inferred that emotional intelligence training reduced the marital conflicts of married students of the University of Tehran in the experimental group compared to the control group. The effect of emotional intelligence on marital adjustment was also statistically significant, and it can be inferred that emotional intelligence training improved the marital adjustment of the experimental group compared to the control group.
Keywords: emotional intelligence, marital conflicts, marital compatibility, married students
Procedia PDF Downloads 251
1442 "Empowering Minds and Unleashing Curiosity: DIY Biotechnology for High School Students in the Age of Distance Learning"
Authors: Victor Hugo Sanchez Rodriguez
Abstract:
Amidst the challenges posed by pandemic-induced lockdowns, traditional educational models have been disrupted. To bridge the distance-learning gap, our project introduces an innovative initiative focused on teaching high school students basic biotechnology techniques. We aim to empower young minds and foster curiosity by encouraging students to create their own DIY biotechnology laboratories using easily accessible materials found at home. This abstract outlines the key aspects of our project, highlighting its importance, methodology, and evaluation approach. In response to the pandemic's limitations, our project targets the delivery of biotechnology education at a distance. By engaging students in hands-on experiments, we seek to provide an enriching learning experience despite the constraints of remote learning; the DIY approach allows students to explore scientific concepts in a practical and enjoyable manner, nurturing their interest in biotechnology and molecular biology. For evaluation, we have adapted the Undergraduate Research Student Self-Assessment (URSSA), originally designed to assess professional-level research programs, to suit the context of biotechnology and molecular biology instruction for high school students. By applying this tool before and after the experimental sessions, we aim to gauge the program's impact on students' learning experiences and skill development. Our project's significance lies not only in its novel approach to teaching biotechnology but also in its adaptability to the current global crisis. By providing students with a stimulating and interactive learning environment, we hope to inspire educators and institutions to embrace creative solutions during challenging times. Moreover, the insights gained from our evaluation will inform future efforts to enhance distance-learning programs and promote accessible science education.
Keywords: DIY biotechnology, high school students, distance learning, pandemic education, undergraduate research student self-assessment (URSSA)
Procedia PDF Downloads 66
1441 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks
Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar
Abstract:
A DNA barcode is a short mitochondrial DNA fragment composed of nucleotides, each made up of a phosphate group, a sugar, and one of the nucleic bases (A, T, C, and G). Barcodes provide a good source of the information needed to classify living species, an intuition confirmed by many experimental results. Species classification with DNA barcode sequences has been studied by several researchers; the classification problem assigns unknown species to known ones by analyzing their barcodes, a task that has to be supported by reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete; to make this type of analysis feasible, heuristics like progressive alignment have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs shorter regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable; this approach avoids the complex problem of form and structure in different classes of organisms. The method is evaluated on empirical data, and its classification performance is compared with other methods. Our system consists of three phases. The first, transformation, is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) codification of DNA barcodes, Fourier transform, and power spectrum signal processing. The second, approximation, is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third, classification of DNA barcodes, is realized by applying a hierarchical classification algorithm.
Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi Library Wavelet Neural Networks (MLWNN)
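The transformation phase can be illustrated directly: EIIP codification replaces each base with its electron-ion interaction pseudopotential value, and the Fourier power spectrum of that numeric signal becomes the feature vector passed on to the wavelet network (not shown here). A minimal sketch using the standard EIIP values:

```python
import numpy as np

# Standard EIIP values for the four nucleic bases
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def power_spectrum(sequence: str) -> np.ndarray:
    """EIIP codification -> Fourier transform -> power spectrum."""
    x = np.array([EIIP[base] for base in sequence.upper()])
    x = x - x.mean()                    # remove the DC component
    return np.abs(np.fft.rfft(x)) ** 2

barcode = "ATGCGTACGTTAGCCATTAGGC" * 8   # toy stand-in for a real barcode
features = power_spectrum(barcode)
print(features.shape, features[:3])
```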
Procedia PDF Downloads 316
1440 Evaluation of Solid-Gas Separation Efficiency in Natural Gas Cyclones
Authors: W. I. Mazyan, A. Ahmadi, M. Hoorfar
Abstract:
Objectives/Scope: This paper proposes a mathematical model for calculating the solid-gas separation efficiency in cyclones. The model provides better agreement with experimental results than existing mathematical models. Methods: The separation ratio efficiency, ϵsp, is evaluated by calculating the outlet-to-inlet particle count ratio. As in mathematical derivations in the literature, the inlet and outlet particle counts were evaluated using a Eulerian approach. The model also includes the external forces acting on the particle (i.e., centrifugal and drag forces) and evaluates the exact length that the particle travels inside the cyclone in order to determine the number of turns. The derivation of the separation efficiency using Stokes' law considers the effect of the inlet tangential velocity on separation performance; in cyclones, the inlet velocity is a very important factor in determining separation performance. The proposed model therefore provides an accurate estimate of actual cyclone separation efficiency. Results/Observations/Conclusion: The separation ratio efficiency, ϵsp, is studied to evaluate the performance of the cyclone for particles ranging from 1 micron to 10 microns. The proposed model is compared with results in the literature; it shows an error of 7% between its efficiency and the efficiency obtained from experimental results for 1-micron particles. At the same time, the model gives the user the flexibility to analyze the separation efficiency at different inlet velocities. Additive Information: The proposed model determines the separation efficiency accurately and could also be used to optimize the separation efficiency of cyclones at low cost, whether through trial-and-error testing, through dimensional changes that enhance separation, or through increasing the particle centrifugal forces. Ultimately, the proposed model provides a powerful tool to optimize and enhance existing cyclones at low cost.
Keywords: cyclone efficiency, solid-gas separation, mathematical model, models error comparison
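For orientation, the classic Lapple model gives a Stokes-regime baseline for this kind of calculation: a cut diameter d50 derived from the number of turns and inlet velocity, and an efficiency curve per particle size. It is shown here only as a reference point; the paper derives its own, more accurate model, and all parameter values below are assumptions.

```python
import math

def lapple_efficiency(dp, v_in, width=0.05, n_turns=5, mu=1.8e-5,
                      rho_p=2500.0, rho_g=1.2):
    """Lapple estimate of collection efficiency for a particle of diameter dp [m]."""
    # Cut diameter: the particle size collected with 50% efficiency (Stokes drag)
    d50 = math.sqrt(9.0 * mu * width /
                    (2.0 * math.pi * n_turns * v_in * (rho_p - rho_g)))
    return 1.0 / (1.0 + (d50 / dp) ** 2)

for d_um in (1, 2, 5, 10):
    eff = lapple_efficiency(d_um * 1e-6, v_in=15.0)
    print(f"{d_um:2d} um -> {eff:.3f}")
```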
Procedia PDF Downloads 390
1439 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components
Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich
Abstract:
This study aims to balance the number of operators (the line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, the use of hard disk drives has continuously declined, limiting the company's revenue potential, so it is important to improve and develop the production process to create market share and to be able to compete with competitors on value and quality. An effective tool is therefore needed to support such matters. In this research, the Arena program was applied to analyze the results both before and after the improvement, and the improved configuration was validated before being applied to the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average production time was 84.03 seconds per piece (timed 30 times at each work station), together with a rating assessment following the Westinghouse principles, which gave a rating of 123%; assuming a 5% allowance, the standard time was therefore 108.53 seconds per piece. The takt time, calculated as the available working time in one day divided by customer demand, was 3.66 seconds per piece, from which the proper number of operators was 30 people, meaning five operators should be removed to improve the production process. A production model was then created from the actual process using the Arena program to confirm model reliability; the simulated outputs matched those of the actual process, indicating that the model was reliable. Worker numbers and their job responsibilities were then remodeled in the Arena program. Ultimately, the efficiency of the production process increased from 70.82% to 82.63%, meeting the target.
Keywords: hard disk drive, line balancing, ECRS, simulation, arena program
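The operator calculation in the abstract is reproducible in a few lines: standard time = observed time x rating x (1 + allowance), and the required crew is the standard time divided by the takt time, rounded up.

```python
import math

observed = 84.03       # s/piece, mean of 30 timings per work station
rating = 1.23          # Westinghouse performance rating (123%)
allowance = 0.05       # assumed allowance time
takt = 3.66            # s/piece, derived from daily demand

standard_time = observed * rating * (1 + allowance)
operators = math.ceil(standard_time / takt)
print(f"standard time = {standard_time:.2f} s/piece, operators = {operators}")
# -> standard time = 108.52 s/piece (the abstract's 108.53, up to rounding),
#    operators = 30, so 35 - 30 = 5 operators can be removed
```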
Procedia PDF Downloads 224
1438 Digitally Mapping Aboriginal Journey Ways
Authors: Paul Longley Arthur
Abstract:
This paper reports on an Australian Research Council-funded project utilising the 'Time-Layered Cultural Map of Australia' (TLCMap) (https://www.tlcmap.org/), a piece of Australian digital research infrastructure [1]. This resource has been developed to help researchers create digital maps from cultural, textual, and historical data, layered with datasets registered on the platform. TLCMap is a set of online tools that allows humanities researchers to compile data with spatio-temporal coordinates: to upload, gather, analyse, and visualise them. It is the only purpose-designed, Australian-developed research tool for humanities and social science researchers to identify geographical clusters and parallel journeys by sight. This presentation discusses a series of Aboriginal mapping and visualisation experiments using TLCMap to show how Indigenous knowledge can reconfigure contemporary understandings of space, including the urbanised landscape [2, 3]. The research data being generated, investigating the historical movements of Aboriginal people, the distribution of networks, and their relation to land, lends itself to mapping and geo-spatial visualisation and analysis. TLCMap allows researchers to create layers on a 3D map which pinpoint locations with accompanying information, and this has enabled our research team to plot traditional historical journeys undertaken by Aboriginal people as well as to compile a gazetteer of Aboriginal place names, many of which have been largely undocumented until now [4]. The documented journeys intersect with and overlay many of today's urban formations, including main roads, municipal boundaries, and state borders. The paper asks how such data can be incorporated into a more culturally and ethically responsive understanding of contemporary urban spaces as well as natural environments [5].
Keywords: spatio-temporal mapping, visualisation, Indigenous knowledge, mobility and migration, research infrastructure
Procedia PDF Downloads 16
1437 Formal Institutions and Women's Electoral Participation in Four European Countries
Authors: Sophia Francesca D. Lu
Abstract:
This research sought to produce evidence that formal institutions, such as electoral and internal party quotas, can advance women's active roles in the public sphere, using the cases of four European countries: Belgium, Germany, Italy, and the Netherlands. The quantitative dataset was provided by the University of Chicago and the Inter-university Consortium for Political and Social Research, based on a two-year study (2008-2010) of political parties. Belgium employs constitutionally mandated electoral quotas, while Germany, Italy, and the Netherlands have internal party quotas, which are voluntarily adopted by political parties. Accordingly, in the chi-square and Pearson's r correlation analyses, Belgium was the only country analyzed for electoral quotas, while the internal voluntary party quotas of Germany, Italy, and the Netherlands were correlated with women's descriptive representation. Using chi-square analysis, this study showed that the presence of electoral quotas is correlated with an increase in the percentage of women in decision-making bodies. Likewise, using correlational analysis, a higher number of political parties employing internal party voluntary quotas is correlated with an increase in the percentage of women occupying seats in parliament as well as an increase in the percentage of women nominees in the electoral lists of political parties. In conclusion, gender quotas, whether electoral quotas or internal party quotas, are an effective policy tool for greater women's representation in political bodies. Political parties and governments should opt for gender quotas to address the underrepresentation of women in parliament, decision-making bodies, and policy formulation.
Keywords: electoral quota, Europe, formal institutions, institutional feminism, internal party quota, women's electoral participation
Procedia PDF Downloads 428
1436 Applications of the Morphological Variability in River Management: A Study of West Rapti River
Authors: Partha Sarathi Mondal, Srabani Sanyal
Abstract:
Different geomorphic agents produce different landform patterns, and rivers likewise have distinct and diverse landform patterns. Even within a single river course, distinct assemblages of landforms, i.e., morphological variability, are seen; these are produced by different river processes. Channel and floodplain morphology helps to interpret river processes, so morphological variability can be used as an important tool for assessing river processes, hydrological connectivity, and river health, and thereby for managing river health. The present study documents the West Rapti River, a trans-boundary river flowing through Nepal and India, from its source to its confluence with the Ghaghra River in India. The river shows significant morphological variability throughout its course. The study tries to identify the factors and processes responsible for the morphological variability of the river and the ways this variability can be applied in river management practices. For this purpose, the channel and floodplain morphology of the West Rapti River was mapped as accurately as possible, and inferences were then drawn from process-form interactions to understand the factors behind the morphological variability. The study shows that the valley setting of the West Rapti River in the Himalayan region is confined and in places partly confined, while the channel is single-thread in most parts of the Himalayan region and braided in the valley region. In the foothill region, the valley is unconfined and the channel braided; in the middle part, the channel is meandering and the valley unconfined; and in the lower part of the course, the channel has been anthropogenically altered. The morphology of the West Rapti River is therefore highly diverse, with the morphological variability produced by different geomorphic processes. For any river management, it is essential to sustain this morphological variability so that the river does not cross a geomorphic threshold and the environmental flow of the river, along with the biodiversity of the riparian region, is maintained.
Keywords: channel morphology, environmental flow, floodplain morphology, geomorphic threshold
Procedia PDF Downloads 371
1435 Laboratory-Based Monitoring of Hepatitis B Virus Vaccination Status in North Central Nigeria
Authors: Nwadioha Samuel Iheanacho, Abah Paul, Odimayo Simidele Michael
Abstract:
Background: The World Health Assembly, through the Global Health Sector Strategy on viral hepatitis, calls for the elimination of viral hepatitis as a public health threat by 2030, and all hands are on deck to actualize this goal through effective, active vaccination and monitoring tools. Aim: To combine epidemiologic with laboratory Hepatitis B virus vaccination monitoring tools. Method: Laboratory results of subjects recruited during World Hepatitis Week from July 2020 to July 2021 were analysed in the Medical Microbiology Laboratory of Benue State University Teaching Hospital, Nigeria, after obtaining the subjects' epidemiologic data on Hepatitis B virus risk factors. Result: A total of 500 subjects were recruited, comprising 60.0% males (n=300/500) and 40.0% females (n=200/500); a 53% majority was in the 26-36-year age range. Serologic profiles were as follows: 15.0% (n=75/500) HBsAg; 7.0% (n=35/500) HBeAg; 8.0% (n=40/500) Anti-HBe; 20.0% (n=100/500) Anti-HBc; and 38.0% (n=190/500) Anti-HBs. Immune responses to vaccination were as follows: 47.0% (n=235/500) immune naive {no serologic marker + normal ALT}; 33% (n=165/500) immunity by vaccination {Anti-HBs + normal ALT}; 5% (n=25/500) immunity due to previous infection {Anti-HBs, Anti-HBc, +/- Anti-HBe + normal ALT}; 8% (n=40/500) carriers {HBsAg, Anti-HBc, Anti-HBe + normal ALT}; and 7% (n=35/500) Anti-HBe serum-negative infections {HBsAg, HBeAg, Anti-HBc + elevated ALT}. Conclusion: The present 33.0% immunity-by-vaccination coverage in Central Nigeria is much lower than the 41.0% national peak in 2013 and a far cry from what is needed to meet the World Health Assembly's goal of eliminating viral hepatitis as a public health threat by 2030. Therefore, more creative ideas and collective effort are needed to attain this goal.
Keywords: Hepatitis B, vaccination status, laboratory tools, resource-limited settings
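The serologic profiles listed above amount to a small decision table; the sketch below encodes them as rules. It mirrors only the five profiles named in the abstract, so any other marker combination is flagged for manual review.

```python
def hbv_status(hbsag, hbeag, anti_hbe, anti_hbc, anti_hbs, alt_elevated):
    """Classify a subject using the marker profiles listed in the abstract."""
    if hbsag and hbeag and anti_hbc and alt_elevated:
        return "anti-HBe serum-negative infection"
    if hbsag and anti_hbc and anti_hbe and not alt_elevated:
        return "carrier"
    if anti_hbs and anti_hbc and not hbsag and not alt_elevated:
        return "immunity due to previous infection"
    if anti_hbs and not anti_hbc and not hbsag and not alt_elevated:
        return "immunity by vaccination"
    if not any([hbsag, hbeag, anti_hbe, anti_hbc, anti_hbs]) and not alt_elevated:
        return "immune naive"
    return "indeterminate -- review manually"

print(hbv_status(hbsag=False, hbeag=False, anti_hbe=False,
                 anti_hbc=False, anti_hbs=True, alt_elevated=False))
# -> immunity by vaccination
```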
Procedia PDF Downloads 72
1434 Chikungunya Virus Detection Utilizing an Origami Based Electrochemical Paper Analytical Device
Authors: Pradakshina Sharma, Jagriti Narang
Abstract:
Due to their critical significance in the early identification of infectious diseases, electrochemical sensors have garnered considerable interest. Here, we develop a detection platform for the chikungunya virus (CHIKV) by rationally exploiting the extremely high charge-transfer efficiency of a ternary nanocomposite of graphene oxide, silver, and gold (G/Ag/Au). Because paper is an inexpensive substrate that can be produced in large quantities, the use of an origami electrochemical paper analytical device (ePAD) further enhances the sensor's appealing qualities; paper-based testing provides a cost-effective platform for point-of-care diagnostics. These sensors can be regarded as eco-designed analytical tools due to their efficient production, use of an eco-friendly substrate, and potential to simplify waste management by incinerating the sensor after measurement. In this work, the paper's foldability has been used to develop 3D multifaceted biosensors that can specifically detect CHIKV. X-ray diffraction, scanning electron microscopy, UV-vis spectroscopy, and transmission electron microscopy (TEM) were used to characterize the produced nanoparticles. Aptamers are used here since they are considered a unique and sensitive tool for rapid diagnostic methods. The analytical response of the biosensor was measured using cyclic voltammetry (CV) and linear sweep voltammetry (LSV), both validated with a potentiostat. Using the aptamer-modified electrode as a signal modulation platform, the target CHIKV antigen was hybridized, and its presence was determined by a decline in the current produced by its interaction with an anionic mediator, Methylene Blue (MB). A detection limit of 1 ng/ml and a broad linear range of 1 ng/ml to 10 µg/ml for the CHIKV antigen were reported.
Keywords: biosensors, ePAD, arboviral infections, point of care
Procedia PDF Downloads 92
1433 Analysis of a IncResU-Net Model for R-Peak Detection in ECG Signals
Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar
Abstract:
Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias, or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in the electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology: a non-invasive, pain-free procedure that measures the heart's electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG's form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged, which can further complicate visual diagnosis and substantially delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis; they facilitate the computation of large sets of data and can provide early and precise diagnoses. The cardiology field is therefore one of the areas that can most benefit from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is to detect R-peaks in ECG signals, and its performance is further evaluated on ECG signals with different origins and features to test the model's ability to generalize. Performance of the model for detection of R-peaks in clean and noisy ECGs is presented: the model is able to detect R-peaks in the presence of various types of noise and when presented with data it has not been trained on. It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences in the normal cardiac activity of their patients.
Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks
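Detector accuracy in this setting is conventionally reported by matching each detected R-peak to an annotated one within a small tolerance window and computing sensitivity and positive predictive value. The study's exact matching protocol is not stated, so the tolerance and sampling rate below are assumptions.

```python
def rpeak_metrics(detected, annotated, fs=360, tol_s=0.05):
    """Sensitivity and PPV with a +-tol_s matching window (sample indices)."""
    window = int(tol_s * fs)
    unmatched = sorted(annotated)
    tp = 0
    for d in sorted(detected):
        hit = next((a for a in unmatched if abs(a - d) <= window), None)
        if hit is not None:
            tp += 1
            unmatched.remove(hit)   # each annotation may be matched once
    fp = len(detected) - tp
    fn = len(unmatched)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    ppv = tp / (tp + fp) if (tp + fp) else 0.0
    return sensitivity, ppv

print(rpeak_metrics(detected=[100, 460, 830], annotated=[102, 455, 820, 1180]))
# -> (0.75, 1.0): three of four annotated beats found, no false alarms
```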
Procedia PDF Downloads 185
1432 Support for Reporting Guidelines in Surgical Journals Needs Improvement: A Systematic Review
Authors: Riaz A. Agha, Ishani Barai, Shivanchan Rajmohan, Seon Lee, Mohammed O. Anwar, Alex J. Fowler, Dennis P. Orgill, Douglas G. Altman
Abstract:
Introduction: Medical knowledge is growing fast. Evidence-based medicine works best if the evidence is reported well. Past studies have shown reporting quality to be lacking in the field of surgery. Reporting guidelines are an important tool for authors to optimize the reporting of their research. The objective of this study was to analyse the frequency and strength of recommendation of such reporting guidelines within surgical journals. Methods: A systematic review of the 198 journals within the Journal Citation Report 2014 (surgery category) published by Thomson Reuters was undertaken. The online guide for authors of each journal was screened by two independent groups, and the results were compared. Data regarding the presence and strength of the recommendation to use reporting guidelines were extracted. Results: 193 journals were included (five appeared twice, having changed their names). These had a median impact factor of 1.526 (range 0.047-8.327) and a median of 145 articles published per journal (range 29-659), with 34,036 articles published in total over the two-year window 2012-2013. The majority (62%) of surgical journals made no mention of reporting guidelines within their guidelines for authors. Of the journals that did mention them (38%), only 14% (10/73) required the use of all relevant reporting guidelines. The most frequently mentioned reporting guideline was CONSORT (46 journals). Conclusion: The mention of reporting guidelines within the guide for authors of surgical journals needs improvement. Authors, reviewers, and editors should work to ensure that research is reported in line with the relevant reporting guidelines. Journals should consider hard-wiring adherence to them. This will allow peer reviewers to focus on what is present, not what is missing, raising the level of scholarly discourse between authors and the scientific community and reducing frustration amongst readers. Keywords: CONSORT, guide for authors, PRISMA, reporting guidelines, journal impact factor, citation analysis
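The headline percentages follow directly from the extracted counts (73 of 193 journals mentioning reporting guidelines, and 10 of those 73 requiring them). The Python sketch below reproduces that arithmetic and, as an optional addition that is not part of the study, attaches a 95% Wilson score interval to the "mentioned" proportion.

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

n_journals = 193
mentioned = 73    # guides for authors mentioning any reporting guideline
required_all = 10 # subset requiring all relevant guidelines

print(f"no mention: {(n_journals - mentioned) / n_journals:.0%}")            # ~62%
print(f"mentioned:  {mentioned / n_journals:.0%}")                           # ~38%
print(f"required (of mentioning journals): {required_all / mentioned:.0%}")  # ~14%
print("95% CI for 'mentioned':",
      tuple(round(x, 3) for x in wilson_ci(mentioned, n_journals)))
```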
Procedia PDF Downloads 464
1431 Semi-Automatic Segmentation of Mitochondria on Transmission Electron Microscopy Images Using Live-Wire and Surface Dragging Methods
Authors: Mahdieh Farzin Asanjan, Erkan Unal Mumcuoglu
Abstract:
Mitochondria are cytoplasmic organelles of the cell that play a significant role in a variety of cellular metabolic functions. Mitochondria act as the power plants of the cell and are surrounded by two membranes. Significant morphological alterations are often due to changes in mitochondrial function. Electron microscope tomography is a powerful technique for studying the three-dimensional (3D) structure of mitochondria and its alterations in disease states. Detecting mitochondria in electron microscopy images is a challenging problem due to the presence of various subcellular structures and imaging artifacts; a further challenge is that each image typically contains more than one mitochondrion. Manual segmentation of mitochondria is tedious and time-consuming, and it also requires specialist knowledge of mitochondria. Fully automatic segmentation methods lead to over-segmentation, so mitochondria are not segmented properly. Therefore, semi-automatic segmentation methods requiring minimal manual effort are needed to edit the results of fully automatic methods. Here, two editing tools were implemented: spline-based surface dragging and interactive live-wire segmentation. These editing tools were applied separately to the results of fully automatic segmentation, and their 3D extensions were also studied and tested. The 2D and 3D Dice coefficients for spline-based surface dragging were 0.93 and 0.92; for the live-wire method, they were 0.94 and 0.91, respectively. The root mean square symmetric surface distance was 0.69 (2D) and 0.93 (3D) for surface dragging, and 0.60 (2D) and 2.11 (3D) for the live-wire tool. Compared with the fully automatic segmentation method, both editing tools produced results closer to the ground truth, although the time required was higher than that of manual segmentation. Keywords: medical image segmentation, semi-automatic methods, transmission electron microscopy, surface dragging using splines, live-wire
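The Dice coefficient used to score both editing tools measures volume overlap between a segmentation and the ground truth. A minimal Python implementation, applicable unchanged to 2D slices or 3D stacks, is sketched below with a toy example; the masks shown are illustrative, not data from this study.

```python
import numpy as np

def dice_coefficient(seg: np.ndarray, gt: np.ndarray) -> float:
    """Dice overlap between binary segmentation and ground truth:
    2|A ∩ B| / (|A| + |B|). Works for 2D and 3D arrays alike."""
    seg = seg.astype(bool)
    gt = gt.astype(bool)
    denom = seg.sum() + gt.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(seg, gt).sum() / denom

# toy 2D example: two overlapping square "mitochondrion" masks
a = np.zeros((64, 64), dtype=bool); a[10:40, 10:40] = True
b = np.zeros((64, 64), dtype=bool); b[15:45, 15:45] = True
print(f"Dice = {dice_coefficient(a, b):.3f}")  # -> 0.694
```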
Procedia PDF Downloads 168
1430 Design and Analysis of Deep Excavations
Authors: Barham J. Nareeman, Ilham I. Mohammed
Abstract:
Excavations in developed urban areas are generally supported by deep excavation walls such as diaphragm walls, bored piles, soldier piles, and sheet piles. In some cases, these walls may be braced by internal braces or tie-back anchors. Tie-back anchors are by far the predominant method of wall support; the large working space that a tie-back anchor system provides inside the excavation is a significant construction advantage. This paper analyzes a deep excavation bracing system consisting of a contiguous pile wall braced by pre-stressed tie-back anchors, which is part of a large residential building project located in Gaziantep province, Turkey. The contiguous pile wall will be constructed with a length of 270 m and consists of 285 piles, each with a diameter of 80 cm and a center-to-center spacing of 95 cm. The deformation analysis was carried out with the finite element package PLAXIS. In the analysis, beam elements together with an elastic-perfectly-plastic soil model and the Hardening Soil model were used to represent the contiguous pile wall, the tie-back anchor system, and the soil. The two soil clusters, limestone and fill, were modelled with both the Hardening Soil and Mohr-Coulomb models. According to the basic design, both soil clusters are modelled under drained conditions. The simulation results show that the maximum horizontal movement of the walls and the maximum settlement of the ground are consistent with 300 individual case histories, ranging between 1.2 mm and 2.3 mm for the walls and between 6.5 mm and 15 mm for the settlements. It was concluded that a tied-back contiguous pile wall can be satisfactorily modelled using the Hardening Soil model. Keywords: deep excavation, finite element, pre-stressed tie back anchors, contiguous pile wall, PLAXIS, horizontal deflection, ground settlement
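As a quick back-of-the-envelope check of the stated geometry (separate from the PLAXIS analysis itself), the sketch below confirms that 285 piles at a 95 cm center-to-center spacing give roughly the quoted 270 m wall length, and that 80 cm piles at that spacing leave a 15 cm clear soil gap between adjacent piles.

```python
# Geometry check for the contiguous pile wall described above.
n_piles = 285
spacing_m = 0.95    # center-to-center spacing
diameter_m = 0.80   # pile diameter

wall_length = (n_piles - 1) * spacing_m  # axis distance, first to last pile
clear_gap = spacing_m - diameter_m       # soil gap between adjacent piles

print(f"wall length ≈ {wall_length:.1f} m")          # ≈ 269.8 m, matching ~270 m
print(f"clear gap between piles = {clear_gap*100:.0f} cm")  # 15 cm
```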
Procedia PDF Downloads 253
1429 The Survey of Sexual Health and Pornography among Divorce-Asking Women in West Azerbaijan-Iran: A Cross-Sectional Study
Authors: Soheila Rabiepoor, Elham Sadeghi
Abstract:
Introduction: Divorce is both a personal and a social issue. Nowadays, due to factors such as rapid social, economic, and cultural change, the family structure has undergone profound changes, and out of every three marriages, two lead to divorce. One of the factors affecting the incidence of divorce and relationship problems between couples is sexual and marital behavior. There are several reasons to suspect that pornography might affect divorce in either a positive or a negative way. This study therefore evaluated the sexual health of divorce-asking women in Urmia, Iran. Methods: This cross-sectional descriptive study was conducted on 71 married women in Urmia, Iran, in 2016. Participants were divorce applicants (referred to a divorce center) selected by convenience sampling. The data-gathering tools included scales measuring demographic characteristics and sexual health (sexual satisfaction and function), as well as researcher-made questions on pornography. Data were analyzed with SPSS 16; p-values less than 0.05 were considered significant. Results: The mean age of participants was 28.98 ± 7.44 years, and the mean marriage duration was 8.12 ± 6.53 years (range 1-28 years). The most common level of education was a diploma (45.1%). 69% of the women reported their income and expenditure as equal. Nearly 42% of the women and 59% of their partners had watched pornographic clips, and 45.5% of participants reported comparing their own sexual relationship with such clips. The total sexual satisfaction score was 51.50 ± 17.92, and the mean total sexual function score was 16.62 ± 10.58. According to these findings, most of the women experienced sexual dissatisfaction and dysfunction. Conclusions: The results indicated that women with low sexual satisfaction scores had a higher rate of watching pornographic clips. Based on the current study, greater attention to family education and counseling programs, especially in the sexual field, would be fruitful. Keywords: divorce-asking, pornography, sexual satisfaction, sexual function, women
Procedia PDF Downloads 583