Search results for: western approach
10796 Videoconference Technology: An Attractive Vehicle for Challenging and Changing Tutors Practice in Open and Distance Learning Environment
Authors: Ramorola Mmankoko Ziphorah
Abstract:
Videoconference technology represents a recent experiment in technology integration into teaching and learning in South Africa. Increasingly, videoconference technology is used as a substitute for traditional face-to-face approaches to teaching and learning, helping tutors to reshape and change their teaching practices. Interestingly, though, some studies point out that videoconference technology is commonly used for knowledge dissemination by tutors and not so much for the actual teaching of course content in the Open and Distance Learning context. Though videoconference technology has become one of the dominant technologies available among Open and Distance Learning institutions, it is not clear that it has been used effectively to bridge the learning distance in time, geography, and economy. While tutors are prepared theoretically, in most tutor preparation programs, on the use of videoconference technology, there are still no practical guidelines on how they should go about integrating this technology into their course teaching. Therefore, there is an urgent need to focus on tutor development, specifically on tutors' capacities and skills to use videoconference technology. The assumption is that if tutors become competent in the use of videoconference technology for course teaching, then its use in the Open and Distance Learning environment will become more commonplace. This is the imperative of the Fourth Industrial Revolution (4IR) for education generally. Against the current vacuum in the practice of using videoconference technology for course teaching, the current study proposes a qualitative phenomenological approach to investigate the efficacy of videoconferencing as an approach to student learning. Using interview and observation data from ten participants in an Open and Distance Learning institution, the author discusses how dialogue and structure interacted to provide the participating tutors with a rich set of opportunities to deliver course content. The findings of this study highlight various challenges experienced by tutors when using videoconference technology. The study suggests tutor development programs on capacity and skills and on how to integrate this technology with various teaching strategies in order to enhance student learning. The author argues that it is not merely the existence of the structure, namely the videoconference technology, that provides the opportunity for effective teaching, but rather the interactions, namely the dialogue among tutors and learners, that make videoconference technology an attractive vehicle for challenging and changing tutors' practice.
Keywords: open distance learning, transactional distance, tutor, videoconference
Procedia PDF Downloads 131
10795 Smart Container Farming: Innovative Urban Strawberry Farming Model from Japan to the World
Authors: Nishantha Giguruwa
Abstract:
This research investigates the transformative potential of smart container farming, building upon the successful cultivation of Japanese mushrooms at Sakai Farms in Aichi Prefecture, Japan, under the strategic collaboration with the Daikei Group. Inspired by this success, the study focuses on establishing an advanced urban strawberry farming laboratory with the aim of understanding strawberry farming technologies, fostering collaboration, and strategizing marketing approaches for both local and global markets. Positioned within the business framework of Sakai Farms and the Daikei Group, the study underscores the sustainability and forward-looking solutions offered by smart container farming in agriculture. The global significance of strawberries is emphasized, acknowledging their economic and cultural importance. The detailed examination of strawberry farming intricacies informs the technological framework developed for smart containers, implemented at Sakai Farms. Integral to this research is the incorporation of controlled bee pollination, a groundbreaking addition to the smart container farming model. The study anticipates future trends, outlining avenues for continuing exploration, stakeholder collaborations, policy considerations, and expansion strategies. Notably, the author expresses a strategic intent to approach the global market, leveraging the foreign student/faculty base at Ritsumeikan Asia Pacific University, where the author is affiliated. This unique approach aims to disseminate the research findings globally, contributing to the broader landscape of agricultural innovation. The integration of controlled bee pollination within this innovative framework not only enhances sustainability but also marks a significant stride in the evolution of urban agriculture, aligning with global agricultural trends.Keywords: smart container farming, urban agriculture, strawberry farming technologies, controlled bee pollination, agricultural innovation
Procedia PDF Downloads 60
10794 A Systems Approach to Targeting Cyclooxygenase: Genomics, Bioinformatics and Metabolomics Analysis of COX-1 -/- and COX-2-/- Lung Fibroblasts Providing Indication of Sterile Inflammation
Authors: Abul B. M. M. K. Islam, Mandar Dave, Roderick V. Jensen, Ashok R. Amin
Abstract:
A systems approach was applied to characterize differentially expressed transcripts, bioinformatics pathways, proteins, and prostaglandins (PGs) from lung fibroblasts procured from wild-type (WT), COX-1-/- and COX-2-/- mice in order to understand system-level control mechanisms. Bioinformatics analysis of COX-2 and COX-1 ablated cells revealed COX-1 and COX-2 specific signatures, respectively, which significantly overlapped with an 'IL-1β induced inflammatory signature'. This defined novel cross-talk signals that orchestrated coordinated activation of pathways of sterile inflammation sensed by cellular stress. The overlapping signals showed significant over-representation of shared pathways for interferon γ and immune responses, T cell functions, NOD, and toll-like receptor signaling. Gene Ontology Biological Process (GOBP) and pathway enrichment analysis specifically showed an increase in mRNA expression associated with: (a) organ development and homeostasis in COX-1-/- cells and (b) oxidative stress and response, spliceosome and proteasome activity, and mTOR and p53 signaling in COX-2-/- cells. COX-1 and COX-2 ablated cells showed signs of functional pathways committed to cell cycle and DNA replication at the genomics level. As compared to WT, metabolomics analysis revealed a significant increase in COX-1 mRNA and in the synthesis of basal levels of eicosanoids (PGE2, PGD2, TXB2, LTB4, PGF1α, and PGF2α) in COX-2 ablated cells, and an increase in the synthesis of PGE2 and PGF1α in COX-1 null cells. There was a compensation of PGE2 and PGF1α in COX-1-/- and COX-2-/- cells. Collectively, these results support a broader, differential, and collaborative regulation of both COX-1 and COX-2 pathways at the metabolic, signaling, and genomics levels in cellular homeostasis and in sterile inflammation induced by cellular stress.
Keywords: cyclooxygenases, inflammation, lung fibroblasts, systemic
Procedia PDF Downloads 296
10793 Edge Enhancement Visual Methodology for Fat Amount and Distribution Assessment in Dry-Cured Ham Slices
Authors: Silvia Grassi, Stefano Schiavon, Ernestina Casiraghi, Cristina Alamprese
Abstract:
Dry-cured ham is an uncooked meat product particularly appreciated for its peculiar sensory traits, among which the lipid component plays a key role in defining quality and, consequently, consumers’ acceptability. Usually, fat content and distribution are chemically determined by expensive, time-consuming, and destructive analyses. Moreover, different sensory techniques are applied to assess product conformity to desired standards. In this context, visual systems are getting a foothold in the meat market, promising more reliable and time-saving assessment of food quality traits. The present work aims at developing a simple but systematic and objective visual methodology to assess the fat amount of dry-cured ham slices, in terms of total, intermuscular, and intramuscular fractions. To this aim, 160 slices from 80 PDO dry-cured hams were evaluated by digital image analysis and Soxhlet extraction. RGB images were captured by a flatbed scanner, converted into grey-scale images, and segmented based on intensity histograms as well as on a multi-stage algorithm aimed at edge enhancement. The latter was performed applying the Canny algorithm, which consists of image noise reduction, calculation of the intensity gradient for each image, spurious response removal, thresholding on corrected images, and confirmation of strong edge boundaries. The approach allowed for the automatic calculation of total, intermuscular, and intramuscular fat fractions as percentages of the total slice area. Linear regression models were run to estimate the relationships between the image analysis results and the chemical data, thus allowing for the prediction of the total, intermuscular, and intramuscular fat content from the dry-cured ham images. The goodness of fit of the obtained models was confirmed in terms of coefficient of determination (R²), hypothesis testing, and pattern of residuals. Good regression models were obtained, with R² values of 0.73, 0.82, and 0.73 for the total fat, the sum of intermuscular and intramuscular fat, and the intermuscular fraction, respectively. In conclusion, the edge enhancement visual procedure resulted in good fat segmentation, making this visual approach to the quantification of the different fat fractions in dry-cured ham slices simple, accurate, and precise. The presented image analysis approach steers towards the development of instruments that can overcome destructive, tedious, and time-consuming chemical determinations. As future perspectives, the results of the proposed image analysis methodology will be compared with those of sensory tests in order to develop a fast grading method for dry-cured hams based on fat distribution. Therefore, the system will be able not only to predict the actual fat content but also to reflect the visual appearance of samples as perceived by consumers.
Keywords: dry-cured ham, edge detection algorithm, fat content, image analysis
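To make the segmentation pipeline concrete, the sketch below shows a minimal, hypothetical version of the Canny-based workflow described above (grey-scale conversion, edge enhancement, hole filling, and a fat-area fraction). The use of scikit-image and all threshold values are assumptions for illustration, not the authors' implementation or parameters.

```python
# Illustrative sketch (not the authors' code): Canny-based segmentation of a
# grey-scale ham-slice image and computation of a fat-area fraction.
from skimage import io, color, feature
from scipy import ndimage as ndi

def fat_fraction(image_path, sigma=2.0):
    rgb = io.imread(image_path)
    grey = color.rgb2gray(rgb)                    # RGB scan -> grey scale
    edges = feature.canny(grey, sigma=sigma)      # Canny edge enhancement
    filled = ndi.binary_fill_holes(edges)         # close strong edge boundaries
    slice_mask = grey < 0.98                      # exclude white scanner background (assumed threshold)
    fat_mask = filled & (grey > 0.6) & slice_mask # bright in-slice regions taken as fat (assumed threshold)
    return 100.0 * fat_mask.sum() / slice_mask.sum()

# Example usage: print(fat_fraction("slice_001.png"))
```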
Procedia PDF Downloads 182
10792 Planning and Strategies for Risks Prevention, Mitigating, and Recovery of Ancient Theatres Heritage: Investigation and Recommendations
Authors: Naif A. Haddad
Abstract:
Greek, Hellenistic, and Roman theatre heritage is exposed to multiple risks, at varied times or simultaneously. There is no single reason why a theatre building becomes 'at risk', as each case has different circumstances which have led to the building's decay. The processes of destruction and distress are complicated and diverge with the decay of different theatre building materials. The modern use of theatres for cultural performances causes many of the risks to the physical structure and authenticity of theatre sites. In addition, there are deteriorations and deformations due to previous poor-quality restorations and interventions through related excavation and conservation programmes, as well as risks to authenticity due to new additions. For preventive conservation, the management of natural and anthropogenic risks to theatres can provide a framework for decision making. These risks to ancient theatre heritage may stem from exposure to a single factor or from the synergy of many. We therefore need to link the natural risks to theatres with the risks that come from anthropogenic factors associated with social and economic development. However, this requires a holistic approach and a systematic methodology for understanding these risks from various sources while incorporating specific actions, planning, and strategies for each specific risk. Elaborating on recent relevant studies, on the ERATO and ATHENA EU projects for ancient theatres and odea, and on general surveys, this paper attempts to discuss the main risk-related issues of ancient Greek, Hellenistic, and Roman theatres. Relevant case studies are also discussed and investigated to examine frameworks for risk mitigation, and related guidelines and recommendations that provide a systematic approach for sustainable management and planning, mainly in relation to the 'compatible use' of theatre sites.
Keywords: cultural heritage management, European ancient theatres projects, anthropogenic risks mitigation, sustainable management and planning, preventive conservation, modern use, compatible use
Procedia PDF Downloads 301
10791 Numerical Analysis of Gas-Particle Mixtures through Pipelines
Authors: G. Judakova, M. Bause
Abstract:
The ability to model and simulate numerically natural gas flow in pipelines has become of high importance for the design of pipeline systems. The understanding of the formation of hydrate particles and their dynamical behavior is of particular interest, since these processes govern the operation properties of the systems and are responsible for system failures by clogging of the pipelines under certain conditions. Mathematically, natural gas flow can be described by multiphase flow models. Using the two-fluid modeling approach, the gas phase is modeled by the compressible Euler equations and the particle phase is modeled by the pressureless Euler equations. The numerical simulation of compressible multiphase flows is an important research topic. It is well known that for nonlinear fluxes, even for smooth initial data, discontinuities in the solution are likely to occur in finite time. They are called shock waves or contact discontinuities. For hyperbolic and singularly perturbed parabolic equations the standard application of the Galerkin finite element method (FEM) leads to spurious oscillations (e.g. Gibb's phenomenon). In our approach, we use stabilized FEM, the streamline upwind Petrov-Galerkin (SUPG) method, where artificial diffusion acting only in the direction of the streamlines and using a special treatment of the boundary conditions in inviscid convective terms, is added. Numerical experiments show that the numerical solution obtained and stabilized by SUPG captures discontinuities or steep gradients of the exact solution in layers. However, within this layer the approximate solution may still exhibit overshoots or undershoots. To suitably reduce these artifacts we add a discontinuity capturing or shock capturing term. The performance properties of our numerical scheme are illustrated for two-phase flow problem.Keywords: two-phase flow, gas-particle mixture, inviscid two-fluid model, euler equation, finite element method, streamline upwind petrov-galerkin, shock capturing
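For readers unfamiliar with the two-fluid formulation mentioned above, a generic one-dimensional form is sketched below; the notation and the drag-coupling terms on the right-hand side are assumptions for illustration and not necessarily the exact system used by the authors.

```latex
% Gas phase: compressible Euler; particle phase: pressureless Euler.
% D denotes an assumed interphase drag coefficient.
\begin{align}
  \partial_t \rho_g + \partial_x (\rho_g u_g) &= 0, \\
  \partial_t (\rho_g u_g) + \partial_x \left(\rho_g u_g^2 + p\right) &= -D\,(u_g - u_p), \\
  \partial_t E_g + \partial_x \big(u_g (E_g + p)\big) &= -D\,u_p\,(u_g - u_p), \\
  \partial_t \rho_p + \partial_x (\rho_p u_p) &= 0, \\
  \partial_t (\rho_p u_p) + \partial_x \left(\rho_p u_p^2\right) &= D\,(u_g - u_p).
\end{align}
```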
Procedia PDF Downloads 315
10790 Historical Memory and Social Representation of Violence in Latin American Cinema: A Cultural Criminology Approach
Authors: Maylen Villamanan Alba
Abstract:
Latin America is marked by its history: conquest, colonialism, and slavery left deep footprints in most Latin American countries. The past century has also been affected by wars, military dictatorships, and political violence, which profoundly influenced Latin American popular culture. Consequently, reminiscences of historical crimes are frequently present in daily life, media, public opinion, and the arts. This legacy is remembered in novels, paintings, songs, and films. In fact, Latin American cinema has a trend of verisimilitude with reality in fiction films. These films about historical violence are narrated through fictional characters, but their stories are based on real historical contexts. Therefore, cultural criminology has considered film a significant field for understanding social representations of violence related to historical crimes. The aim of the present contribution is to analyze the legacy of the past and historical memory in social representations of violence in Latin American cinema, as a critical approach to historical crimes. This qualitative research is based on content analysis. The sample consists of seven multi-award-winning films of the International Festival of New Latin American Cinema of Havana: Kamchatka, Argentina (2002); Carandiru, Brazil (2003); Enlightened by fire, Argentina (2005); Post-mortem, Chile (2010); No, Chile (2012); Wakolda, Argentina (2013); and The Clan, Argentina (2015). Cultural criminology highlights that cinema shapes the meanings of social practices such as historical crimes. Critical criminology offers a critical theory framework for interpreting Latin American cinema. This analysis reveals historical conditions deeply associated with power relationships, policy, and inequality issues. As indicated by this theory, violence is characterized as a structural process based on social asymmetries. These social asymmetries cut across social scopes, including institutional and personal dimensions. Thus, state institutions are depicted through the personal stories of characters involved in human conflicts. Intimacy and social background are linked in personages who simultaneously perform roles such as soldiers, policemen, professionals, or inmates and are at the same time depicted as human beings with family, gender, racial, ideological, or generational issues. Social representations of violence related to the legacy of the past portray historical crimes perpetrated against Latin American citizens. Thereby, they have contributed to political positions, social behaviors, and public opinion. The legacy of these historical crimes suggests a path that should never be taken again; it is a reminder, a warning, and a historic lesson for Latin American people. Social representations of violence are permeated by historical memory as denunciation under a critical approach.
Keywords: Latin American cinema, historical memory, social representation, violence
Procedia PDF Downloads 154
10789 The Essence and Attribution of Intellectual Property Rights Generated in the Digitization of Intangible Cultural Heritage
Authors: Jiarong Zhang
Abstract:
Digitizing intangible cultural heritage is a complex and comprehensive process from which various kinds of intellectual property rights may be generated. Digitizing may be a repackaging of cultural heritage, which creates copyrights; recording folk songs and indigenous performances can create 'related rights'. At the same time, digitizing intangible cultural heritage may unintentionally infringe the intellectual property rights of others. Recording the religious rituals of indigenous communities without authorization can violate the moral rights of the ceremony participants of the community; making digital copies of rock paintings may infringe the right of reproduction. In addition, several parties are involved in the digitization process: indigenous peoples, museums, and archives can be holders of cultural heritage; companies and research institutions can be technology providers; internet platforms can be promoters and sellers; the public and the groups above can be beneficiaries. When diverse intellectual property rights meet various parties, problems and disputes can arise easily. What are the types of intellectual property rights generated in the digitization process? What is the essence of these rights? Who should these rights belong to? How can intellectual property be used to protect the digitization of cultural heritage? How can infringement of the intellectual property rights of others be avoided? While digitization has been regarded as an effective approach to preserving intangible cultural heritage, the related intellectual property issues have not received full attention and discussion. Thus, parties involved in the digitization process may face intellectual property infringement lawsuits. The article explores these problems from the intersecting perspectives of intellectual property law and cultural heritage. Taking a comparative approach, the paper analyses related legal documents and cases and sheds some light on the questions listed above. The findings show that, although there are no intellectual property laws targeting cultural heritage in most countries, the involved stakeholders can seek protection from existing intellectual property rights by following the suggestions of the article. The research will contribute to the digitization of intangible cultural heritage from a legal and policy perspective.
Keywords: copyright, digitization, intangible cultural heritage, intellectual property, Internet platforms
Procedia PDF Downloads 154
10788 Enhancing Residential Architecture through Generative Design: Balancing Aesthetics, Legal Constraints, and Environmental Considerations
Authors: Radul Shishkov
Abstract:
This research paper presents an in-depth exploration of the use of generative design in urban residential architecture, with a dual focus on aligning aesthetic values with legal and environmental constraints. The study aims to demonstrate how generative design methodologies can innovate residential building designs that are not only legally compliant and environmentally conscious but also aesthetically compelling. At the core of our research is a specially developed generative design framework tailored for urban residential settings. This framework employs computational algorithms to produce diverse design solutions, meticulously balancing aesthetic appeal with practical considerations. By integrating site-specific features, urban legal restrictions, and environmental factors, our approach generates designs that resonate with the unique character of urban landscapes while adhering to regulatory frameworks. The paper explores how modern digital tools, particularly computational design and algorithmic modelling, can optimize the early stages of residential building design. By creating a basic parametric model of a residential district, the paper investigates how automated design tools can explore multiple design variants based on predefined parameters (e.g., building cost, dimensions, orientation) and constraints. The paper aims to demonstrate how these tools can rapidly generate and refine architectural solutions that meet the required criteria for quality of life, cost efficiency, and functionality. The study utilizes computational design for database processing and algorithmic modelling within the fields of applied geodesy and architecture. It focuses on optimizing the forms of residential development by adjusting specific parameters and constraints. The results of multiple iterations are analyzed, refined, and selected based on their alignment with predefined quality and cost criteria. The findings of this research will contribute to a modern, complex approach to residential area design. The paper demonstrates the potential for integrating BIM models into the design process and their application in virtual 3D Geographic Information Systems (GIS) environments. The study also examines the transformation of BIM models into suitable 3D GIS file formats, such as CityGML, to facilitate the visualization and evaluation of urban planning solutions. In conclusion, our research demonstrates that a generative parametric approach based on real geodesic data and collaborative decision-making could be introduced in the early phases of the design process. This gives the designers powerful tools to explore diverse design possibilities, significantly improving the qualities of the investment during its entire lifecycle.Keywords: algorithmic modeling, architectural design, residential buildings, urban development, generative design, parametric models
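As a loose illustration of the parametric exploration described above (not the authors' framework), the sketch below enumerates building variants from a few assumed parameters, discards those violating an assumed footprint constraint, and ranks the rest by a crude daylight-per-cost score. All parameter names, ranges, and the cost model are hypothetical.

```python
# Hypothetical parametric variant generation and ranking sketch.
import itertools

SITE_AREA = 5_000            # m2, assumed plot size
MAX_FOOTPRINT_RATIO = 0.4    # assumed legal footprint constraint
COST_PER_M3 = 350            # assumed unit construction cost

def evaluate(width, depth, floors, orientation):
    footprint = width * depth
    if footprint > MAX_FOOTPRINT_RATIO * SITE_AREA:
        return None                                   # violates footprint constraint
    volume = footprint * floors * 3.0                 # 3 m storey height assumed
    cost = volume * COST_PER_M3
    daylight = 1.0 - abs(orientation - 180) / 180     # crude south-facing score
    return {"width": width, "depth": depth, "floors": floors,
            "orientation": orientation, "cost": cost, "daylight": daylight}

variants = [v for v in (evaluate(w, d, f, o)
            for w, d, f, o in itertools.product(
                range(20, 61, 10), range(15, 46, 10), range(3, 9), range(0, 360, 45)))
            if v is not None]

# Rank feasible variants by daylight score per unit cost and keep the top five.
best = sorted(variants, key=lambda v: v["daylight"] / v["cost"], reverse=True)[:5]
for v in best:
    print(v)
```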
Procedia PDF Downloads 20
10787 Management and Genetic Characterization of Local Sheep Breeds for Better Productive and Adaptive Traits
Authors: Sonia Bedhiaf-Romdhani
Abstract:
The sheep (Ovis aries) was domesticated approximately 11,000 years ago in the Fertile Crescent from the Asian mouflon (Ovis orientalis). The Northern African (NA) sheep is 7,000 years old and represents a remarkable diversity of sheep populations reared under traditional and low-input farming systems (LIFS) over millennia. The majority of small ruminants in developing countries are found in low-input production systems, and the resilience of local communities in rural areas is often linked to the wellbeing of small ruminants. Regardless of the rich biodiversity found among sheep ecotypes, there are four main sheep breeds in the country, with the Barbarine (a fat-tailed breed) and the Queue Fine de l'Ouest (a thin-tailed breed) accounting for 61.6 and 35.4 percent, respectively. The Phoenicians introduced the Barbarine sheep from the steppes of Central Asia in the Carthaginian period, 3,000 years ago. The Queue Fine de l'Ouest is a thin-tailed meat breed heavily concentrated in the western and central semi-arid regions. The Noire de Thibar, a composite black-coated breed of mutton- and fine-wool-producing animals found in the northern sub-humid region, has been on the verge of extinction because of its higher nutritional requirements and intolerance of the prevailing harsher conditions. The D'Man breed, originating from Morocco, is mainly located in the southern oases of the extremely arid ecosystem. A genetic investigation of Tunisian sheep breeds using a genome-wide scan of approximately 50,000 SNPs was performed. Genetic analysis of the relationships between breeds highlighted the genetic differentiation of the Noire de Thibar breed from the other local breeds, reflecting the effect of past introgression of the European gene pool. The Queue Fine de l'Ouest breed showed genetic heterogeneity and was close to the Barbarine. The D'Man breed shared considerable gene flow with the thin-tailed Queue Fine de l'Ouest breed. Native small ruminant breeds are capable of being efficiently productive if the essential ingredients and coherent breeding schemes are implemented and followed. Assessing the status of genetic variability of native sheep breeds could provide important clues for researchers and policy makers to devise better strategies for the conservation and management of genetic resources.
Keywords: sheep, farming systems, diversity, SNPs
Procedia PDF Downloads 149
10786 Oat Beta Glucan Attenuates the Development of Atherosclerosis and Improves the Intestinal Barrier Function by Reducing Bacterial Endotoxin Translocation in APOE-/- MICE
Authors: Dalal Alghawas, Jetty Lee, Kaisa Poutanen, Hani El-Nezami
Abstract:
Oat β-glucan, a water-soluble non-starch linear polysaccharide, has been approved as a cholesterol-lowering agent by various food safety administrations and is commonly used to reduce the risk of heart disease. The molecular weight of oat β-glucan can vary depending on the extraction and fractionation methods. It is not clear whether the molecular weight has a significant impact on reducing the acceleration of atherosclerosis. The aim of this study was to investigate the effect of three different oat β-glucan fractions on the development of atherosclerosis in vivo, with special focus on plaque stability and intestinal barrier function. To test this, ApoE-/- female mice were fed a high-fat diet supplemented with oat bran, a high molecular weight (HMW) oat β-glucan fraction, or a low molecular weight (LMW) oat β-glucan fraction for 16 weeks. Atherosclerosis risk markers were measured in the plasma, heart, and aortic tree. Plaque size was measured in the aortic root and aortic tree. ICAM-1, VCAM-1, E-selectin, and P-selectin protein levels were assessed in the aortic tree to determine plaque stability at 16 weeks. The expression of p22phox at the aortic root was evaluated to study the NADPH oxidase complex involved in nitric oxide bioavailability and vascular elasticity. The tight junction proteins E-cadherin and β-catenin were analysed by western blot as a test of intestinal barrier function. Plasma LPS, intestinal D-lactate levels, and hepatic FMO gene expression were determined to confirm whether a compromised intestinal barrier led to endotoxemia. The oat bran and HMW oat β-glucan diet groups were more effective than the LMW β-glucan diet group at reducing plaque size and showed marked improvements in plaque stability. The intestinal barrier was compromised in all the experimental groups; however, the endotoxemia levels were higher in the LMW β-glucan diet group. The oat bran and HMW oat β-glucan diet groups were more effective at attenuating the development of atherosclerosis. Reasons for this could be the low viscosity of the LMW oat β-glucan diet in the gut and its inability to block the reabsorption of cholesterol. Furthermore, the low viscosity may allow more bacterial endotoxin translocation through the impaired intestinal barrier. In the future, food technologists should carefully consider how to incorporate LMW oat β-glucan as a health-promoting food.
Keywords: atherosclerosis, beta glucan, endotoxemia, intestinal barrier function
Procedia PDF Downloads 427
10785 The Impact of the Flipped Classroom Instructional Model on MPharm Students in Two Pharmacy Schools in the UK
Authors: Mona Almanasef, Angel Chater, Jane Portlock
Abstract:
Introduction: A 'flipped classroom' uses technology to shift the traditional lecture outside the scheduled class time and uses the face-to-face time to engage students in interactive activities. Aim of the Study: Assess the feasibility, acceptability, and effectiveness of using the 'flipped classroom' teaching format with MPharm students in two pharmacy schools in the UK: UCL School of Pharmacy and the School of Pharmacy and Biomedical Sciences at University of Portsmouth. Methods: An experimental mixed methods design was employed, with final year MPharm students in two phases; 1) a qualitative study using focus groups, 2) a quasi-experiment measuring knowledge acquisition and satisfaction by delivering a session on rheumatoid arthritis, in two teaching formats: the flipped classroom and the traditional lecture. Results: The flipped classroom approach was preferred over the traditional lecture for delivering a pharmacy practice topic, and it was comparable or better than the traditional lecture with respect to knowledge acquisition. In addition, this teaching approach was found to overcome the perceived challenges of the traditional lecture method such as fast pace instructions, student disengagement and boredom due to lack of activities and/or social anxiety. However, high workload and difficult or new concepts could be barriers to pre-class preparation, and therefore successful flipped classroom. The flipped classroom encouraged learning scaffolding where students could benefit from application of knowledge, and interaction with peers and the lecturer, which might, in turn, facilitate learning consolidation and deep understanding. This research indicated that the flipped classroom was beneficial for all learning styles. Conclusion: Implementing the flipped classroom at both pharmacy institutions was successful and well received by final year MPharm students. Given the attention now being put on the Teaching Excellence Framework (TEF), understanding effective methods of teaching to enhance student achievement and satisfaction is now more valuable than ever.Keywords: blended learning, flipped classroom, inverted classroom, pharmacy education
Procedia PDF Downloads 139
10784 A Distributed Mobile Agent Based on Intrusion Detection System for MANET
Authors: Maad Kamal Al-Anni
Abstract:
This study concerns an Artificial Neural Network algorithm, the Multilayer Perceptron (MLP), applied to the classification and clustering of Mobile Ad hoc Network vulnerabilities. A mobile ad hoc network (MANET) consists of ubiquitous, intelligent internetworking devices that can sense their environment through an autonomous system of mobile nodes connected via wireless links. Security is the most important issue in MANETs because penetration scenarios occur easily in such an auto-configuring network. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used as machine learning classifiers, together with a stochastic feature-selection approach (information gain), at classifying malicious behaviors in a simulated network, and we contrast them with different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data is collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behaviors of the framework. Whenever there is any deviation from ordinary behavior, the monitoring agent considers the event an attack. We demonstrate the signature-based IDS approach in a MANET by implementing the backpropagation algorithm over an ensemble-based Traffic Table (TT); the signatures of malicious behaviors or undesirable activities can thus be predicted and efficiently identified. By tuning the parameters of the backpropagation algorithm, the experimental results empirically show its effectiveness, with a detection ratio of up to 98.6 percent. Performance metrics are also included in this article, with Xgraph plots of different measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
Keywords: Intrusion Detection System (IDS), Mobile Adhoc Networks (MANET), Back Propagation Algorithm (BPA), Neural Networks (NN)
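A minimal sketch of the classification step described above is given below, assuming scikit-learn's multilayer perceptron (trained by backpropagation) and mutual information as the information-gain feature selector; the traffic-table features and labels are random placeholders, not the simulated MANET data.

```python
# Hedged sketch: information-gain feature selection + MLP (backpropagation) classifier.
import numpy as np
from sklearn.feature_selection import mutual_info_classif, SelectKBest
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((1000, 12))                   # placeholder traffic-table features
y = (X[:, 0] + X[:, 3] > 1.0).astype(int)    # placeholder labels: 1 = attack

X_sel = SelectKBest(mutual_info_classif, k=6).fit_transform(X, y)   # information gain
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
mlp.fit(X_tr, y_tr)                          # trained via backpropagation
print(classification_report(y_te, mlp.predict(X_te)))
```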
Procedia PDF Downloads 197
10783 Machine Learning Approaches Based on Recency, Frequency, Monetary (RFM) and K-Means for Predicting Electrical Failures and Voltage Reliability in Smart Cities
Authors: Panaya Sudta, Wanchalerm Patanacharoenwong, Prachya Bumrungkun
Abstract:
With the evolution of smart grids, ensuring the reliability and efficiency of electrical systems in smart cities has become crucial. This paper proposes a distinct approach that combines advanced machine learning techniques to accurately predict electrical failures and address voltage reliability issues, with the aim of improving the accuracy and efficiency of reliability evaluations in smart cities. The aim of this research is to develop a comprehensive predictive model that accurately predicts electrical failures and voltage reliability in smart cities. This model integrates RFM analysis, K-means clustering, and LSTM networks to achieve this objective. The research utilizes RFM analysis, traditionally used in customer value assessment, to categorize and analyze electrical components based on their failure recency, frequency, and monetary impact. K-means clustering is employed to segment electrical components into distinct groups with similar characteristics and failure patterns. LSTM networks are used to capture the temporal dependencies and patterns in the data. This integration of RFM, K-means, and LSTM results in a robust predictive tool for electrical failures and voltage reliability. The proposed model has been tested and validated on diverse electrical utility datasets. The results show a significant improvement in prediction accuracy and reliability compared to traditional methods, achieving an accuracy of 92.78% and an F1-score of 0.83. The research addresses the question of how accurately electrical failures and voltage reliability can be predicted in smart cities, and investigates the effectiveness of integrating RFM analysis, K-means clustering, and LSTM networks in achieving this goal. This research contributes to the proactive maintenance and optimization of electrical infrastructures in smart cities and enhances overall energy management and sustainability. The integration of advanced machine learning techniques in the predictive model demonstrates the potential for transforming the landscape of electrical system management within smart cities.
Keywords: electrical state prediction, smart grids, data-driven method, long short-term memory, RFM, k-means, machine learning
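The sketch below is an illustrative, much-simplified version of the three building blocks named above (RFM aggregation, K-means segmentation, and an LSTM failure predictor). Column names, window length, and network size are assumptions, and the toy data stand in for the utility datasets.

```python
# Illustrative RFM + K-means + LSTM sketch (placeholder data, assumed columns).
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
import tensorflow as tf

# --- RFM per electrical component (recency, frequency, monetary impact of failures)
events = pd.DataFrame({"component": ["A", "A", "B", "C", "C", "C"],
                       "days_since_failure": [5, 40, 90, 2, 10, 30],
                       "repair_cost": [120, 80, 300, 50, 70, 60]})
rfm = events.groupby("component").agg(
    recency=("days_since_failure", "min"),
    frequency=("days_since_failure", "size"),
    monetary=("repair_cost", "sum")).reset_index()

# --- K-means segmentation of components by RFM profile
rfm["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    rfm[["recency", "frequency", "monetary"]])

# --- LSTM on voltage windows to predict a failure flag (toy sequences)
X = np.random.rand(200, 24, 1)               # 200 windows of 24 voltage readings
y = (X.mean(axis=(1, 2)) > 0.55).astype(int) # placeholder failure labels
model = tf.keras.Sequential([tf.keras.layers.LSTM(16, input_shape=(24, 1)),
                             tf.keras.layers.Dense(1, activation="sigmoid")])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, verbose=0)
```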
Procedia PDF Downloads 62
10782 Concrete Mixes for Sustainability
Authors: Kristyna Hrabova, Sabina Hüblova, Tomas Vymazal
Abstract:
The structural design of concrete structures results in qualities of structural safety and serviceability, together with durability, robustness, sustainability, and resilience. A sustainable approach is at the heart of the research agenda around the world, and the fib (International Federation for Structural Concrete) commission is also working on the new Model Code 2020. It is now clear that the effects of mechanical and environmental loads, and even social coherence, need to be reflected and included in designing and evaluating structures. This study aimed to present a methodology for the sustainability assessment of various concrete mixtures.
Keywords: concrete, cement, sustainability, Model Code 2020
Procedia PDF Downloads 181
10781 The Detection of Implanted Radioactive Seeds on Ultrasound Images Using Convolution Neural Networks
Authors: Edward Holupka, John Rossman, Tye Morancy, Joseph Aronovitz, Irving Kaplan
Abstract:
A common modality for the treatment of early stage prostate cancer is the implantation of radioactive seeds directly into the prostate. The radioactive seeds are positioned inside the prostate to achieve optimal radiation dose coverage to the prostate. These radioactive seeds are positioned inside the prostate using transrectal ultrasound imaging. Once all of the planned seeds have been implanted, two-dimensional transaxial transrectal ultrasound images separated by 2 mm are obtained throughout the prostate, beginning at the base of the prostate up to and including the apex. A common deep neural network, called DetectNet, was trained to automatically determine the position of the implanted radioactive seeds within the prostate under ultrasound imaging. The network was trained using 950 training ultrasound images and 90 validation ultrasound images. The metrics commonly used to judge successful training were used to evaluate the efficacy and accuracy of the trained deep neural network and resulted in loss_bbox (train) = 0.00, loss_coverage (train) = 1.89e-8, loss_bbox (validation) = 11.84, loss_coverage (validation) = 9.70, mAP (validation) = 66.87%, precision (validation) = 81.07%, and recall (validation) = 82.29%, where train and validation refer to the training and validation image sets, respectively. On the hardware platform used, training took 12.8 seconds per epoch, and the network was trained for over 10,000 epochs. In addition, the seed locations determined by the deep neural network were compared to the seed locations determined by a commercial software package based on a CT acquired one to three months after the implant. The deep learning approach was within 2.29 mm of the seed locations determined by the commercial software. The deep learning approach to the determination of radioactive seed locations is robust, accurate, and fast, and is in close spatial agreement with the gold standard of CT-determined seed coordinates.
Keywords: prostate, deep neural network, seed implant, ultrasound
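A small, hypothetical sketch of the agreement check described above: each network-detected seed is matched to its nearest CT-derived seed and the mean discrepancy is reported. The coordinates below are placeholders, not study data.

```python
# Hedged sketch of the seed-location agreement check (placeholder coordinates).
import numpy as np
from scipy.spatial.distance import cdist

detected = np.array([[10.1, 22.3, 4.0], [15.4, 18.0, 6.1]])   # mm, from the ultrasound DNN
ct_truth = np.array([[9.0, 21.5, 3.2], [16.0, 17.1, 7.0]])    # mm, from post-implant CT

d = cdist(detected, ct_truth)          # pairwise distances between detections and CT seeds
nearest = d.min(axis=1)                # nearest CT seed for each detection
print(f"mean discrepancy: {nearest.mean():.2f} mm")
```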
Procedia PDF Downloads 206
10780 Colorful Ethnoreligious Map of Iraq and the Current Situation of Minorities in the Country
Authors: Meszár Tárik
Abstract:
The aim of the study is to introduce the minority groups living in Iraq and to shed light on their current situation. The Middle East is a rather heterogeneous region in ethnic terms. It includes many ethnic, national, religious, linguistic, or ethnoreligious groups. The relationship between the majority and minority is the main cause of various conflicts in the region. It seems that most of the post-Ottoman states have not yet developed a unified national identity capable of integrating their multi-ethnic societies. The issue of minorities living in the Middle East is highly politicized and controversial, as the various Arab states consider the treatment of minorities as their internal affair, do not recognize discrimination or even deny the existence of any kind of minorities on their territory. This attitude of the Middle Eastern states may also be due to the fact that the minority issue can be abused and can serve as a reference point for the intervention policies of Western countries at any time. Methodologically, the challenges of these groups are perceived through the manifestos of prominent individuals and organizations belonging to minorities. The basic aim is to present the minorities’ own history in dealing with the issue. It also introduces the different ethnic and religious minorities in Iraq and analyzes their situation during the operation of the terrorist organization „Islamic State” and in the aftermath. It is clear that the situation of these communities deteriorated significantly with the advance of ISIS, but it is also clear that even after the expulsion of the militant group, we cannot necessarily report an improvement in this area, especially in terms of the ability of minorities to assert their interests and physical security. The emergence of armed militias involved in the expulsion of ISIS sometimes has extremely negative effects on them. Until the interests of non-Muslims are adequately represented at the local level and in the legislature, most experts and advocates believe that little will change in their situation. When conflicts flare, many Iraqi citizens usually leave Iraq, but because of the poor public security situation (threats from terrorist organizations, interventions by other countries), emigration causes serious problems not only outside the country’s borders but also within the country. Another ominous implication for minorities is that their communities are very slow if ever, to return to their homes after fleeing their own settlements. An important finding of the study is that this phenomenon is changing the face of traditional Iraqi settlements and threatens to plunge groups that have lived there for thousands of years into the abyss of history. Therefore, we not only present the current situation of minorities living in Iraq but also discuss their future possibilities.Keywords: Middle East, Iraq, Islamic State, minorities
Procedia PDF Downloads 92
10779 GPS Signal Correction to Improve Vehicle Location during Experimental Campaign
Authors: L. Della Ragione, G. Meccariello
Abstract:
In recent years, the progress of the automobile industry in Italy in the field of emissions reduction has been remarkable. Nevertheless, the evaluation and reduction of emissions remain a key problem, especially in cities, which account for more than 50% of the world's population. In this paper, we address the problem of describing a quantitative approach for the reconstruction of GPS coordinates and altitude, in the context of a study correlating driving cycles, emissions, and geographical location, carried out during an experimental campaign with instrumented cars.
Keywords: air pollution, driving cycles, GPS signal, vehicle location
Procedia PDF Downloads 432
10778 Sinhala Sign Language to Grammatically Correct Sentences using NLP
Authors: Anjalika Fernando, Banuka Athuraliya
Abstract:
This paper presents a comprehensive approach for converting Sinhala Sign Language (SSL) into grammatically correct sentences using Natural Language Processing (NLP) techniques in real time. While previous studies have explored various aspects of SSL translation, the research gap lies in the absence of grammar checking for SSL. This work aims to bridge this gap by proposing a two-stage methodology that leverages deep learning models to detect signs and translate them into coherent sentences, ensuring grammatical accuracy. The first stage of the approach uses a Long Short-Term Memory (LSTM) deep learning model to recognize and interpret SSL signs. By training the LSTM model on a dataset of SSL gestures, it learns to accurately classify and translate these signs into textual representations. The LSTM model achieves a commendable accuracy rate of 94%, demonstrating its effectiveness in accurately recognizing and translating SSL gestures. Building upon the successful recognition and translation of SSL signs, the second stage of the methodology focuses on improving the grammatical correctness of the translated sentences. The project employs a Neural Machine Translation (NMT) architecture, consisting of an encoder and decoder with LSTM components, to enhance the syntactic structure of the generated sentences. By training the NMT model on a parallel corpus of grammatically incorrect Sinhala sentences and their corresponding grammatically correct translations, it learns to generate coherent and grammatically accurate sentences. The NMT model achieves an impressive accuracy rate of 98%, affirming its capability to produce linguistically sound translations. The proposed approach offers significant contributions to the field of SSL translation and grammar correction. Addressing the critical issue of grammar checking, it enhances the usability and reliability of SSL translation systems, facilitating effective communication between hearing-impaired and non-sign-language users. Furthermore, the integration of deep learning techniques, such as LSTM and NMT, ensures the accuracy and robustness of the translation process. This research holds great potential for practical applications, including educational platforms, accessibility tools, and communication aids for the hearing-impaired. Furthermore, it lays the foundation for future advancements in SSL translation systems, fostering inclusive and equal opportunities for the deaf community. Future work includes expanding the existing datasets to further improve the accuracy and generalization of the SSL translation system. Additionally, the development of a dedicated mobile application would enhance the accessibility and convenience of SSL translation on handheld devices. Furthermore, efforts will be made to enhance the current application for educational purposes, enabling individuals to learn and practice SSL more effectively. Another area of future exploration involves enabling two-way communication, allowing seamless interaction between sign-language users and non-sign-language users. In conclusion, this paper presents a novel approach for converting Sinhala Sign Language gestures into grammatically correct sentences using NLP techniques in real time. The two-stage methodology, comprising an LSTM model for sign detection and translation and an NMT model for grammar correction, achieves high accuracy rates of 94% and 98%, respectively. By addressing the lack of grammar checking in existing SSL translation research, this work contributes significantly to the development of more accurate and reliable SSL translation systems, thereby fostering effective communication and inclusivity for the hearing-impaired community.
Keywords: Sinhala sign language, sign language, NLP, LSTM, NMT
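A minimal sketch of the first stage only (an LSTM classifier over gesture sequences) is shown below; the input shape, number of signs, and training data are illustrative assumptions, and the grammar-correction seq2seq stage is not reproduced here.

```python
# Hypothetical LSTM sign classifier over keypoint sequences (stage 1 only).
import numpy as np
import tensorflow as tf

N_FRAMES, N_FEATURES, N_SIGNS = 30, 126, 20   # assumed: 30 frames of hand keypoints per gesture
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_FRAMES, N_FEATURES)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(N_SIGNS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

X = np.random.rand(256, N_FRAMES, N_FEATURES)   # placeholder gesture sequences
y = np.random.randint(0, N_SIGNS, 256)          # placeholder sign labels
model.fit(X, y, epochs=2, verbose=0)
```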
Procedia PDF Downloads 110
10777 Lexical Semantic Analysis to Support Ontology Modeling of Maintenance Activities– Case Study of Offshore Riser Integrity
Authors: Vahid Ebrahimipour
Abstract:
Word representation and the context meaning of text-based documents play an essential role in knowledge modeling. Business procedures written in natural language are meant to store technical and engineering information, management decisions, and operational experience throughout the production system life cycle. Context meaning representation is highly dependent upon word sense, lexical relativity, and the semantic features of the argument. This paper proposes a method for lexical semantic analysis and context meaning representation of maintenance activities in a mass production system. Our approach constructs a straightforward lexical semantic analysis of the semantic and syntactic features of the context structure of maintenance reports in order to facilitate the translation, interpretation, and conversion of human-readable interpretations into computer-readable representations with less heterogeneity and ambiguity. The methodology enables users to obtain a representation format that maximizes shareability and accessibility for multi-purpose usage. It provides a contextualized structure for obtaining a generic context model that can be utilized during the system life cycle. First, it employs a co-occurrence-based clustering framework to recognize groups of highly frequent contextual features that correspond to a maintenance report text. Then the keywords are identified for syntactic and semantic extraction analysis. The analysis applies causality-driven logic to the keywords' senses to reveal the structural and meaning dependency relationships between the words in a context. The output is a contextualized word representation of maintenance activities accommodating computer-based representation and inference using OWL/RDF.
Keywords: lexical semantic analysis, metadata modeling, contextual meaning extraction, ontology modeling, knowledge representation
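The sketch below loosely illustrates two of the steps named above: building a term co-occurrence count over toy maintenance-report sentences and emitting frequent pairs as RDF triples with rdflib. The vocabulary, threshold, predicate name, and namespace are assumptions, not the paper's model.

```python
# Hypothetical co-occurrence extraction and RDF emission sketch.
from collections import Counter
from itertools import combinations
from rdflib import Graph, Namespace

reports = ["replace riser clamp corrosion found",
           "inspect riser weld corrosion suspected",
           "replace clamp bolt"]

cooc = Counter()
for sentence in reports:
    for a, b in combinations(sorted(set(sentence.split())), 2):
        cooc[(a, b)] += 1                        # co-occurrence count within one report

EX = Namespace("http://example.org/maintenance#")  # assumed namespace
g = Graph()
for (a, b), n in cooc.items():
    if n >= 2:                                   # keep only frequent contextual pairs
        g.add((EX[a], EX.coOccursWith, EX[b]))
print(g.serialize(format="turtle"))
```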
Procedia PDF Downloads 112
10776 Hybrid CNN-SAR and Lee Filtering for Enhanced InSAR Phase Unwrapping and Coherence Optimization
Authors: Hadj Sahraoui Omar, Kebir Lahcen Wahib, Bennia Ahmed
Abstract:
Interferometric Synthetic Aperture Radar (InSAR) coherence is a crucial parameter for accurately monitoring ground deformation and environmental changes. However, coherence can be degraded by various factors such as temporal decorrelation, atmospheric disturbances, and geometric misalignments, limiting the reliability of InSAR measurements (Omar Hadj‐Sahraoui and al. 2019). To address this challenge, we propose an innovative hybrid approach that combines artificial intelligence (AI) with advanced filtering techniques to optimize interferometric coherence in InSAR data. Specifically, we introduce a Convolutional Neural Network (CNN) integrated with the Lee filter to enhance the performance of radar interferometry. This hybrid method leverages the strength of CNNs to automatically identify and mitigate the primary sources of decorrelation, while the Lee filter effectively reduces speckle noise, improving the overall quality of interferograms. We develop a deep learning-based model trained on multi-temporal and multi-frequency SAR datasets, enabling it to predict coherence patterns and enhance low-coherence regions. This hybrid CNN-SAR with Lee filtering significantly reduces noise and phase unwrapping errors, leading to more precise deformation maps. Experimental results demonstrate that our approach improves coherence by up to 30% compared to traditional filtering techniques, making it a robust solution for challenging scenarios such as urban environments, vegetated areas, and rapidly changing landscapes. Our method has potential applications in geohazard monitoring, urban planning, and environmental studies, offering a new avenue for enhancing InSAR data reliability through AI-powered optimization combined with robust filtering techniques.Keywords: CNN-SAR, Lee Filter, hybrid optimization, coherence, InSAR phase unwrapping, speckle noise reduction
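For reference, a minimal version of the classical Lee speckle filter used alongside the CNN is sketched below; the window size and the global noise estimate are assumptions, and the CNN stage itself is not shown.

```python
# Minimal Lee speckle filter sketch (assumed 7x7 window, crude noise estimate).
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    var = sq_mean - mean ** 2                  # local variance in each window
    noise_var = var.mean()                     # crude global speckle-noise estimate
    gain = var / (var + noise_var + 1e-12)     # Lee weighting factor
    return mean + gain * (img - mean)

speckled = np.random.gamma(shape=1.0, scale=1.0, size=(128, 128))  # toy SAR amplitude
filtered = lee_filter(speckled)
```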
Procedia PDF Downloads 16
10775 Enhancing Project Management Performance in Prefabricated Building Construction under Uncertainty: A Comprehensive Approach
Authors: Niyongabo Elyse
Abstract:
Prefabricated building construction is a pioneering approach that combines design, production, and assembly to attain energy efficiency, environmental sustainability, and economic feasibility. Despite continuous development in the industry in China, the low technical maturity of standardized design, factory production, and construction assembly introduces uncertainties affecting prefabricated component production and on-site assembly processes. This research focuses on enhancing project management performance under uncertainty to help enterprises navigate these challenges and optimize project resources. The study introduces a perspective on how uncertain factors influence the implementation of prefabricated building construction projects. It proposes a theoretical model considering project process management ability, adaptability to uncertain environments, and collaboration ability of project participants. The impact of uncertain factors is demonstrated through case studies and quantitative analysis, revealing constraints on implementation time, cost, quality, and safety. To address uncertainties in prefabricated component production scheduling, a fuzzy model is presented, expressing processing times in interval values. The model utilizes a cooperative co-evolution evolution algorithm (CCEA) to optimize scheduling, demonstrated through a real case study showcasing reduced project duration and minimized effects of processing time disturbances. Additionally, the research addresses on-site assembly construction scheduling, considering the relationship between task processing times and assigned resources. A multi-objective model with fuzzy activity durations is proposed, employing a hybrid cooperative co-evolution evolution algorithm (HCCEA) to optimize project scheduling. Results from real case studies indicate improved project performance in terms of duration, cost, and resilience to processing time delays and resource changes. The study also introduces a multistage dynamic process control model, utilizing IoT technology for real-time monitoring during component production and construction assembly. This approach dynamically adjusts schedules when constraints arise, leading to enhanced project management performance, as demonstrated in a real prefabricated housing project. Key contributions include a fuzzy prefabricated components production scheduling model, a multi-objective multi-mode resource-constrained construction project scheduling model with fuzzy activity durations, a multi-stage dynamic process control model, and a cooperative co-evolution evolution algorithm. The integrated mathematical model addresses the complexity of prefabricated building construction project management, providing a theoretical foundation for practical decision-making in the field.Keywords: prefabricated construction, project management performance, uncertainty, fuzzy scheduling
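As a rough illustration of handling fuzzy activity durations, the sketch below adds triangular fuzzy durations along an assumed serial chain of prefabrication tasks and defuzzifies the total by a simple centroid; the task list, durations, and precedence structure are hypothetical, and the evolutionary optimization itself is not reproduced.

```python
# Hypothetical triangular fuzzy duration sketch for a serial task chain.
def tri_add(a, b):
    return tuple(x + y for x, y in zip(a, b))        # (low, mode, high) addition

def centroid(t):
    return sum(t) / 3.0                               # simple centroid defuzzification

tasks = {"panel_production": (4, 5, 7),               # days, assumed
         "transport":        (1, 1, 2),
         "on_site_assembly": (3, 4, 6)}

total = (0, 0, 0)
for dur in tasks.values():                            # serial precedence assumed
    total = tri_add(total, dur)

print("fuzzy project duration:", total, "~=", round(centroid(total), 1), "days")
```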
Procedia PDF Downloads 55
10774 Prevalence and Intensity of Soil Transmitted Helminth Infections among the School Children in the State of Uttar Pradesh, India
Authors: Prasanta Saini, Junaid Jibran Jawed, Subrata Majumdar
Abstract:
Infections caused by soil-transmitted helminths (STH) are a major problem in all nations of the world. The major focus of STH research is to study the prevalence of three major helminths: Ascaris, Trichuris, and hookworm. Here we report the prevalence and intensity of STH in the school children of the state of Uttar Pradesh, India. The aim of the study is to assess the prevalence and risk factors of STH infection among school children aged 5-10 years in 27 randomly selected districts covering the nine agro-climatic zones of Uttar Pradesh, India. For this cross-sectional survey, we selected populations of government primary school children in Uttar Pradesh, with sampling performed across the nine agro-climatic zones. Every individual in the study population provided daily information on a questionnaire form, and samples were then collected and processed by the Kato-Katz method following WHO guidelines. A total of 6,421 children from 130 schools were surveyed. Infection with any soil-transmitted helminth was detected in 4,578 children, with an overall prevalence of 75.6% (95% CI: 65.3-83.6). In this population, the prevalence of Ascaris was 69.6% (95% CI: 57.97-79.11), of hookworm 22.7% (95% CI: 19.3-26.3), and of Trichuris sp. 4.6% (95% CI: 0.8-21.6); the predicted prevalence map therefore indicates that STH infection is hyperendemic in this state. The findings of our survey in 130 schools covering the nine agro-climatic zones show infection with one or more soil-transmitted helminths. The majority of STH infections were of light intensity. STH infection was hyper-endemic in the entire state, except three zones in western Uttar Pradesh. High prevalence (> 75%) in all age groups also indicates little impact of existing deworming initiatives, including those among pre-school aged children. WHO recommends annual treatment in areas where STH prevalence is between 20% and 50%, and bi-annual treatment in areas with prevalence rates of over 50%. In view of the high prevalence of STH infection in Uttar Pradesh, it is strongly recommended to initiate a deworming programme for school children in the state. Although our survey was among primary school children, the high prevalence among children aged 4-6 years also indicates the need to strengthen the existing deworming programs for pre-school children. Extending the benefits of deworming to pre-school children through deworming in Anganwadi schools would further reduce the load of infection in the community. As a long-term solution for the control of STH infection, it is also necessary to improve sanitation levels in the area, as the majority of houses did not have latrines and most of the children were defecating in open fields, a factor that was found to be significantly associated with STH infection.
Keywords: prevalence, school going children, soil transmitted helminths, Uttar Pradesh-India
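The WHO decision rule cited above can be written as a short check; the thresholds follow the text, and the prevalence values are the survey's own point estimates.

```python
# WHO deworming-frequency rule as stated in the text, applied to the survey's estimates.
def deworming_frequency(prevalence_pct):
    if prevalence_pct >= 50:
        return "bi-annual treatment"
    if prevalence_pct >= 20:
        return "annual treatment"
    return "no mass treatment indicated"

for name, prev in {"any STH": 75.6, "Ascaris": 69.6, "hookworm": 22.7, "Trichuris": 4.6}.items():
    print(f"{name}: {prev}% -> {deworming_frequency(prev)}")
```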
Procedia PDF Downloads 280
10773 Identification of Blood Biomarkers Unveiling Early Alzheimer's Disease Diagnosis Through Single-Cell RNA Sequencing Data and Autoencoders
Authors: Hediyeh Talebi, Shokoofeh Ghiam, Changiz Eslahchi
Abstract:
Traditionally, Alzheimer's disease research has focused on genes with large fold changes, potentially neglecting subtle but biologically important alterations. Our study introduces an integrative approach that highlights genes crucial to the underlying biological processes, regardless of their fold-change magnitude. Single-cell RNA-seq data of peripheral blood mononuclear cells (PBMCs) from Alzheimer's patients were extracted from the Gene Expression Omnibus (GEO). After quality control, normalization, scaling, batch-effect correction, and clustering, differentially expressed genes (DEGs) were identified with adjusted p-values less than 0.05. These DEGs were categorized by cell type, resulting in four datasets, each corresponding to a distinct cell type. To distinguish between cells from healthy individuals and those with Alzheimer's, an adversarial autoencoder with a classifier was employed, allowing healthy and diseased samples to be separated. To identify the genes most influential in this classification, the weight matrices of the encoder and classifier components of the network were multiplied, and the analysis focused on the top 20 genes. The analysis revealed that while some of these genes exhibit a high fold change, others do not. These genes, which may be overlooked by previous methods because of their low fold change, were shown to be significant in our study. The findings highlight the critical role of genes with subtle alterations in diagnosing Alzheimer's disease, a facet frequently overlooked by conventional methods. These genes demonstrate remarkable discriminatory power, underscoring the need to integrate biological relevance with statistical measures in gene prioritization. This integrative approach enhances our understanding of the molecular mechanisms in Alzheimer's disease and provides a promising direction for identifying potential therapeutic targets.Keywords: alzheimer's disease, single-cell RNA-seq, neural networks, blood biomarkers
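The gene-ranking step can be sketched as follows: a minimal Python/NumPy illustration under the assumption of a single linear encoder layer and a linear classifier head; the real adversarial autoencoder has more layers, and the shapes and random weights below are placeholders rather than the authors' trained model.

import numpy as np

rng = np.random.default_rng(0)
n_genes, n_latent = 2000, 32

W_enc = rng.normal(size=(n_genes, n_latent))  # encoder weights: genes -> latent space
w_clf = rng.normal(size=(n_latent,))          # classifier weights: latent -> AD-vs-healthy logit

# Combine encoder and classifier weights and rank input genes by the
# magnitude of their contribution to the classification decision.
gene_scores = np.abs(W_enc @ w_clf)
top20 = np.argsort(gene_scores)[::-1][:20]
print(top20)

Genes with large combined weights can discriminate the two classes even when their fold change is small, which is the point the abstract makes.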
Procedia PDF Downloads 70
10772 Pragmatic Interpretation in Translated Texts
Authors: Jamal Alqinai
Abstract:
A pragmatic approach to translation studies the rules and principles governing the use of language over and above the rules of syntax or morphology, and what makes some uses of language more appropriate than others in [communicative] situations. It attempts to explain translation as a procedure and product from the point of view of how, why and what is done by the source text (ST) author and what is to be done in the target text (TT) rendition. The latter will be subject to evaluation not as generated by the linguistic system but as conveyed and manipulated by participants in a communicative situation according to the referential and pragmatic standards employed. The failure of a purely lexical or structural translation stems from ignoring the relation between words as signs and the effect they have on their users. A more refined approach would also consider those processes that are sometimes labeled extra-linguistic or intuitive and which translators strive to reproduce unscathed in the translation process. We need to grasp the kind of actions an ST author performs on his readers by combining linguistic and non-linguistic elements against a backdrop of beliefs and cultural values. In other words, aside from considering the cohesive ties at the textual level, one needs to understand how the whole ST discourse hangs together logically in order to reproduce a coherent TT. The latter can only be achieved by an analysis of the pragmatic elements of presuppositions, implicatures and acts performed in the ST. Establishing cohesive ties within a text may require seeking reference outside the immediate text. The illocutionary functions manifested in one language/culture are relatively autonomous cultural/linguistic categories, but are imaginable by members of other cultures and, to some extent, are translatable, though not, of course, without translation loss. Globalization and the spread of literacy worldwide may have created a universal empathy to comprehend the performative aspect of utterances when explained by approximate glosses or by paraphrase. Yet, it is often the multilayered and culture-specific nature of illocutionary functions that de-universalizes their possible interpretations. This paper addresses the pragmatic interpretation of culturally specific texts with examples adduced from a number of distinct settings to illustrate the influence of the pragmatic factors at stake.Keywords: pragmatic, presupposition, implicature, cohesion
Procedia PDF Downloads 16
10771 Spatial and Temporal Analysis of Forest Cover Change with Special Reference to Anthropogenic Activities in Kullu Valley, North-Western Indian Himalayan Region
Authors: Krisala Joshi, Sayanta Ghosh, Renu Lata, Jagdish C. Kuniyal
Abstract:
Throughout the world, monitoring and estimating the changing pattern of forests across diverse landscapes through remote sensing is instrumental in understanding the interactions of human activities and the ecological environment with the changing climate. Forest change detection using satellite imagery has emerged as an important means of gathering information on a regional scale. Kullu valley in Himachal Pradesh, India, is situated in a transitional zone between the lesser and the greater Himalayas. It thus presents a typical rugged mountainous terrain of moderate to high altitude, varying from 1200 meters to over 6000 meters. Due to changes in agricultural cropping patterns, urbanization, industrialization, hydropower generation, climate change, tourism, and anthropogenic forest fire, it has undergone a tremendous transformation in forest cover in the past three decades. The loss and degradation of forest cover result in soil erosion, loss of biodiversity including damage to wildlife habitats, degradation of watershed areas, and deterioration of the overall quality of nature and life. Supervised classification of LANDSAT satellite data was performed to assess the changes in forest cover in Kullu valley over the years 2000 to 2020. The Normalized Burn Ratio (NBR) was calculated to discriminate between burned and unburned areas of the forest. Our study reveals that in Kullu valley the number of forest fire incidents, specifically those due to anthropogenic activities, has risen each subsequent year. The main objective of the present study is, therefore, to estimate the change in the forest cover of Kullu valley and to address the various social aspects responsible for the anthropogenic forest fires, as well as to assess their impact on the significant changes in regional climatic factors, specifically temperature, humidity, and precipitation, over three decades, with the help of satellite imagery and ground data. We believe the main outcome of the paper will help the administration make a quantitative assessment of forest cover changes due to anthropogenic activities and devise long-term measures for creating awareness among the local people of the area.Keywords: anthropogenic activities, forest change detection, normalized burn ratio (NBR), supervised classification
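For reference, the NBR calculation mentioned above can be sketched in a few lines of Python/NumPy. Band choice is shown for Landsat 8, where NIR is band 5 and SWIR2 is band 7; earlier Landsat sensors use different band numbers, and the reflectance values and dNBR threshold below are illustrative assumptions, not values from the Kullu valley study.

import numpy as np

def nbr(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    # Normalized Burn Ratio: high over healthy vegetation, low over burned areas.
    return (nir - swir) / (nir + swir + 1e-10)  # epsilon guards against division by zero

# Hypothetical pre- and post-fire surface reflectance for three pixels.
nir_pre,  swir_pre  = np.array([0.42, 0.38, 0.45]), np.array([0.18, 0.20, 0.17])
nir_post, swir_post = np.array([0.22, 0.36, 0.21]), np.array([0.30, 0.21, 0.33])

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)  # pre-fire NBR minus post-fire NBR
burned = dnbr > 0.27  # an often-quoted dNBR threshold for at least moderate burn severity
print(dnbr.round(2), burned)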
Procedia PDF Downloads 175
10770 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation, and open pit mine designs. However, the design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to calculate PF automatically. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, the shear strength of geologic structure, and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation, and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated 'approximately' or with allowances for some variability rather than 'exactly'. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit. A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods and obtain markedly different results. Ultimately, sound engineering judgment and logic are often required to decipher the true meaning and significance (if any) of some PF results.Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability
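The Monte-Carlo route to PF can be illustrated with a minimal Python sketch for a simple dry planar failure, where FS = (cA + W cos(psi) tan(phi)) / (W sin(psi)); the input distributions and geometry below are illustrative assumptions, not values from the Western Australian case study, and any limit-equilibrium or numerical FS calculation could be substituted for the closed-form expression.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

c   = rng.normal(25.0, 5.0, n) * 1e3        # cohesion (Pa): mean 25 kPa, sd 5 kPa
phi = np.radians(rng.normal(35.0, 3.0, n))  # friction angle (degrees -> radians)
psi = np.radians(45.0)                      # dip of sliding plane, treated deterministically
A   = 60.0                                  # sliding plane area per metre of slope (m^2)
W   = 2.4e6                                 # block weight per metre of slope (N)

fs = (c * A + W * np.cos(psi) * np.tan(phi)) / (W * np.sin(psi))
pf = np.mean(fs < 1.0)  # PF = fraction of realisations with FS below unity
print(f"Mean FS = {fs.mean():.2f}, PF = {pf:.2%}")

The same loop works with Latin-Hypercube samples, and the choice of which inputs to treat as random versus deterministic is exactly the judgment call the paper discusses.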
Procedia PDF Downloads 210
10769 Heroic Villains: An Exploration of the Use of Narrative Plotlines and Emerging Identities within Recovery Stories of Former Substance Abusers
Authors: Tria Moore, Aimee Walker-Clarke
Abstract:
The purpose of the study was to develop a deeper understanding of how self-identity is negotiated and reconstructed by people in recovery from substance abuse. The approach draws on the notion that self-identity is constructed through stories. Specifically, dominant narratives of substance abuse involve the 'addict identity', in which the meaning of being an addict is constructed through social interaction and informed by broader social meanings of substance misuse, which are considered deviant. The addict is typically understood as out of control, weak, and feckless. Users may unconsciously embody this addict identity, which makes recovery less likely. Typical approaches to treatment employ the notion that recovery is much more likely when users change the way they think and feel about themselves by assembling a new identity. Recovery, therefore, involves a reconstruction of the self in a new light, which may mean rejecting a part of the self (the addict identity). One limitation is that previous research on this topic has been quantitative, which, while useful, tells us little about how this process is best managed. Should one, for example, reject the past addict identity completely and move on to the new identity, or is it more effective to accept the past identity and use it in the formation of the new non-user identity? The purpose of this research, then, is to explore how addicts in recovery have managed the transition between their past and current selves and whether this may inform therapeutic practice. Using a narrative approach, data were analyzed from five in-depth interviews with former addicts who had been abstinent for at least a year and who were in some form of volunteering role at substance treatment services in the UK. Although participants identified with a previous 'addict identity' and made efforts to disassociate themselves from it, they also recognized that acceptance was an important part of reconstructing their new identity. The participants' narratives used familiar plot lines to structure their accounts, positioning themselves as the heroes of their own stories rather than as victims of circumstance. Instead of rejecting their former addict identity, which would mean rejecting a part of the self, participants used their experience in a reconstructive and restorative way. The findings suggest that encouraging people to tell their story and accept their addict identity are important factors in successful recovery.Keywords: addiction, identity, narrative, recovery, substance abuse
Procedia PDF Downloads 307
10768 Reduce the Environmental Impacts of the Intensive Use of Glass in New Buildings in Khartoum, Sudan
Authors: Sawsan Domi
Abstract:
Khartoum is considered one of the hottest cities in the world; the mean monthly outdoor temperature remains above 30 ºC, and solar radiation on building surfaces is among the highest in the world. Buildings in Khartoum receive huge amounts of solar radiation (watts/m2), and northern, eastern, and western facades always receive more than the southern ones; these facades must therefore be better protected than the others. Among the most important design limits affecting indoor thermal comfort and energy conservation are building envelope design, self-efficiency in building materials, and the optical and thermo-physical properties of the building envelope. A small sun-facing glazing area is very important for thermal comfort in hot dry climates because of the intense sunshine. This study aims to propose a work plan to help minimize the negative environmental effect of the climate on buildings, given the intensive use of glazing. In the last 15 years, the building sector in Khartoum has grown rapidly, accompanied by many misguided strategies that move away from environmentally friendly design. The intensive use of glazing on facades has increased for commercial, industrial, and design reasons, while glass envelopes lead to rapid temperature rises and reflect the sun onto faces, cars, and bodies. Transparency through glass gives a sense of open space, allows natural lighting and sometimes natural ventilation, and keeps dust and insects out; on the other hand, it costs more and causes more overheating, which is unsuitable for a hot dry climate city like Khartoum. Many huge projects are permitted every year by the Ministry of Planning in Khartoum State with designs based on the intensive use of glazing on facades. There are no laws or regulations controlling the use of materials in construction; the last building code (Building Code 2008, Khartoum State) focused only on using sustainable materials, without consideration of any environmental aspects. The results of the study will help raise awareness of this environmental problem among architects, engineers, and the public. Objectives range from improving energy performance in buildings to providing high levels of thermal comfort in the indoor environment. As a future project, the study asks what changes could be made to building permit codes and regulations; recommendations for the governmental sector could include obliging the responsible authorities to issue environmentally friendly laws for building construction and supporting the renewable energy sector in buildings.Keywords: building envelope, building regulations, glazed facades, solar radiation
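To illustrate why glazing area and specification matter so much in this climate, here is a minimal Python sketch of instantaneous solar heat gain through glazing using the simplified relation Q = A x SHGC x I; the irradiance, areas, and SHGC values are illustrative assumptions, not measurements from Khartoum.

def solar_gain(area_m2: float, shgc: float, irradiance_w_m2: float) -> float:
    # Simplified instantaneous solar heat gain through glazing: Q = A * SHGC * I.
    return area_m2 * shgc * irradiance_w_m2

large_clear = solar_gain(area_m2=20.0, shgc=0.82, irradiance_w_m2=700.0)  # large single-clear west facade
small_low_e = solar_gain(area_m2=6.0,  shgc=0.35, irradiance_w_m2=700.0)  # small, low-e, partly shaded opening

print(f"{large_clear:.0f} W vs {small_low_e:.0f} W of solar heat gain")

Even this crude comparison shows nearly an order-of-magnitude difference in cooling load between a fully glazed facade and a modest, well-specified opening.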
Procedia PDF Downloads 224
10767 Turkey at the End of the Second Decade of the 21st Century: A Secular or Religious Country?
Authors: Francesco Pisano
Abstract:
Islam has been an important topic in Turkey's institutional identity. At the dawn of the Turkish Republic, at the end of the First World War, the new Turkish leadership had to deal with the religious heritage of the Sultanate. Mustafa Kemal Ataturk, Turkey's first President, led the country through a process of internal change, substantially modifying not merely its democratic stance but also the way politics addressed the Muslim faith. Islam was banned from the public sector of society and drastically marginalized to the private sphere of citizens' lives. Headscarves were banned from institutional buildings, together with other religious practices, while the country proceeded down a path of secularism and Westernization. This is demonstrated by the fact that even a newly elected Prime Minister, Recep Tayyip Erdoğan, was initially barred from taking the institutional position because of allegations that he had read a religious text while campaigning. Over the years, thanks to this initial internal shift, Turkey has often been seen by Western partners as one of the few countries that had managed to find a perfect balance between a democratic stance and an Islamic inherent nature. In the early 2000s, this led many academics to believe that Ankara could eventually become the next European capital. Since then, the internal and external landscape of Turkey has drastically changed. Today, religion has once again become an important point of reference for Turkish politics, given the failure of the European negotiations and the country's increasingly unstable external environment. This paper addresses this issue, looking at the important role religion has played in Turkish society and the way it has been politicized since the early years of the Republic. It moves from a more theoretical debate on secularism and the path of political Westernization of Turkey under Ataturk's rule to a more practical analysis of today's situation, passing through the failure of Ankara's accession to the EU and the current tense political relations with its traditional NATO allies. The final objective of this research, therefore, is not to offer a meticulous opinion on Turkey's current international stance; that is left entirely to the personal consideration of the reader. Rather, it supplements the existing literature with a comprehensive and more structured analysis of the role Islam has played in Turkish politics from the early 1920s up until the domestic political revolution of the early 2000s, after the first electoral win of the Justice and Development Party (AKP).Keywords: democracy, Islam, Mustafa Kemal Atatürk, Recep Tayyip Erdoğan, Turkey
Procedia PDF Downloads 212