Search results for: robust estimators
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1554

804 The Application of Insects in Forensic Investigations

Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Forensic entomology is the study and analysis of insect evidence to aid criminal investigation. Knowledge of the distribution, biology, ecology and behavior of the insects found at a crime scene can provide information about when, where and how the crime was committed. It has many applications in criminal investigations; its main use is the estimation of the minimum time since death in suspicious deaths. The close association between insects and corpses, and the use of insects in criminal investigations, is the subject of forensic entomology. Because insects colonize a decomposing corpse and lay eggs on it from the initial stages, forensic scientists can estimate the post-mortem interval by studying the insect population and the developing larval stages. In addition, toxicological and molecular studies of these insects can reveal the cause of death or even the identity of a victim; they can also be used to detect drugs and poisons and to determine the location of an incident. Recent techniques make it possible for experts to gather robust entomological evidence, which can provide vital information about death, corpse movement or burial, submersion interval, time of decapitation, identification of specific sites of trauma, post-mortem artefacts on the body, use of drugs, linking a suspect to the scene of a crime, sexual molestation and the identification of suspects.

Keywords: Forensic entomology, post mortem interval, insects, larvae

Procedia PDF Downloads 503
803 Social Entrepreneurship and Inclusive Growth

Authors: Sudheer Gupta

Abstract:

Approximately 4 billion citizens of the world live on the equivalent of less than $8 a day. This segment constitutes a $5 trillion global market that remains under-served. Multinational corporations have historically tended to focus their innovation efforts on the upper segments of the economic pyramid. The academic literature has also been dominated by theories and frameworks of innovation that are valid when applied to developed markets and consumer segments, but fail to adequately account for the challenges and realities of new product and service creation for the poor. Theories of entrepreneurship developed in the context of developed markets similarly ignore the challenges and realities of operating in developing economies, which can be characterized by missing institutions, missing markets, information and infrastructural challenges, and resource constraints. Social entrepreneurs working in such contexts develop solutions differently. In this talk, we summarize lessons learnt from a long-term research project that involves data collection from a broad range of social entrepreneurs in developing countries working towards solutions to alleviate poverty, and grounded theory-building efforts. We aim to develop a better understanding of consumer, producer, and other stakeholder involvement, thus laying the foundation for a robust theory of innovation and entrepreneurship for the poor.

Keywords: poverty alleviation, social enterprise, social innovation, development

Procedia PDF Downloads 399
802 Brain Tumor Detection and Classification Using Pre-Trained Deep Learning Models

Authors: Aditya Karade, Sharada Falane, Dhananjay Deshmukh, Vijaykumar Mantri

Abstract:

Brain tumours pose a significant challenge in healthcare due to their complex nature and impact on patient outcomes. The application of deep learning (DL) algorithms in medical imaging has shown promise for accurate and efficient brain tumour detection. This paper explores the performance of various pre-trained DL models (ResNet50, Xception, InceptionV3, EfficientNetB0, DenseNet121, NASNetMobile, VGG19, VGG16, and MobileNet) on a brain tumour dataset sourced from Figshare. The dataset consists of MRI scans categorized into different types of brain tumours: meningioma, pituitary, glioma, and no tumour. The study involves a comprehensive evaluation of these models’ accuracy and effectiveness in classifying brain tumour images. Data preprocessing, augmentation, and fine-tuning techniques are employed to optimize model performance. Among the evaluated deep learning models, ResNet50 emerges as the top performer with an accuracy of 98.86%. Following closely is Xception, exhibiting a strong accuracy of 97.33%. These models showcase robust capabilities in accurately classifying brain tumour images. At the other end of the spectrum, VGG16 trails with the lowest accuracy at 89.02%.
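
A minimal transfer-learning sketch of the kind of fine-tuning described above, assuming a TensorFlow/Keras environment and a hypothetical directory of MRI images sorted into the four classes; the dataset path, layer sizes and hyperparameters are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch: fine-tuning a pre-trained ResNet50 for 4-class brain-tumour MRI
# classification. The dataset path and hyperparameters are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 4  # meningioma, pituitary, glioma, no tumour

train_ds = tf.keras.utils.image_dataset_from_directory(
    "mri_dataset/train", image_size=IMG_SIZE, batch_size=32)

base = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False  # freeze ImageNet weights for the first training phase

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.resnet50.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.3)(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Optional fine-tuning phase: unfreeze the backbone with a small learning rate.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=3)
```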

Keywords: brain tumour, MRI image, detecting and classifying tumour, pre-trained models, transfer learning, image segmentation, data augmentation

Procedia PDF Downloads 74
801 Wearable Interface for Telepresence in Robotics

Authors: Uriel Martinez-Hernandez, Luke W. Boorman, Hamideh Kerdegari, Tony J. Prescott

Abstract:

In this paper, we present an architecture for the study of telepresence, immersion and human-robot interaction. The architecture is built around a wearable interface, developed here, that provides the human with visual, audio and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, such as vision with gaze control, and tactile feedback. This not only allows for a straightforward integration of multiple sensory modalities, but also offers a more complete immersion experience for the human. These systems are integrated, controlled and synchronised by an architecture developed for telepresence and human-robot interaction. Our wearable interface allows human participants to observe and explore a remote location, while also being able to communicate verbally with humans located in the remote environment. Our approach has been tested at local, domestic and business venues, using wired, wireless and Internet-based connections. This has involved the implementation of data compression to maintain data quality and improve the immersion experience. Initial testing has shown the wearable interface to be robust. The system will endow humans with the ability to explore and interact with other humans at remote locations using multiple sensing modalities.

Keywords: telepresence, telerobotics, human-robot interaction, virtual reality

Procedia PDF Downloads 290
800 Robust Quantum Image Encryption Algorithm Leveraging 3D-BNM Chaotic Maps and Controlled Qubit-Level Operations

Authors: Vivek Verma, Sanjeev Kumar

Abstract:

This study presents a novel quantum image encryption algorithm, using a 3D chaotic map and controlled qubit-level scrambling operations. The newly proposed 3D-BNM chaotic map effectively reduces the degradation of chaotic dynamics resulting from the finite word length effect. It facilitates the generation of highly unpredictable random sequences and enhances chaotic performance. The system’s efficacy is additionally enhanced by the inclusion of a SHA-256 hash function. Initially, classical plain images are converted into their quantum equivalents using the Novel Enhanced Quantum Representation (NEQR) model. The Generalized Quantum Arnold Transformation (GQAT) is then applied to disrupt the coordinate information of the quantum image. Subsequently, to diffuse the pixel values of the scrambled image, XOR operations are performed using pseudorandom sequences generated by the 3D-BNM chaotic map. Furthermore, to enhance the randomness and reduce the correlation among the pixels in the resulting cipher image, a controlled qubit-level scrambling operation is employed. The encryption process utilizes fundamental quantum gates such as C-NOT and CCNOT. Both theoretical and numerical simulations validate the effectiveness of the proposed algorithm against various statistical and differential attacks. Moreover, the proposed encryption algorithm operates with low computational complexity.
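
A loose classical sketch of the XOR-diffusion stage only, under clearly stated assumptions: the paper's 3D-BNM map is not publicly specified, so a plain logistic map stands in as the keystream source, the SHA-256 step is reduced to seeding the initial condition, and the quantum NEQR/GQAT/qubit-level stages are not modelled.

```python
# Hedged classical sketch of chaotic XOR diffusion; the logistic map is a stand-in
# for the 3D-BNM map, and quantum stages (NEQR, GQAT, qubit scrambling) are omitted.
import hashlib
import numpy as np

def keystream(length, x0=0.6123, r=3.99):
    """Generate a pseudorandom byte sequence from a logistic map (stand-in)."""
    xs = np.empty(length)
    x = x0
    for i in range(length):
        x = r * x * (1.0 - x)
        xs[i] = x
    return (xs * 255).astype(np.uint8)

def diffuse(image, key_phrase=b"secret"):
    """XOR each pixel with a chaotic keystream seeded via SHA-256 of a key phrase."""
    digest = hashlib.sha256(key_phrase).digest()
    x0 = 0.1 + (digest[0] / 255.0) * 0.8      # derive an initial condition in (0.1, 0.9)
    flat = image.flatten()
    ks = keystream(flat.size, x0=x0)
    return (flat ^ ks).reshape(image.shape)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)   # toy 8x8 grayscale image
cipher = diffuse(img)
plain = diffuse(cipher)                                    # XOR diffusion is its own inverse
assert np.array_equal(plain, img)
```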

Keywords: 3D Chaotic map, SHA-256, quantum image encryption, Qubit level scrambling, NEQR

Procedia PDF Downloads 11
799 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment

Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar

Abstract:

Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment. The methodology is based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data, and the SelectKBest strategy is used to select the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting and SVM, are investigated to evaluate the performance of the system and accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
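
A compact sketch of the described pipeline (windowing, simple statistical features, SelectKBest selection, and a few standard classifiers), assuming synthetic stand-in data rather than the benchmark dataset; the window length, feature set and hyperparameters are illustrative.

```python
# Hedged sketch: windowing -> statistical features -> SelectKBest -> classifiers.
# The IMU data below is random stand-in data, not the benchmark dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def window_features(signal, fs=50, win_s=2.0):
    """Slice a (samples, channels) IMU signal into windows of simple statistics."""
    step = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - step + 1, step):
        w = signal[start:start + step]
        feats.append(np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)]))
    return np.array(feats)

rng = np.random.default_rng(0)
X = window_features(rng.normal(size=(20000, 3)))   # toy 3-axis accelerometer stream
y = rng.integers(0, 4, size=len(X))                # 4 placeholder activity labels

for name, clf in [("logreg", LogisticRegression(max_iter=1000)),
                  ("rf", RandomForestClassifier()),
                  ("gb", GradientBoostingClassifier()),
                  ("svm", SVC())]:
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), clf)
    print(name, round(cross_val_score(pipe, X, y, cv=3).mean(), 3))
```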

Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors

Procedia PDF Downloads 11
798 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

Authors: Ruchika Malhotra, Megha Khanna

Abstract:

The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. A dataset with very few instances of the minority outcome categories leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of the classes may be change-prone whereas the majority may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics while dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
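
A short sketch of one such alternative, assuming the imbalanced-learn library: SMOTE oversampling kept inside the cross-validation folds, evaluated with metrics suited to imbalance. The data is synthetic, and MetaCost-style cost-sensitive learning is not shown.

```python
# Hedged sketch: SMOTE resampling inside CV folds with imbalance-aware metrics.
# Synthetic data stands in for the Android change data used in the study.
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import make_pipeline        # keeps resampling inside each fold
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

pipe = make_pipeline(SMOTE(random_state=0), RandomForestClassifier(random_state=0))
scores = cross_validate(pipe, X, y, cv=5,
                        scoring=["balanced_accuracy", "roc_auc", "f1"])
for metric in ["test_balanced_accuracy", "test_roc_auc", "test_f1"]:
    print(metric, round(scores[metric].mean(), 3))
```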

Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics

Procedia PDF Downloads 418
797 Facile Synthesis and Structure Characterization of Europium (III) Tungstate Nanoparticles

Authors: Mehdi Rahimi-Nasrabadi, Seied Mahdi Pourmortazavi

Abstract:

Taguchi robust design, as a statistical method, was applied to optimize the process parameters in order to achieve a tunable, simple and fast synthesis of europium (III) tungstate nanoparticles. Europium (III) tungstate nanoparticles were synthesized by a chemical precipitation reaction involving direct addition of a europium ion aqueous solution to the tungstate reagent dissolved in aqueous media. The effects of some synthesis variables, i.e., europium and tungstate concentrations, flow rate of cation reagent addition, and reactor temperature, on the particle size of europium (III) tungstate nanoparticles were studied experimentally in order to tune the particle size. Analysis of variance shows the importance of controlling the tungstate concentration, cation feeding flow rate and temperature for the preparation of europium (III) tungstate nanoparticles by the proposed chemical precipitation reaction. Finally, europium (III) tungstate nanoparticles were synthesized at the optimum conditions of the proposed method, and the morphology and chemical composition of the prepared nano-material were characterized by means of X-ray diffraction, scanning electron microscopy, transmission electron microscopy, FT-IR spectroscopy, and fluorescence.

Keywords: europium (III) tungstate, nano-material, particle size control, procedure optimization

Procedia PDF Downloads 395
796 Simultaneous Quantification of Glycols in New and Recycled Anti-Freeze Liquids by GC-MS

Authors: George Madalin Danila, Mihaiella Cretu, Cristian Puscasu

Abstract:

Glycol-based anti-freeze liquids, commonly composed of ethylene glycol or propylene glycol, have important uses in automotive cooling, but they should be handled with care due to their toxicity; ethylene glycol is highly toxic to humans and animals. A fast, accurate, precise, and robust method was developed for the simultaneous quantification of the 7 most important glycols and their isomers. Glycols were analyzed from diluted sample solutions of coolants using gas chromatography coupled with mass spectrometry in single ion monitoring mode. Results: The method was developed and validated for 7 individual glycols (ethylene glycol, diethylene glycol, triethylene glycol, tetraethylene glycol, propylene glycol, dipropylene glycol and tripropylene glycol). The limits of detection (1-2 μg/mL) and quantification (10 μg/mL) obtained were appropriate. The present method was applied for the determination of glycols in 10 different anti-freeze liquids commercially available on the Romanian market, proving to be reliable. A method that requires only a two-step dilution of anti-freeze samples combined with direct liquid injection GC-MS was thus validated for the simultaneous quantification of 7 glycols (and their isomers) in 10 different types of anti-freeze liquids. The results obtained in the validation procedure proved that the GC-MS method is sensitive and precise for the quantification of glycols.

Keywords: glycols, anti-freeze, gas-chromatography, mass spectrometry, validation, recycle

Procedia PDF Downloads 66
795 A Comparative Study on ANN, ANFIS and SVM Methods for Computing Resonant Frequency of A-Shaped Compact Microstrip Antennas

Authors: Ahmet Kayabasi, Ali Akdagli

Abstract:

In this study, three robust predictive methods, namely artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS) and support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating at the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, based on the method of moments (MoM). The ANN, ANFIS and SVM models for computing the resonant frequency were then built from the simulation data. 124 simulated ACMAs were utilized for training and the remaining 20 ACMAs were used for testing the ANN, ANFIS and SVM models. The performance of the ANN, ANFIS and SVM models was compared in the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training for the ANN, ANFIS and SVM were obtained as 0.457%, 0.399% and 0.600%, respectively. The constructed models were then tested, and APE values of 0.601% for ANN, 0.744% for ANFIS and 0.623% for SVM were achieved. The results obtained here show that the ANN, ANFIS and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
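
A small sketch of the evaluation metric and one of the three models (support vector regression) on synthetic data; the toy frequency proxy, feature ranges and SVR settings are assumptions, not the IE3D simulation data or the paper's model configuration, while the 124/20 split mirrors the abstract.

```python
# Hedged sketch: SVR-based resonant-frequency prediction with the APE metric.
# Synthetic antenna parameters and a crude frequency proxy stand in for IE3D data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def average_percentage_error(f_true, f_pred):
    """APE (%) between simulated and computed resonant frequencies."""
    return 100.0 * np.mean(np.abs(f_true - f_pred) / f_true)

rng = np.random.default_rng(1)
X = rng.uniform(low=[10, 10, 1.0, 2.2], high=[60, 60, 3.2, 4.4], size=(144, 4))
f = 30.0 / (np.sqrt(X[:, 3]) * (X[:, 0] + X[:, 1]) / 100.0)   # crude physical-looking proxy

X_train, X_test = X[:124], X[124:]
f_train, f_test = f[:124], f[124:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X_train, f_train)
print("test APE: %.3f%%" % average_percentage_error(f_test, model.predict(X_test)))
```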

Keywords: a-shaped compact microstrip antenna, artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), support vector machine (SVM)

Procedia PDF Downloads 441
794 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications related to human experiments are frequently limited by small sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in any data science analysis process. One of the naive approaches usually applied by data scientists consists of transforming the entire database before the resampling phase. However, this can cause the model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms, and it should be applied after the resampling phase to avoid data leakage and improve results.
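
A minimal sketch of the leakage-safe ordering argued for above, assuming scikit-learn and synthetic stand-in features: the scaler is fitted inside each training fold via a Pipeline, rather than on the whole dataset before splitting.

```python
# Hedged sketch: comparing a leaky preprocessing order with a leakage-free Pipeline.
# Synthetic data stands in for SSVEP features; sizes are illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=60, n_features=30, random_state=0)  # small BCI-like sample

# Leaky: scaling fitted on all data, so test folds influence the transform.
X_leaky = StandardScaler().fit_transform(X)
leaky = cross_val_score(LogisticRegression(max_iter=1000), X_leaky, y, cv=5).mean()

# Leakage-free: the scaler is refitted on each training fold only.
safe_pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
safe = cross_val_score(safe_pipe, X, y, cv=5).mean()

print(f"leaky CV accuracy:  {leaky:.3f}")
print(f"proper CV accuracy: {safe:.3f}")  # typically lower, but honest
```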

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 153
793 A Holistic Conceptual Measurement Framework for Assessing the Effectiveness and Viability of an Academic Program

Authors: Munir Majdalawieh, Adam Marks

Abstract:

In today’s very competitive higher education industry, higher education institutions (HEIs) are faced with the primary concern of developing, deploying, and sustaining high-quality academic programs. Today, HEIs have well-established accreditation systems endorsed by a country’s legislation and institutions. The accreditation system is an educational pathway focused on the criteria and processes for evaluating educational programs. Although many aspects of the accreditation process highlight both the past and the present (prove), the “program review” assessment is a forward-looking assessment (improve) and thus transforms the process into a continuing assessment activity rather than a periodic event. The purpose of this study is to propose a conceptual measurement framework for program review to be used by HEIs to undertake a robust and targeted approach to proactively and continuously review their academic programs, to evaluate their practicality and effectiveness, and to improve the education of the students. The proposed framework consists of two main components: program review principles and the program review measurement matrix.

Keywords: academic program, program review principles, curriculum development, accreditation, evaluation, assessment, review measurement matrix, program review process, information technologies supporting learning, learning/teaching methodologies and assessment

Procedia PDF Downloads 238
792 Genetic-Environment Influences on the Cognitive Abilities of 6-to-8 Years Old Twins

Authors: Annu Panghal, Bimla Dhanda

Abstract:

This research paper aims to determine the genetic and environmental influences on the cognitive abilities of twins. Using 100 pairs of twins from two districts of Haryana State, namely Bhiwani (N = 90) and Hisar (N = 110), genetic and environmental influences were assessed in a twin study design. The cognitive abilities of the twins were measured using the Wechsler Intelligence Scale for Children (WISC-R). The Home Observation for Measurement of the Environment (HOME) Inventory was used to examine the home environment of the twins. Heritability estimates were used to quantify the genetic contribution to the twins' cognitive abilities. The heritability estimates for the cognitive abilities of 6-to-7-year-old twins were 74% in Hisar district and 76% in Bhiwani district; in the 7-to-8-year age group they were 64% in Hisar district and 60% in Bhiwani district. The remaining variation in the cognitive abilities of the twins was due to environmental factors, namely provision for active stimulation, paternal involvement and a safe physical environment. The findings provide robust evidence that cognitive abilities were more influenced by genes than by environmental factors, and that the genetic influence was greater in the 6-to-7-year age group than in the 7-to-8-year age group: as age increases, the genetic influence decreases and the environmental influence increases. Mother's education was strongly associated with the cognitive abilities of the twins.
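
The abstract does not specify the estimator behind the heritability figures; a common approach in classical twin designs is Falconer's formula, sketched below with hypothetical twin correlations chosen purely so the result reproduces the 74% figure quoted above.

```python
# Hedged sketch: Falconer's formula for twin-based heritability, h^2 = 2(r_MZ - r_DZ).
# The correlations are illustrative placeholders, not values from the study.
def falconer_heritability(r_mz, r_dz):
    """Variance components from monozygotic and dizygotic twin correlations."""
    h2 = 2.0 * (r_mz - r_dz)   # genetic contribution
    c2 = 2.0 * r_dz - r_mz     # shared-environment contribution
    e2 = 1.0 - r_mz            # unique-environment contribution
    return h2, c2, e2

h2, c2, e2 = falconer_heritability(r_mz=0.85, r_dz=0.48)   # h2 = 0.74 as an example
print(f"h2={h2:.2f}, c2={c2:.2f}, e2={e2:.2f}")
```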

Keywords: genetics, heritability, twins, environment, cognitive abilities

Procedia PDF Downloads 139
791 An Analysis of the Efficacy of Criminal Sanctions in Combating Cartel Conduct: The Case of South Africa

Authors: S. Tavuyanago

Abstract:

Cartels within the international competition law framework have been dubbed the most egregious of competition law violations; this is because they entail a concerted effort by two or more competitor firms to knowingly ‘rob’ consumers of their welfare through their cooperation instead of competition. The net effect of cartel conduct is that the market is distorted as the colluding firms gain enough market power to constrain the supply of goods or services, ultimately driving up prices. As a result, consumers end up paying inflated prices for goods and services, which eventually affects their welfare. It is against this backdrop that competition authorities worldwide have mounted a robust fight against the proliferation of cartels. In South Africa, the fight against cartels saw an amendment to the Competition Act to allow for criminal prosecution of individuals who cause their firms to take part in cartels. The Competition Amendment Act 1 of 2009 introduced section 73A into the principal Competition Act, making it a criminal offence to engage in cartel conduct. This paper assesses the rationale for criminalisation of cartel conduct, discusses the challenges or potential challenges associated with criminalisation, and provides an evaluation of the efficacy of criminalisation of cartel conduct. It questions whether criminal sanctions for cartel conduct as a competition enforcement tool aimed at deterring such conduct are generally effective and whether they have been effective in South Africa specifically. It concludes by offering recommendations on how to effectively root out cartels.

Keywords: cartels, criminalisation, competition, deterrence, South Africa

Procedia PDF Downloads 99
790 Environment-Specific Political Risk Discourse, Environmental Reputation, and Stock Price Crash Risk

Authors: Sohanur Rahman, Elisabeth Sinnewe, Larelle (Ellie) Chapple, Sarah Osborne

Abstract:

Greater political attention to global climate change exposes firms to a higher level of political uncertainty, which can lead to adverse capital market consequences. However, a higher level of discourse on environment-specific political risk (EPR) between management and investors can mitigate information asymmetry, followed by less stock price crash risk. This study examines whether EPR discourse in earnings conference calls (ECCs) reduces firm-level stock price crash risk in the US market. This research also explores whether adverse disclosures via media channels moderate the association between EPR discourse and crash risk. Employing a dataset of 28,933 firm-year observations from 2002 to 2020, the empirical analysis reveals that EPR discourse in ECCs reduces future stock price crash risk. However, adverse disclosures via media channels can offset the favourable effect of EPR discourse on crash risk. The results are robust to potential endogeneity concerns in a quasi-natural experiment setting.

Keywords: earnings conference calls, environment, environment-specific political risk discourse, environmental disclosures, information asymmetry, reputation risk, stock price crash risk

Procedia PDF Downloads 140
789 Selective and Highly Sensitive Measurement of ¹⁵NH₃ Using Photoacoustic Spectroscopy for Environmental Applications

Authors: Emily Awuor, Helga Huszar, Zoltan Bozoki

Abstract:

Isotope analysis has found numerous applications in environmental science, the most common being the tracing of environmental contaminants on both regional and global scales. Many environmental contaminants contain ammonia (NH₃), since it is the most abundant alkaline gas in the atmosphere and its largest sources are agricultural and industrial activities. NH₃ isotopes (¹⁴NH₃ and ¹⁵NH₃) are therefore important and can be used in traceability studies of these atmospheric pollutants. The goal of the project is the construction of a photoacoustic spectroscopy system capable of selectively measuring the concentration of ¹⁵NH₃. A further objective is for the system to be robust, easy to use, and automated. This is achieved by using two telecommunication-type near-infrared distributed feedback (DFB) diode lasers and a laser coupler as the light source in the photoacoustic measurement system. The central wavelength of the lasers in use was 1532 nm, with a tuning range of ±1 nm. In this range, strong absorption lines can be found for both ¹⁴NH₃ and ¹⁵NH₃. For the selective measurement of ¹⁵NH₃, wavelengths were chosen where the cross effect of ¹⁴NH₃ and water vapor is negligible. We completed the calibration of the photoacoustic system, and as a result, the lowest detectable concentration was 3.32 ppm (3σ) for ¹⁵NH₃ and 0.44 ppm (3σ) for ¹⁴NH₃. The results are most useful in environmental pollution measurement and analysis.

Keywords: ammonia isotope, near-infrared DFB diode laser, photoacoustic spectroscopy, environmental monitoring

Procedia PDF Downloads 148
788 A Machine Learning Pipeline for Real-Time Activity Detection on Low Computational Power Devices for Metaverse Applications

Authors: Amit Kumar, Amanpreet Chander, Ashish Sahani

Abstract:

This paper presents our recent work on real-time human activity detection based on the MediaPipe pipeline and machine learning algorithms. The proposed system can detect human activities, including running, jumping, squatting, bending to the left or right, and standing still. This is a robust solution for developing yoga, dance, metaverse, and fitness applications that check for the correctness of a pose without requiring any additional monitoring, such as a personal trainer. The MediaPipe solution offers an open-source, cross-platform framework that utilizes a two-step detector-tracker ML pipeline for live detection of key landmarks on the body, which can be used for motion data collection. The prediction of real-time poses uses a variety of machine learning techniques and different types of analysis. Without relying primarily on powerful desktop environments for inference, our method achieves real-time performance on the majority of contemporary mobile phones, desktops/laptops, in Python, or even on the web. Experimental results show that our method outperforms existing methods in terms of accuracy and real-time capability, achieving an accuracy of 99.92% on the testing datasets.
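
A minimal sketch of MediaPipe Pose landmark extraction with a toy rule for a single activity (squatting vs. standing) based on knee angle; the webcam source, threshold and rule-based label are illustrative assumptions, and the paper's actual classifier and full activity set are not reproduced.

```python
# Hedged sketch: MediaPipe Pose landmarks + a toy knee-angle rule for one activity.
import cv2
import mediapipe as mp
import numpy as np

mp_pose = mp.solutions.pose

def knee_angle(lm, hip, knee, ankle):
    """Angle (degrees) at the knee from three pose landmarks."""
    a = np.array([lm[hip].x, lm[hip].y])
    b = np.array([lm[knee].x, lm[knee].y])
    c = np.array([lm[ankle].x, lm[ankle].y])
    cosang = np.dot(a - b, c - b) / (np.linalg.norm(a - b) * np.linalg.norm(c - b))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

cap = cv2.VideoCapture(0)
with mp_pose.Pose(model_complexity=0) as pose:        # lightweight model for mobile-class CPUs
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            angle = knee_angle(lm, mp_pose.PoseLandmark.LEFT_HIP,
                               mp_pose.PoseLandmark.LEFT_KNEE,
                               mp_pose.PoseLandmark.LEFT_ANKLE)
            label = "squatting" if angle < 120 else "standing"   # illustrative threshold
            cv2.putText(frame, label, (20, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("activity", frame)
        if cv2.waitKey(1) & 0xFF == 27:                # Esc to quit
            break
cap.release()
```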

Keywords: human activity detection, media pipe, machine learning, metaverse applications

Procedia PDF Downloads 179
787 A Group Setting of IED in Microgrid Protection Management System

Authors: Jyh-Cherng Gu, Ming-Ta Yang, Chao-Fong Yan, Hsin-Yung Chung, Yung-Ruei Chang, Yih-Der Lee, Chen-Min Chan, Chia-Hao Hsu

Abstract:

There are a number of distributed generations (DGs) installed in a microgrid, which may create diverse paths and directions of power flow or fault current. The overcurrent protection scheme for the traditional radial-type distribution system will no longer meet the needs of microgrid protection. Integrating intelligent electronic devices (IEDs) and a supervisory control and data acquisition (SCADA) system with the IEC 61850 communication protocol, this paper proposes a microgrid protection management system (MPMS) to protect the power system from faults. In the proposed method, the MPMS performs logic programming of each IED to coordinate their tripping sequence. The GOOSE message defined in IEC 61850 is used as the transmission medium among the IEDs. Moreover, to cope with the difference in microgrid fault current between grid-connected mode and islanded mode, the proposed MPMS applies the group setting feature of the IEDs to protect the system with robust adaptability. Once the microgrid topology varies, the MPMS recalculates the fault current and updates the group settings of the IEDs. Should a fault occur, the IEDs will isolate it at once. Finally, the Matlab/Simulink and Elipse Power Studio software are used to simulate and demonstrate the feasibility of the proposed method.

Keywords: IEC 61850, IED, group setting, microgrid

Procedia PDF Downloads 463
786 Cloud Support for Scientific Workflow Execution: Prototyping Solutions for Remote Sensing Applications

Authors: Sofiane Bendoukha, Daniel Moldt, Hayat Bendoukha

Abstract:

Workflow concepts are essential for the development of remote sensing applications. They can help users manage and process satellite data and execute scientific experiments on distributed resources. The objective of this paper is to introduce an approach for the specification and execution of complex scientific workflows in Cloud-like environments. The approach strives to support scientists during the modeling, deployment and monitoring of their workflows. This work takes advantage of Petri nets, and more specifically the so-called reference nets formalism, which provides a robust modeling/implementation technique. RENEWGRASS is a tool that we implemented and integrated into the Petri nets editor and simulator RENEW. It provides an easy way to support inexperienced scientists during the specification of their workflows. It allows both the modeling and the enactment of image processing workflows from the remote sensing domain. Our case study is related to the implementation of vegetation indices; we have implemented the Normalized Difference Vegetation Index (NDVI) workflow. Additionally, we explore the integration possibilities of Cloud technology as a supplementary layer for the deployment of the current implementation. For this purpose, we discuss migration patterns for data and applications and propose an architecture.
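
A minimal sketch of the NDVI computation that the workflow implements, assuming plain NumPy on toy reflectance arrays; the band values are placeholders, and neither GRASS GIS nor the Petri-net workflow itself is invoked here.

```python
# Hedged sketch: NDVI = (NIR - Red) / (NIR + Red) on toy reflectance rasters.
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from near-infrared and red bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

nir_band = np.array([[0.45, 0.50], [0.60, 0.30]])   # placeholder satellite bands
red_band = np.array([[0.10, 0.20], [0.15, 0.25]])
print(ndvi(nir_band, red_band))                      # values near +1 indicate dense vegetation
```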

Keywords: cloud computing, scientific workflows, petri nets, RENEWGRASS

Procedia PDF Downloads 447
785 Experimental Evaluation of Stand Alone Solar Driven Membrane Distillation System

Authors: Mejbri Sami, Zhani Khalifa, Zarzoum Kamel, Ben Bacha Habib, Koschikowski Joachim, Pfeifle Daniel

Abstract:

Many places worldwide, especially arid and semi-arid remote regions, are suffering from a lack of drinkable water, and the situation will be aggravated in the near future. Furthermore, remote areas are characterised by a lack of conventional energy sources, skilled personnel and maintenance facilities. Therefore, the development of small- to medium-size, stand-alone and robust solar desalination systems is needed to provide an independent fresh water supply in remote areas. This paper is focused on experimental studies of a compact membrane distillation (MD) solar desalination prototype located at the Mechanical Engineering Department site, Kairouan University, Kairouan, Tunisia. The pilot system was designed and manufactured as part of a research and development project funded by the MESRS/BMBF. The pilot system is totally autonomous: the electrical energy required to operate the unit is generated by a field of 4 m² of photovoltaic panels, and the heating of the feed water is provided by a field of 6 m² of solar collectors. The performance of the Kairouan plant over the first few months of operation is presented. The highest freshwater production, 150 L/d, was obtained on a sunny day in July with a solar input of 633 W/m²d.

Keywords: experimental, membrane distillation, solar desalination, permeate gap

Procedia PDF Downloads 136
784 Religion and Sustainable Development: A Comparative Study of Buddhist and Christian Farmers’ Contribution to the Environmental Protection in Taiwan

Authors: Jijimon Alakkalam Joseph

Abstract:

The UN 2030 Agenda for Sustainable Development claims to be a comprehensive and integrated plan of action for prosperity for people and the planet, including almost all dimensions of human existence. Nevertheless, critics have pointed out the exclusion of the religious dimension from development discussions. Care for the earth is one of the vital aspects of sustainable development. Farmers all over the world contribute much to environmental protection. Most farmers are religious believers, and religious ideologies influence their agricultural practices. This nexus between faith and agriculture has forced policymakers to include religion in development discussions. This paper delves deeper into this religion and sustainable development connection. Buddhism and Christianity have contributed much to environmental protection in Taiwan. However, interviews conducted among 40 Taiwanese farmers (10 male and female farmers from Buddhism and Christianity) show that their faith experiences make them relate to the natural environment differently. Most of the Buddhist farmers interviewed admitted that they chose their religious adherence, while most of the Christian farmers inherited their faith. The in-depth analysis of the interview data collected underlines the close relationship between religion and sustainable development. More importantly, concerning their intention to care for the earth, farmers whose religious adherence is ‘chosen’ are self-motivated and more robust compared to those whose religious adherence is ‘inherited’.

Keywords: Buddhism, Christianity, environmental protection, sustainable development

Procedia PDF Downloads 84
783 The Contribution of Community Involvement in Heritage Management

Authors: Esraa Alhadad

Abstract:

Recently, there has been considerable debate surrounding the definition, conservation, and management of heritage. Over the past few years, there has been a growing call for the inclusion of local communities in heritage management. However, the perspectives on involvement, especially concerning key stakeholders like community members, often diverge significantly. While the theoretical foundation for community involvement is reasonably established, the application of this approach in heritage management has been sluggish. Achieving a balance to fulfill the diverse goals of stakeholders in any involvement project proves challenging in practice. Consequently, there is a dearth of empirical studies exploring the practical implications of effective tools in heritage management, and limited indication exists to persuade current authorities, such as governmental organizations, to share their influence with local community members. This research project delves into community involvement within heritage management as a potent means of constructing a robust management framework. Its objective is to assess both the extent and caliber of involvement within the management of heritage sites overall, utilizing a cultural mapping-centered methodology. The findings of this study underscore the significance of engaging the local community in both heritage management and planning endeavors. Ultimately, this investigation furnishes crucial empirical evidence and extrapolates valuable theoretical and practical insights that advance understanding of cultural mapping in pivotal areas, including the catalysts for involvement and collaborative decision-making processes.

Keywords: community involvement, heritage management, cultural mapping, stakeholder management

Procedia PDF Downloads 131
782 Differential Expression of Arc in the Mesocorticolimbic System Is Involved in Drug and Natural Rewarding Behavior in Rats

Authors: Yuhua Wang, Mu Li, Jinggen Liu

Abstract:

Aim: To investigate the different effects of heroin and milk in activating the corticostriatal system that plays a critical role in reward reinforcement learning. Methods: Male SD rats were trained daily for 15 d to self-administer heroin or milk tablets in a classic runway drug self-administration model. Immunohistochemical assay was used to quantify Arc protein expression in the medial prefrontal cortex (mPFC), the nucleus accumbens (NAc), the dorsomedial striatum (DMS) and the ventrolateral striatum (VLS) in response to chronic self-administration of heroin or milk tablets. NMDA receptor antagonist MK801 (0.1 mg/kg) or dopamine D1 receptor antagonist SCH23390 (0.03 mg/kg) were intravenously injected at the same time as heroin was infused intravenously. Results: Runway training with heroin resulted in robust enhancement of Arc expression in the mPFC, the NAc and the DMS on d 1, 7, and 15, and in the VLS on d 1 and d 7. However, runway training with milk led to increased Arc expression in the mPFC, the NAc and the DMS only on d 7 and/or d 15 but not on d 1. Moreover, runway training with milk failed to induce increased Arc protein in the VLS. Both heroin-seeking behavior and Arc protein expression were blocked by MK801 or SCH23390 administration. Conclusion: The VLS is likely to be critically involved in drug-seeking behavior. The NMDA and D1 receptor-dependent Arc expression is important in drug-seeking behavior.

Keywords: arc, mesocorticolimbic system, drug rewarding behavior, NMDA receptor

Procedia PDF Downloads 391
781 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered as a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for future development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.

Keywords: rule induction, decision table, missing data, noise

Procedia PDF Downloads 396
780 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models

Authors: Bipasha Sen, Aditya Agarwal

Abstract:

A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages sharing a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), limiting computational parallelization during training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages for building a robust multilingual system. Complex architectural choices based on self-attention networks are often made to improve parallelization and thereby reduce training time. In this work, we propose Reed, a simple system based on 1D convolutions that uses a very short context to improve training time. To improve the performance of our system, we use raw time-domain speech signals directly as input. This enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report improvements in training and inference times by at least a factor of 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
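
A toy sketch in the spirit described above: a short-context stack of 1D convolutions mapping raw waveform samples to per-frame phone logits. The layer sizes, strides and phone-set size are illustrative assumptions; the actual Reed architecture and training recipe are not reproduced.

```python
# Hedged sketch: a small 1D-convolutional acoustic model over raw audio samples.
import torch
import torch.nn as nn

class TinyConvAcousticModel(nn.Module):
    """Hypothetical short-context conv stack; not the published Reed model."""
    def __init__(self, n_phones=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=25, stride=10), nn.ReLU(),   # learn features from raw audio
            nn.Conv1d(64, 128, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(128, n_phones, kernel_size=1),                   # per-frame phone logits
        )

    def forward(self, waveform):                    # waveform: (batch, 1, samples)
        return self.net(waveform).transpose(1, 2)   # (batch, frames, n_phones) for a CTC-style loss

model = TinyConvAcousticModel()
x = torch.randn(4, 1, 16000)                        # 1 second of 16 kHz audio per utterance
print(model(x).shape)                               # e.g. torch.Size([4, frames, 100])
```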

Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition

Procedia PDF Downloads 123
779 Catastrophic Health Expenditures: Evaluating the Effectiveness of Nepal's National Health Insurance Program Using Propensity Score Matching and Doubly Robust Methodology

Authors: Simrin Kafle, Ulrika Enemark

Abstract:

Catastrophic health expenditure (CHE) is a critical issue in low- and middle-income countries like Nepal, exacerbating financial hardship among vulnerable households. This study assesses the effectiveness of Nepal’s National Health Insurance Program (NHIP), launched in 2015, to reduce out-of-pocket (OOP) healthcare costs and mitigate CHE. Conducted in Pokhara Metropolitan City, the study used an analytical cross-sectional design, sampling 1276 households through a two-stage random sampling method. Data was collected via face-to-face interviews between May and October 2023. The analysis was conducted using SPSS version 29, incorporating propensity score matching to minimize biases and create comparable groups of enrolled and non-enrolled households in the NHIP. PSM helped reduce confounding effects by matching households with similar baseline characteristics. Additionally, a doubly robust methodology was employed, combining propensity score adjustment with regression modeling to enhance the reliability of the results. This comprehensive approach ensured a more accurate estimation of the impact of NHIP enrollment on CHE. Among the 1276 samples, 534 households (41.8%) were enrolled in NHIP. Of them, 84.3% of households renewed their insurance card, though some cited long waiting times, lack of medications, and complex procedures as barriers to renewal. Approximately 57.3% of households reported known diseases before enrollment, with 49.8% attending routine health check-ups in the past year. The primary motivation for enrollment was encouragement from insurance employees (50.2%). The data indicates that 12.5% of enrolled households experienced CHE versus 7.5% among non-enrolled. Enrollment into NHIP does not contribute to lower CHE (AOR: 1.98, 95% CI: 1.21-3.24). Key factors associated with increased CHE risk were presence of non-communicable diseases (NCDs) (AOR: 3.94, 95% CI: 2.10-7.39), acute illnesses/injuries (AOR: 6.70, 95% CI: 3.97-11.30), larger household size (AOR: 3.09, 95% CI: 1.81-5.28), and households below the poverty line (AOR: 5.82, 95% CI: 3.05-11.09). Other factors such as gender, education level, caste/ethnicity, presence of elderly members, and under-five children also showed varying associations with CHE, though not all were statistically significant. The study concludes that enrollment in the NHIP does not significantly reduce the risk of CHE. The reason for this could be inadequate coverage, where high-cost medicines, treatments, and transportation costs are not fully included in the insurance package, leading to significant out-of-pocket expenses. We also considered the long waiting time, lack of medicines, and complex procedures for the utilization of NHIP benefits, which might result in the underuse of covered services. Finally, gaps in enrollment and retention might leave certain households vulnerable to CHE despite the existence of NHIP. Key factors contributing to increased CHE include NCDs, acute illnesses, larger household sizes, and poverty. To improve the program’s effectiveness, it is recommended that NHIP benefits and coverage be expanded to better protect against high healthcare costs. Additionally, simplifying the renewal process, addressing long waiting times, and enhancing the availability of services could improve member satisfaction and retention. Targeted financial protection measures should be implemented for high-risk groups, and efforts should be made to increase awareness and encourage routine health check-ups to prevent severe health issues that contribute to CHE.
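
The abstract does not state which doubly robust estimator was used; a common form is the augmented inverse-probability-weighted (AIPW) estimator, sketched below with a propensity model and an outcome model. The data frame, column names and covariate list are hypothetical placeholders, not the study's variables.

```python
# Hedged sketch: AIPW (doubly robust) estimate of an enrollment effect on a binary outcome.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def aipw_effect(df, treatment, outcome, covariates):
    X = df[covariates].to_numpy()
    t = df[treatment].to_numpy()
    y = df[outcome].to_numpy()

    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]   # propensity scores
    ps = np.clip(ps, 0.01, 0.99)                                              # trim extreme weights

    m1 = LogisticRegression(max_iter=1000).fit(X[t == 1], y[t == 1]).predict_proba(X)[:, 1]
    m0 = LogisticRegression(max_iter=1000).fit(X[t == 0], y[t == 0]).predict_proba(X)[:, 1]

    # Augmented inverse-probability-weighted estimator of the average treatment effect.
    return (np.mean(t * (y - m1) / ps + m1)
            - np.mean((1 - t) * (y - m0) / (1 - ps) + m0))

# Synthetic placeholder data with hypothetical column names.
rng = np.random.default_rng(0)
df = pd.DataFrame({"hh_size": rng.integers(1, 9, 800),
                   "poverty": rng.integers(0, 2, 800),
                   "ncd": rng.integers(0, 2, 800),
                   "enrolled": rng.integers(0, 2, 800),
                   "che": rng.integers(0, 2, 800)})
print(aipw_effect(df, "enrolled", "che", ["hh_size", "poverty", "ncd"]))
```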

Keywords: catastrophic health expenditure, effectiveness, national health insurance program, Nepal

Procedia PDF Downloads 25
778 Ab Initio Study of Co2ZrGe and Co2NbB Full Heusler Compounds

Authors: A. Abada, S. Hiadsi, T. Ouahrani, B. Amrani, K. Amara

Abstract:

Using the first-principles full-potential linearized augmented plane wave plus local orbitals (FP-LAPW+lo) method based on density functional theory (DFT), we have investigated the electronic structure and magnetism of some Co2-based full Heusler alloys, namely Co2ZrGe and Co2NbB. The calculations show that these compounds are half-metallic ferromagnets (HMFs) with a total magnetic moment of 2.000 µB per formula unit, well consistent with the Slater-Pauling rule. Our calculations show indirect band gaps of 0.58 eV and 0.47 eV in the minority-spin channel of the density of states (DOS) for Co2ZrGe and Co2NbB, respectively. Analysis of the DOS and magnetic moments indicates that their magnetism is mainly related to the d-d hybridization between the Co and Zr (or Nb) atoms. The half-metallicity is found to be robust against volume changes, and the two alloys keep 100% spin polarization at the Fermi level. In addition, the atoms-in-molecules (AIM) formalism and the electron localization function (ELF) were also adopted to study the bonding properties of these compounds, building a bridge between their electronic and bonding behavior. As they have good crystallographic compatibility with the lattices of industrially used semiconductors, and negative calculated cohesive energies with considerable absolute values, these two alloys could be promising magnetic materials in the spintronics field.
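
As a worked check of the quoted moment (using standard valence-electron counts, not values taken from the paper), the generalized Slater-Pauling rule for full Heusler compounds X2YZ relates the total spin moment per formula unit to the total valence electron count:

```latex
% Generalized Slater-Pauling rule for full Heusler compounds X2YZ,
% with Z_t the number of valence electrons per formula unit:
M_t = Z_t - 24\,\mu_B
% Co2ZrGe: Z_t = 2(9) + 4 + 4 = 26  =>  M_t = 2\,\mu_B
% Co2NbB:  Z_t = 2(9) + 5 + 3 = 26  =>  M_t = 2\,\mu_B
```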

Keywords: half-metallic ferromagnets, full Heusler alloys, magnetic properties, electronic properties

Procedia PDF Downloads 413
777 Green Supply Chain Management: A Revolutionary and Robust Innovation in the Field of Efficient Environmental Development and Regulation

Authors: Jinesh Kumar Jain, Faishal Pathan

Abstract:

The concepts of sustainable development and effective environmental regulation have led to the emergence of a new field of study and practice: Green Supply Chain Management (GSCM). GSCM has become a subject of great importance for both developed and developing countries in achieving the desired goals of the firm within an environmental and sustainable framework. Its merits comprise good financial payoff and competitiveness for firms in a long-lasting and sustainable manner. The purpose of this paper is to briefly review the recent literature on GSCM and to identify new research directions in this emerging field. A detailed study has helped to bring out the finer details and develop the research direction of the study. GSCM has gained popularity with both academics and practitioners. The items for the study were developed based on the extant literature. We found that the state of adoption of GSCM practices by Indian firms was still in its infancy, the awareness of environmental sustainability was quite low among consumers, and the regulatory frameworks were also lacking in terms of promoting environmental sustainability. The present paper attempts to draw attention to the above-mentioned issues and presents a conclusive summary to make the use of GSCM widespread and far-reaching.

Keywords: environmental management, environmental performance, financial performance, green supply chain management

Procedia PDF Downloads 215
776 Fused Structure and Texture (FST) Features for Improved Pedestrian Detection

Authors: Hussin K. Ragb, Vijayan K. Asari

Abstract:

In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of local phase information with texture features. Since the phase of a signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless quantity of phase congruency and the robustness of the CSLBP operator on flat images, as well as under blur and illumination changes, lead the proposed descriptor to be more robust and less sensitive to light variations. The proposed descriptor is formed by extracting the phase congruency and CSLBP values of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for local regions in the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and low-resolution DaimlerChrysler datasets to evaluate the detection performance of the pedestrian detection system based on the FST descriptor. A linear Support Vector Machine (SVM) is used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
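
A small sketch of the CSLBP part of the descriptor, assuming NumPy on a toy grayscale patch: each pixel's 8-neighbourhood is reduced to a 4-bit code by comparing diametrically opposite neighbour pairs against a small threshold. The threshold and patch are illustrative, and the phase-congruency, histogram-concatenation and SVM stages are not reproduced here.

```python
# Hedged sketch: Center-Symmetric LBP codes (0..15) for interior pixels of an image.
import numpy as np

def cslbp(image, threshold=0.01):
    """CSLBP: compare the four opposite neighbour pairs around each interior pixel."""
    img = image.astype(np.float64)
    pairs = [
        (img[:-2, :-2],  img[2:, 2:]),     # top-left  vs bottom-right
        (img[:-2, 1:-1], img[2:, 1:-1]),   # top       vs bottom
        (img[:-2, 2:],   img[2:, :-2]),    # top-right vs bottom-left
        (img[1:-1, 2:],  img[1:-1, :-2]),  # right     vs left
    ]
    codes = np.zeros((img.shape[0] - 2, img.shape[1] - 2), dtype=np.int32)
    for bit, (a, b) in enumerate(pairs):
        codes += ((a - b) > threshold).astype(np.int32) * (1 << bit)
    return codes

patch = np.random.rand(8, 8)                                   # toy grayscale patch in [0, 1]
hist, _ = np.histogram(cslbp(patch), bins=16, range=(0, 16))   # 16-bin local histogram
print(hist)
```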

Keywords: pedestrian detection, phase congruency, local phase, LBP features, CSLBP features, FST descriptor

Procedia PDF Downloads 488
775 Design of Multi-Loop Controller for Minimization of Energy Consumption in the Distillation Column

Authors: Vinayambika S. Bhat, S. Shanmuga Priya, I. Thirunavukkarasu, Shreeranga Bhat

Abstract:

An attempt has been made to design a decoupling controller for systems with multiple inputs and multiple outputs with dead time. The decoupler is designed for a 3×3 chemical-process-industry plant transfer function with dead time. A Quantitative Feedback Theory (QFT) based controller has also been designed here for the 2×2 distillation column transfer function. The developed control techniques were simulated using MATLAB/Simulink. The stability of the process was also analyzed, together with the presence of various perturbations in it. Time-domain specifications such as settling time, along with overshoot and oscillations, were analyzed to prove the efficiency of the decoupler method. Load disturbance rejection was tested along with its performance. The QFT control technique was synthesized based on the stability and performance specifications, in the presence of uncertainty in the time constant of the plant transfer function, through a sequential loop shaping technique. Further, the energy efficiency of the distillation column was improved by proper tuning of the controller. A distillation column consumes 3% of the total energy consumption of the world; a suitable control technique is therefore very important from an economic point of view. The real-time implementation of the process is under way in our laboratory.

Keywords: distillation, energy, MIMO process, time delay, robust stability

Procedia PDF Downloads 414