Search results for: robust estimators
894 The Assessment of Some Biological Parameters With Dynamic Energy Budget of Mussels in Agadir Bay
Authors: Zahra Okba, Hassan El Ouizgani
Abstract:
Anticipating an individual's response to environmental factors allows for relevant ecological forecasts. The Dynamic Energy Budget (DEB) model facilitates such predictions by mechanistically linking biology to abiotic factors, but it is generally field-verified under relatively stable physical conditions. DEB theory is a robust framework that can link the individual state to environmental factors, and in our work, we have tested its ability to account for variability by examining model predictions in Agadir Bay, which is characterized by a semi-arid climate in which temperature is strongly influenced by the trade wind front and nutritional availability. From previous work in our laboratory, we have collected the biological DEB model parameters of the mussel Mytilus galloprovincialis in Agadir Bay. We mathematically formulated the equations that make up the DEB model and then fitted our analytical functions to the observed biological data of our local species. We first assumed constant immersion and then integrated the details of the tidal cycles to calculate the metabolic depression at low tide. Our results are quite satisfactory both for shell length and shape and for the gonadosomatic index.
Keywords: dynamic energy budget, mussels, Mytilus galloprovincialis, Agadir Bay, DEB model
Procedia PDF Downloads 114
893 Market Illiquidity and Pricing Errors in the Term Structure of CDS
Authors: Lidia Sanchis-Marco, Antonio Rubia, Pedro Serrano
Abstract:
This paper studies the informational content of pricing errors in the term structure of sovereign CDS spreads. The residuals from a no-arbitrage model are employed to construct a price discrepancy estimate, or noise measure. The noise estimate is understood as an indicator of market distress and reflects frictions such as illiquidity. Empirically, the noise measure is computed for an extensive panel of CDS spreads. Our results reveal that an important fraction of systematic risk is not priced in default swap contracts. When projecting the noise measure onto a set of financial variables, the panel-data estimates show that greater price discrepancies are systematically related to a higher level of offsetting transactions of CDS contracts. This evidence suggests that arbitrage capital flows exit the marketplace during times of distress, which is consistent with market segmentation between investors and arbitrageurs in which professional arbitrageurs are particularly ineffective at bringing prices to their fundamental values during turbulent periods. Our empirical findings are robust across the most common CDS pricing models employed in the industry.
Keywords: credit default swaps, noise measure, illiquidity, capital arbitrage
Procedia PDF Downloads 569
892 MSIpred: A Python 2 Package for the Classification of Tumor Microsatellite Instability from Tumor Mutation Annotation Data Using a Support Vector Machine
Authors: Chen Wang, Chun Liang
Abstract:
Microsatellite instability (MSI) is characterized by a high degree of polymorphism in microsatellite (MS) length due to a deficiency in the mismatch repair (MMR) system. MSI is associated with several tumor types, and its status can be considered an important indicator of tumor prognosis. Conventional clinical diagnosis of MSI examines PCR products of a panel of MS markers using electrophoresis (MSI-PCR), which is laborious, time consuming, and less reliable. MSIpred, a Python 2 package for automatic classification of MSI, was released by this study. It computes important somatic mutation features from files in mutation annotation format (MAF) generated from paired tumor-normal exome sequencing data, and subsequently uses these to predict tumor MSI status with a support vector machine (SVM) classifier trained on MAF files of 1074 tumors belonging to four types. Evaluation of MSIpred on an independent 358-tumor test set achieved an overall accuracy of over 98% and an area under the receiver operating characteristic (ROC) curve of 0.967. These results indicate that MSIpred is a robust pan-cancer MSI classification tool and can serve as a complement to MSI-PCR in MSI diagnosis.
Keywords: microsatellite instability, pan-cancer classification, somatic mutation, support vector machine
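A minimal sketch of the kind of SVM-based classification described above is given below; the feature columns, data file, and model settings are illustrative assumptions, not the actual MSIpred feature set or API.

```python
# Hedged sketch: SVM classification of tumor MSI status from somatic-mutation
# features. Feature names and the data file are illustrative placeholders.
import pandas as pd
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

# Hypothetical table: one row per tumor, mutation counts in simple-repeat
# regions, frameshift indel rate, etc., plus an MSI-H / MSS label.
data = pd.read_csv("tumor_mutation_features.csv")
X = data.drop(columns=["msi_status"]).values
y = (data["msi_status"] == "MSI-H").astype(int).values

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# RBF-kernel SVM with feature scaling, standing in for the trained classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
prob = clf.predict_proba(X_test)[:, 1]
print("accuracy:", accuracy_score(y_test, pred))
print("ROC AUC:", roc_auc_score(y_test, prob))
```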
Procedia PDF Downloads 173
891 Functionalized Ultra-Soft Rubber for Soft Robotics Application
Authors: Shib Shankar Banerjee, Andreas Fery, Gert Heinrich, Amit Das
Abstract:
Recently, a growing need for the development of soft robots consisting of highly deformable and compliant materials has emerged from the serious limitations of conventional service robots. However, one of the main challenges of soft robotics is to develop such compliant materials, which facilitate the design of soft robotic structures and, simultaneously, the control of soft-body systems, like soft artificial muscles. Generally, silicone- or acrylic-based elastomer composites are used for soft robotics. However, the mechanical performance and long-term reliability of the functional parts (sensors, actuators, main body) of robots made from these composite materials are inferior. This work will present the development and characterization of robust, super-soft, programmable elastomeric materials from crosslinked natural rubber that can serve as touch and strain sensors for soft robotic arms, with very high elasticity and strain while the modulus is tuned in the kilopascal range. Our results suggest that such soft, programmable natural elastomers can be promising materials and can replace conventional silicone-based elastomers for soft robotics applications.
Keywords: elastomers, soft materials, natural rubber, sensors
Procedia PDF Downloads 155
890 Application of Liquid Chromatographic Method for the in vitro Determination of Gastric and Intestinal Stability of Pure Andrographolide in the Extract of Andrographis paniculata
Authors: Vijay R. Patil, Sathiyanarayanan Lohidasan, K. R. Mahadik
Abstract:
The gastrointestinal stability of andrographolide was evaluated in vitro in simulated gastric (SGF) and intestinal (SIF) fluids using a validated HPLC-PDA method. The method was validated using a 5 μm Thermo Hypersil GOLD C18 column (250 mm × 4.0 mm) and a mobile phase consisting of water:acetonitrile, 70:30 (v/v), delivered isocratically at a flow rate of 1 mL/min with UV detection at 228 nm. Andrographolide in pure form and in the extract of Andrographis paniculata was incubated at 37°C in an incubator shaker in USP simulated gastric and intestinal fluids with and without enzymes. A systematic protocol as per the FDA guidance was followed for the stability study, and samples were assayed at 0, 15, 30 and 60 min for the gastric study and at 0, 15, 30, 60 min and 1, 2 and 3 h for the intestinal stability study. The stability study was also extended to 24 h to observe the degradation pattern in SGF and SIF (with and without enzyme). The developed method was found to be accurate, precise and robust. Andrographolide was found to be stable in SGF (pH ∼ 1.2) for 1 h and in SIF (pH 6.8) for up to 3 h. The relative difference (RD) between the amount of drug added and the amount found at all time points was < 3%. The present study suggests that drug loss in the gastrointestinal tract may take place by membrane permeation rather than by a degradation process.
Keywords: andrographolide, Andrographis paniculata, in vitro, stability, gastric, intestinal, HPLC-PDA
Procedia PDF Downloads 243
889 A Comparative Assessment of Membrane Bioscrubber and Classical Bioscrubber for Biogas Purification
Authors: Ebrahim Tilahun, Erkan Sahinkaya, Barış Çallı
Abstract:
Raw biogas is a valuable renewable energy source; however, it usually requires removal of impurities. The presence of hydrogen sulfide (H2S) in biogas has detrimental corrosion effects on cogeneration units. Removal of H2S from the biogas can therefore significantly improve the biogas quality. In this work, a conventional bioscrubber (CBS) and a dense membrane bioscrubber (DMBS) were comparatively evaluated in terms of H2S removal efficiency (RE), CH4 enrichment and alkaline consumption at gas residence times ranging from 5 to 20 min. Both bioscrubbers were fed with a synthetic biogas containing H2S (1%), CO2 (39%) and CH4 (60%). The results show that a high RE (98%) was obtained in the DMBS when the gas residence time was 20 min, whereas a slightly lower CO2 RE was observed. In the CBS system, the outlet H2S concentration was always lower than 250 ppmv and the H2S RE remained higher than 98% regardless of the gas residence time, although high alkaline consumption and frequent absorbent replacement limited its cost-effectiveness. The results also indicate that in the DMBS, when the gas residence time increased to 20 min, the CH4 content in the treated biogas was enriched up to 80%. In contrast, while operating the CBS unit, the CH4 content of the raw biogas (60%) decreased threefold. The lower CH4 content in the CBS was probably caused by extreme dilution of the biogas with air (N2 and O2). According to the results obtained here, the DMBS system is a robust and effective biotechnology in comparison with the CBS. Hence, the DMBS has better potential for real-scale applications.
Keywords: biogas, bioscrubber, desulfurization, PDMS membrane
Procedia PDF Downloads 226
888 Deep Graph Embeddings for the Analysis of Short Heartbeat Interval Time Series
Authors: Tamas Madl
Abstract:
Sudden cardiac death (SCD) constitutes a large proportion of cardiovascular mortalities, provides little advance warning, and the risk is difficult to recognize based on ubiquitous, low-cost medical equipment such as the standard, 12-lead, ten-second ECG. Autonomic abnormalities have been shown to be strongly predictive of SCD risk; yet current methods are not trivially applicable to the brevity and low temporal and electrical resolution of standard ECGs. Here, we build horizontal visibility graph representations of very short inter-beat interval time series and perform unsupervised representation learning in order to convert these variable-size objects into fixed-length vectors preserving similarity relations. We show that such representations facilitate classification into healthy vs. at-risk patients on two different datasets, the Multiparameter Intelligent Monitoring in Intensive Care II and the PhysioNet Sudden Cardiac Death Holter Database. Our results suggest that graph representation learning of heartbeat interval time series facilitates robust classification even in sequences as short as ten seconds.
Keywords: sudden cardiac death, heart rate variability, ECG analysis, time series classification
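The graph construction underlying these representations can be sketched briefly; the snippet below builds a horizontal visibility graph from a toy inter-beat interval series (the values and the use of networkx are illustrative assumptions, and the unsupervised embedding step is not reproduced).

```python
# Hedged sketch: build a horizontal visibility graph (HVG) from a short
# inter-beat-interval (RR) series.
import numpy as np
import networkx as nx

def horizontal_visibility_graph(series):
    """Nodes are time indices; i and j are linked if every sample strictly
    between them is lower than both series[i] and series[j]."""
    g = nx.Graph()
    n = len(series)
    g.add_nodes_from(range(n))
    for i in range(n - 1):
        for j in range(i + 1, n):
            between = series[i + 1:j]
            if len(between) == 0 or between.max() < min(series[i], series[j]):
                g.add_edge(i, j)
            # Once a sample at least as tall as series[i] blocks the view,
            # no later j can be visible from i.
            if len(between) > 0 and between.max() >= series[i]:
                break
    return g

# Toy ten-second-style RR interval sequence (seconds); values are illustrative.
rr = np.array([0.82, 0.79, 0.85, 0.81, 0.90, 0.78, 0.84, 0.80, 0.88, 0.83])
hvg = horizontal_visibility_graph(rr)
print(hvg.number_of_nodes(), "nodes,", hvg.number_of_edges(), "edges")
```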
Procedia PDF Downloads 234
887 Robust Fractional Order Controllers for Minimum and Non-Minimum Phase Systems – Studies on Design and Development
Authors: Anand Kishore Kola, G. Uday Bhaskar Babu, Kotturi Ajay Kumar
Abstract:
The modern dynamic systems used in industry are complex in nature, and hence fractional order controllers have been contemplated as a fresh approach to control system design that takes this complexity into account. Traditional integer order controllers use integer derivatives and integrals to control systems, whereas fractional order controllers use fractional derivatives and integrals to capture memory and non-local behavior. This study provides a method based on the maximum sensitivity (Ms) methodology to discover all resilient fractional filter Internal Model Control - proportional integral derivative (IMC-PID) controllers that stabilize the closed-loop system and deliver the highest performance for a time delay system with a Smith predictor configuration. Additionally, it helps to enhance the range of PID controllers that can be used to stabilize the system. This study also evaluates the effectiveness of the suggested controller approach for minimum phase systems in comparison to those currently in use, in terms of the Integral of Absolute Error (IAE) and Total Variation (TV).
Keywords: modern dynamic systems, fractional order controllers, maximum-sensitivity, IMC-PID controllers, Smith predictor, IAE and TV
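For reference, the two performance measures named in the comparison can be computed directly from sampled response data; the signals in the sketch below are illustrative only, not results from the paper.

```python
# Hedged sketch: IAE and TV computed from a sampled closed-loop step response.
import numpy as np

def iae(t, error):
    """Integral of Absolute Error, approximated on a uniform time grid."""
    dt = t[1] - t[0]
    return np.sum(np.abs(error)) * dt

def total_variation(u):
    """Total Variation of the controller output: sum of absolute increments."""
    return np.sum(np.abs(np.diff(u)))

# Toy step-response data (setpoint = 1.0) from some closed-loop simulation.
t = np.linspace(0.0, 10.0, 501)
y = 1.0 - np.exp(-t) * np.cos(2.0 * t)   # hypothetical process output
u = 1.0 + 0.5 * np.exp(-t)               # hypothetical controller output

print("IAE =", iae(t, 1.0 - y))
print("TV  =", total_variation(u))
```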
Procedia PDF Downloads 66
886 Regulatory and Economic Challenges of AI Integration in Cyber Insurance
Authors: Shreyas Kumar, Mili Shangari
Abstract:
Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. 
This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.
Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware
Procedia PDF Downloads 33
885 Deep Learning and Accurate Performance Measure Processes for Cyber Attack Detection among Web Logs
Authors: Noureddine Mohtaram, Jeremy Patrix, Jerome Verny
Abstract:
As an enormous number of online services have been developed into web applications, security problems involving web applications are becoming more serious. Most intrusion detection systems rely on each individual request to find a cyber-attack rather than on user behavior, and these systems can only protect web applications against known vulnerabilities rather than certain zero-day attacks. In order to detect new attacks, we analyze the HTTP traffic of web servers to divide requests into two categories: normal traffic and malicious attacks. Moreover, the quality of the results obtained by deep learning (DL) in various areas of big data has given an important motivation to apply it to cybersecurity. Deep learning for attack detection in cybersecurity has the potential to be a robust tool, generalizing from small transformations to new attacks, due to its capability to extract higher-level features. This research takes a new approach, applying deep learning to cybersecurity in order to classify these two categories, eliminate attacks, and protect the web servers of the defense sector, which encounter different web traffic compared to other sectors (such as e-commerce, web apps, etc.). The results show that, with this method, a higher accuracy rate and a lower false alarm rate can be achieved.
Keywords: anomaly detection, HTTP protocol, logs, cyber attack, deep learning
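As a rough illustration of classifying web-server requests into these two categories, the sketch below uses character n-gram features and a small neural network; the sample requests, labels, and model choice are assumptions for illustration only, not the authors' architecture.

```python
# Hedged sketch: binary classification of raw HTTP request lines.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

requests = [
    "GET /index.html HTTP/1.1",
    "GET /products?id=42 HTTP/1.1",
    "GET /search?q=%27%20OR%201=1-- HTTP/1.1",             # SQL-injection-like
    "GET /page?name=<script>alert(1)</script> HTTP/1.1",   # XSS-like
]
labels = [0, 0, 1, 1]  # 0 = normal traffic, 1 = malicious

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
model.fit(requests, labels)
print(model.predict(["GET /search?q=union%20select%20password HTTP/1.1"]))
```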
Procedia PDF Downloads 211
884 Epidemiology, Knowledge, Attitude, and Practices among Patients of Stroke
Authors: Vijay Nandmer, Ajay Nandmer
Abstract:
Stigmatized psycho-social perception poses a serious challenge and source of discrimination which impedes stroke patients from attaining a satisfactory quality of life. The present study aimed to obtain information on the knowledge, attitudes and practices (KAP) of stroke patients in the institute. We included 1000 people in our random sampling survey. Demographic details and responses to a questionnaire assessing knowledge, attitudes and practices were recorded. Although the majority of the patients belonged to low socioeconomic strata, the literacy rate was reasonably high (96.3%). A large majority (91.3%) of people had heard about stroke, and 85.2% knew that stroke can be treated with modern drugs. However, a negative attitude was reflected in the belief that stroke happens due to supernatural powers (hawa lagne se) (50.6%). Analysis of the data revealed regional differences in KAP which could be attributed to local factors, such as literacy, awareness about stroke, and the practice of different systems of medicine. Some of the differences can also be attributed to the category of the study population, that is, whether it included patients or non-stroke individuals, since the former are likely to have less negative attitudes than the general public. There is a need to create awareness about stroke on a nationwide basis to dispel the misconceptions and stigma through effective and robust programs, with the aim of lessening the disease burden.
Keywords: epidemiology, stroke, literacy
Procedia PDF Downloads 389
883 Accuracy Improvement of Traffic Participant Classification Using Millimeter-Wave Radar by Leveraging Simulator Based on Domain Adaptation
Authors: Tokihiko Akita, Seiichi Mita
Abstract:
Millimeter-wave radar is the sensor most robust to adverse environments, making it an essential environment recognition sensor for automated driving. However, its reflection signal is sparse and unstable, so it is difficult to obtain high recognition accuracy. Deep learning provides high recognition accuracy even for such signals, but it requires large-scale datasets with ground truth. In particular, annotation is very costly for millimeter-wave radar. As a solution, utilizing a simulator that can generate a huge annotated dataset is effective. However, radar simulation is more difficult to match to real-world data than camera images, and recognition by deep learning with higher-order features trained on the simulator deviates further. We have attempted to improve the accuracy of traffic participant classification by fusing simulator and real-world data with a domain adaptation technique. Experimental results with the domain adaptation network we created show that classification accuracy can be improved even with only a small amount of real-world data.
Keywords: millimeter-wave radar, object classification, deep learning, simulation, domain adaptation
Procedia PDF Downloads 93
882 ArcGIS as a Tool for Infrastructure Documentation and Asset Management: Establishing a GIS for Computer Network Documentation
Authors: John Segars
Abstract:
Built out of a real-world need for better, more detailed asset and infrastructure documentation, this project will lay out the case for using the database functionality of ArcGIS as a tool to track and maintain infrastructure location, status, maintenance and serviceability. Workflows and processes will be presented and detailed which may be applied to an organization's infrastructure needs, allowing it to make use of the robust tools which surround the ArcGIS platform. The end result is a value-added information system framework with a geographic component, e.g., the spatial location of various I.T. assets, and a detailed set of records which not only document location but also capture the maintenance history of assets, along with photographs and documentation of these assets as attachments to the numerous feature class items. In addition to the asset location and documentation benefits, the staff will be able to log into the devices and pull SNMP (Simple Network Management Protocol)-based query information from within the user interface. The entire collection of information may be displayed in ArcGIS, via a JavaScript-based web application, or via queries to the back-end database. The project is applicable to all organizations which maintain an IT infrastructure, but specifically targets post-secondary educational institutions where access to ESRI resources is generally already available in-house.
Keywords: ESRI, GIS, infrastructure, network documentation, PostgreSQL
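To give a concrete flavor of the back-end-database route mentioned above, here is a minimal sketch that queries a hypothetical PostgreSQL/PostGIS asset table; the table name, columns, and connection details are illustrative assumptions, not the project's actual schema.

```python
# Hedged sketch: querying a hypothetical back-end PostgreSQL/PostGIS table of
# I.T. assets. Schema and connection details are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="gis-db.example.edu", dbname="assets", user="gis", password="secret")

with conn, conn.cursor() as cur:
    # Pull each network device's location, status, and last maintenance date.
    cur.execute(
        """
        SELECT asset_tag, hostname, status, last_maintained,
               ST_AsText(geom) AS location
        FROM network_assets
        WHERE status = %s
        ORDER BY last_maintained;
        """,
        ("in_service",),
    )
    for asset_tag, hostname, status, last_maintained, location in cur.fetchall():
        print(asset_tag, hostname, status, last_maintained, location)

conn.close()
```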
Procedia PDF Downloads 181
881 Multi-Objective Electric Vehicle Charge Coordination for Economic Network Management under Uncertainty
Authors: Ridoy Das, Myriam Neaimeh, Yue Wang, Ghanim Putrus
Abstract:
Electric vehicles are a popular transportation medium renowned for potential environmental benefits. However, large and uncontrolled charging volumes can impact distribution networks negatively. Smart charging is widely recognized as an efficient solution to achieve both improved renewable energy integration and grid relief. Nevertheless, different decision-makers may pursue diverse and conflicting objectives. In this context, this paper proposes a multi-objective optimization framework to control electric vehicle charging to achieve both energy cost reduction and peak shaving. A weighted-sum method is developed due to its intuitiveness and efficiency. Monte Carlo simulations are implemented to investigate the impact of uncertain electric vehicle driving patterns and provide decision-makers with a robust outcome in terms of prospective cost and network loading. The results demonstrate that there is a conflict between energy cost efficiency and peak shaving, with the decision-makers needing to make a collaborative decision.
Keywords: electric vehicles, multi-objective optimization, uncertainty, mixed integer linear programming
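A minimal sketch of the weighted-sum idea follows, assuming illustrative prices, base load, and charger limits rather than the paper's data or its full mixed-integer formulation.

```python
# Hedged sketch: trade off charging cost against peak load with one linear program.
import numpy as np
from scipy.optimize import linprog

T = 24                                                      # hourly slots
price = np.array([0.10] * 7 + [0.25] * 12 + [0.15] * 5)     # $/kWh (assumed)
base_load = np.array([3.0] * 7 + [6.0] * 12 + [4.0] * 5)    # kW feeder load (assumed)
e_required = 30.0                                           # kWh the EVs must receive
p_max = 7.0                                                 # kW charger limit per slot
w_cost, w_peak = 1.0, 2.0                                   # weighted-sum coefficients

# Decision variables: x[0..T-1] = charging power per slot (kW), x[T] = peak (kW).
c = np.concatenate([w_cost * price, [w_peak]])

# Peak definition: base_load[t] + x[t] - peak <= 0 for every slot t.
A_ub = np.hstack([np.eye(T), -np.ones((T, 1))])
b_ub = -base_load

# Energy requirement: charging power summed over 1-hour slots equals e_required.
A_eq = np.concatenate([np.ones(T), [0.0]]).reshape(1, -1)
b_eq = [e_required]

bounds = [(0.0, p_max)] * T + [(0.0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("charging schedule (kW):", np.round(res.x[:T], 2))
print("resulting peak (kW):", round(res.x[T], 2))
```

Sweeping the weights (w_cost, w_peak) traces out the trade-off between cost efficiency and peak shaving noted in the abstract.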
Procedia PDF Downloads 179
880 Domain Adaptation Save Lives - Drowning Detection in Swimming Pool Scene Based on YOLOV8 Improved by Gaussian Poisson Generative Adversarial Network Augmentation
Authors: Simiao Ren, En Wei
Abstract:
Drowning is a significant safety issue worldwide, and a robust computer-vision-based alert system can easily prevent such tragedies in swimming pools. However, due to the domain shift caused by the visual gap (potentially due to lighting, indoor scene changes, pool floor color, etc.) between the training swimming pool and the test swimming pool, the robustness of such algorithms has been questionable. The annotation cost of labeling each new swimming pool is too expensive for mass adoption of such a technique. To address this issue, we propose a domain-aware data augmentation pipeline based on the Gaussian Poisson Generative Adversarial Network (GP-GAN). Combined with YOLOv8, we demonstrate that such a domain adaptation technique can significantly improve model performance (from 0.24 mAP to 0.82 mAP) on new test scenes. As the augmentation method only requires background imagery from the new domain (no annotation needed), we believe this is a promising, practical route for preventing swimming pool drowning.
Keywords: computer vision, deep learning, YOLOv8, detection, swimming pool, drowning, domain adaptation, generative adversarial network, GAN, GP-GAN
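For the detector side of such a pipeline, fine-tuning YOLOv8 on an augmented dataset could look roughly like the sketch below; the dataset YAML, hyperparameters, and the assumption that GP-GAN-composited images are already included are illustrative, not the authors' exact setup.

```python
# Hedged sketch: fine-tune and evaluate a YOLOv8 detector on a swimmer/drowning
# dataset whose training images are assumed to include GP-GAN-composited scenes.
from ultralytics import YOLO

# Start from a pretrained YOLOv8 nano checkpoint.
model = YOLO("yolov8n.pt")

# 'pool_drowning.yaml' is a placeholder dataset definition, not a real file.
model.train(data="pool_drowning.yaml", epochs=100, imgsz=640)

# Evaluate on the held-out pool scene and report mAP.
metrics = model.val(data="pool_drowning.yaml", split="test")
print("mAP50-95:", metrics.box.map)
```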
Procedia PDF Downloads 101
879 Italian Speech Vowels Landmark Detection through the Legacy Tool 'xkl' with Integration of Combined CNNs and RNNs
Authors: Kaleem Kashif, Tayyaba Anam, Yizhi Wu
Abstract:
This paper introduces a methodology for advancing Italian speech vowel landmark detection within the distinctive feature-based speech recognition domain. Leveraging the legacy tool 'xkl' and integrating combined convolutional neural networks (CNNs) and recurrent neural networks (RNNs), the study presents a comprehensive enhancement to the 'xkl' legacy software. This integration incorporates re-assigned spectrogram methodologies, enabling meticulous acoustic analysis. Simultaneously, our proposed model, integrating combined CNNs and RNNs, demonstrates unprecedented precision and robustness in landmark detection. The augmentation of re-assigned spectrogram fusion within the 'xkl' software signifies a meticulous advancement, particularly enhancing precision related to vowel formant estimation. This augmentation catalyzes unparalleled accuracy in landmark detection, resulting in a substantial performance leap compared to conventional methods. The proposed model emerges as a state-of-the-art solution in the distinctive feature-based speech recognition systems domain. In the realm of deep learning, a synergistic integration of combined CNNs and RNNs is introduced, endowed with specialized temporal embeddings, self-attention mechanisms, and positional embeddings. This allows the model to excel in capturing intricate dependencies within Italian speech vowels, rendering it highly adaptable and sophisticated in the distinctive feature domain. Furthermore, our advanced temporal modeling approach employs Bayesian temporal encoding, refining the measurement of inter-landmark intervals. Comparative analysis against state-of-the-art models reveals a substantial improvement in accuracy, highlighting the robustness and efficacy of the proposed methodology. Upon rigorous testing on a database (LaMIT) of speech recorded in a silent room by four Italian native speakers, the landmark detector demonstrates exceptional performance, achieving a 95% true detection rate and a 10% false detection rate. A majority of missed landmarks were observed in proximity to reduced vowels. These promising results underscore the robust identifiability of landmarks within the speech waveform, establishing the feasibility of employing a landmark detector as a front end in a speech recognition system. The synergistic integration of re-assigned spectrogram fusion, CNNs, RNNs, and Bayesian temporal encoding not only signifies a significant advancement in Italian speech vowel landmark detection but also positions the proposed model as a leader in the field. The model offers distinct advantages, including unparalleled accuracy, adaptability, and sophistication, marking a milestone in the intersection of deep learning and distinctive feature-based speech recognition. This work contributes to the broader scientific community by presenting a methodologically rigorous framework for enhancing landmark detection accuracy in Italian speech vowels. The integration of cutting-edge techniques establishes a foundation for future advancements in speech signal processing, emphasizing the potential of the proposed model in practical applications across various domains requiring robust speech recognition systems.
Keywords: landmark detection, acoustic analysis, convolutional neural network, recurrent neural network
Procedia PDF Downloads 63
878 Investigating The Nexus Between Energy Deficiency, Environmental Sustainability and Renewable Energy: The Role of Energy Trade in Global Perspectives
Authors: Fahim Ullah, Muhammad Usman
Abstract:
Energy consumption and environmental sustainability are hard challenges of the 21st century. Energy richness increases environmental pollution, while energy poverty hinders economic growth. Considering these two aspects, the present study calculates energy deficiency and examines the role of renewable energy in overcoming rising energy deficiency and carbon emissions for selected countries from 1990 to 2021. For the empirical analysis, this study uses method of moments panel quantile regression and, to check robustness, panel quantile robust analysis. Graphical analysis indicated rising global energy deficiency over the last three decades, with energy consumption higher than energy production. Empirical results showed that renewable energy is a significant factor in reducing energy deficiency. Secondly, energy deficiency increases the carbon emission level, and renewable energy again decreases the emission level. This study recommends that global energy deficiency and rising carbon emissions can be controlled through structural change in the form of an energy transition that replaces non-renewable resources with renewable resources.
Keywords: energy deficiency, renewable energy, carbon emission, energy trade, PQL analysis
Procedia PDF Downloads 64
877 Detection of High Fructose Corn Syrup in Honey by Near Infrared Spectroscopy and Chemometrics
Authors: Mercedes Bertotto, Marcelo Bello, Hector Goicoechea, Veronica Fusca
Abstract:
The National Service of Agri-Food Health and Quality (SENASA) controls honey to detect contamination by synthetic or natural chemical substances and establishes and controls the traceability of the product. The utility of near-infrared spectroscopy for the detection of adulteration of honey with high fructose corn syrup (HFCS) was investigated. First, a mixture of different authentic artisanal Argentinian honeys was prepared to cover as much heterogeneity as possible. Then, mixtures were prepared by adding different concentrations of HFCS to samples of the honey pool. In total, 237 samples were used: 108 of them were authentic honey, and 129 corresponded to honey adulterated with HFCS at between 1 and 10%. They were stored unrefrigerated from the time of production until scanning and were not filtered after receipt in the laboratory. Immediately prior to spectral collection, the honey was incubated at 40°C overnight to dissolve any crystalline material, manually stirred to achieve homogeneity and adjusted to a standard solids content (70° Brix) with distilled water. Adulterant solutions were also adjusted to 70° Brix. Samples were measured by NIR spectroscopy in the range of 650 to 7000 cm⁻¹. The technique of specular reflectance was used, with a lens aperture range of 150 mm. Pretreatment of the spectra was performed by Standard Normal Variate (SNV). The ant colony optimization genetic algorithm sample selection (ACOGASS) graphical interface was used, in MATLAB version 5.3, to select the variables with the greatest discriminating power. The data set was divided into a validation set and a calibration set using the Kennard-Stone (KS) algorithm. A combined method of Potential Functions (PF) was chosen together with Partial Least Squares Discriminant Analysis (PLS-DA). Different estimators of the predictive capacity of the model were compared, obtained using a decreasing number of groups, which implies more demanding validation conditions. The optimal number of latent variables was selected as the number associated with the minimum error and the smallest number of unassigned samples. Once the optimal number of latent variables was defined, we applied the model to the training samples. With the model calibrated on the training samples, we then studied the validation samples. The calibrated model that combines the potential function method and PLS-DA can be considered reliable and stable, since its performance on future samples is expected to be comparable to that achieved for the training samples. By use of Potential Functions (PF) and Partial Least Squares Discriminant Analysis (PLS-DA) classification, authentic honey and honey adulterated with HFCS could be identified with a correct classification rate of 97.9%. The results showed that NIR in combination with the PF and PLS-DA methods can be a simple, fast and low-cost technique for the detection of HFCS in honey with high sensitivity and power of discrimination.
Keywords: adulteration, multivariate analysis, potential functions, regression
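As an illustration of this kind of chemometric pipeline, the sketch below applies SNV pretreatment followed by a PLS-DA classifier; the synthetic spectra, the number of latent variables, and the use of a random split in place of Kennard-Stone selection are assumptions for illustration only.

```python
# Hedged sketch: SNV pretreatment + PLS-DA classification of NIR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def snv(spectra):
    """Standard Normal Variate: center and scale each spectrum individually."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(
        axis=1, keepdims=True)

# Hypothetical data: rows are samples, columns are absorbances at NIR wavenumbers;
# y = 0 for authentic honey, 1 for HFCS-adulterated honey.
rng = np.random.default_rng(0)
X = rng.normal(size=(237, 300))
y = np.array([0] * 108 + [1] * 129)

X_train, X_test, y_train, y_test = train_test_split(
    snv(X), y, test_size=0.3, stratify=y, random_state=0)

# PLS-DA: PLS regression onto the 0/1 class label, thresholded at 0.5.
pls = PLSRegression(n_components=8)
pls.fit(X_train, y_train)
y_pred = (pls.predict(X_test).ravel() > 0.5).astype(int)
print("correct classification rate:", accuracy_score(y_test, y_pred))
```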
Procedia PDF Downloads 125
876 Stationary Methanol Steam Reforming to Hydrogen Fuel for Fuel-Cell Filling Stations
Authors: Athanasios A. Tountas, Geoffrey A. Ozin, Mohini M. Sain
Abstract:
Renewable hydrogen (H₂) carriers such as methanol (MeOH), dimethyl ether (DME), oxymethylene dimethyl ethers (OMEs), and conceivably ammonia (NH₃) can be reformed back into H₂; these conversions are fundamental to the long-term viability of the H₂ economy because such carriers offer higher densities and easier transportability than H₂ itself. MeOH is an especially important carrier as it is a simple C1 chemical that can be produced from green solar-PV-generated H₂ and direct-air-captured CO₂ with a current commercially practical solar-to-fuel efficiency of 10%. MeOH steam reforming (MSR) in stationary systems next to H₂ fuel-cell filling stations can eliminate the need for onboard mobile reformers; the stationary systems can be more robust in attaining strict H₂ product specifications, and MeOH is a safe, lossless, and compact medium for long-term H₂ storage. Both thermal- and photo-catalysts are viable options for achieving stable, long-term performance of stationary MSR systems.
Keywords: fuel-cell vehicle filling stations, methanol steam reforming, hydrogen transport and storage, stationary reformer, liquid hydrogen carriers
Procedia PDF Downloads 102
875 Simple and Scalable Thermal-Assisted Bar-Coating Process for Perovskite Solar Cell Fabrication in Open Atmosphere
Authors: Gizachew Belay Adugna
Abstract:
Perovskite solar cells (PSCs) have shown rapid development as an emerging photovoltaic technology; however, fast device degradation due to their organic components, mainly the hole transporting material (HTM), and the lack of a robust and reliable upscaling process for photovoltaic modules have hindered their commercialization. Herein, HTM molecules with and without fluorine-substituted cyclopenta[2,1-b;3,4-b']dithiophene derivatives (HYC-oF, HYC-mF, and HYC-H) were developed for PSC applications. The fluorinated HTM molecules exhibited better hole mobility and overall charge extraction in the devices, mainly due to strong molecular interaction and packing in the film. Thus, the highest power conversion efficiency (PCE) of 19.64%, with improved long-term stability, was achieved for PSCs based on the HYC-oF HTM. Moreover, the fluorinated HYC-oF demonstrated excellent film processability on a larger-area substrate (10 cm × 10 cm) prepared sequentially with the perovskite absorber underlayer via a scalable bar-coating process in ambient air, and achieved a higher PCE of 18.49% compared to the conventional spiro-OMeTAD (17.51%). The results demonstrate the facile development of HTMs towards stable and efficient PSCs for future industrial-scale PV modules.
Keywords: perovskite solar cells, upscaling film coating, power conversion efficiency, solution processing
Procedia PDF Downloads 73
874 Leukocyte Detection Using Image Stitching and Color Overlapping Windows
Authors: Lina, Arlends Chris, Bagus Mulyawan, Agus B. Dharmawan
Abstract:
Blood cell analysis plays a significant role in the diagnosis of human health. As an alternative to the traditional technique conducted by laboratory technicians, this paper presents an automatic white blood cell (leukocyte) detection system using Image Stitching and Color Overlapping Windows. The advantage of this method is that the detection of white blood cells is robust to imperfect cell shapes and to varying image quality. The input for this application is images from a microscope-slide translation video. The preprocessing stage is performed by stitching the input images: first, the overlapping parts of the images are determined, and then the stitching and blending of the two input images are performed. Next, Color Overlapping Windows is applied for white blood cell detection, consisting of color filtering, window candidate checking, window marking, finding window overlaps, and window cropping. Experimental results show that this method achieves an average detection accuracy of 82.12% on the leukocyte images.
Keywords: color overlapping windows, image stitching, leukocyte detection, white blood cell detection
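A rough sketch of these two stages with OpenCV follows; the file names, the HSV threshold range for stained nuclei, and the connected-component stand-in for the window steps are illustrative assumptions rather than the authors' exact procedure.

```python
# Hedged sketch: stitch consecutive microscope frames, then color-filter for
# leukocyte-like regions. Thresholds and file names are placeholders.
import cv2
import numpy as np

# 1) Stitch overlapping frames taken from the slide-translation video.
frames = [cv2.imread(p) for p in ["frame_000.png", "frame_001.png", "frame_002.png"]]
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # planar scene, no perspective
status, panorama = stitcher.stitch(frames)
if status != cv2.Stitcher_OK:
    raise RuntimeError(f"stitching failed with status {status}")

# 2) Color filtering: stained leukocyte nuclei are roughly purple/blue in HSV.
hsv = cv2.cvtColor(panorama, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, np.array([110, 50, 50]), np.array([160, 255, 255]))

# Candidate windows around connected components of the mask (a rough stand-in
# for the window checking/marking/overlap steps).
n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
for i in range(1, n):  # label 0 is background
    x, y, w, h, area = stats[i]
    if area > 200:  # drop tiny noise blobs
        cv2.rectangle(panorama, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("leukocyte_candidates.png", panorama)
```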
Procedia PDF Downloads 310
873 The Effectiveness of Environmental Policy Instruments for Promoting Renewable Energy Consumption: Command-and-Control Policies versus Market-Based Policies
Authors: Mahmoud Hassan
Abstract:
Understanding the impact of market- and non-market-based environmental policy instruments on renewable energy consumption (REC) is crucial for the design and choice of policy packages. This study aims to empirically investigate the effect of the environmental policy stringency index (EPS) and its components on REC in 27 OECD countries over the period from 1990 to 2015, and then use the results to identify what an appropriate environmental policy mix should look like. Relying on the two-step system GMM estimator, we provide evidence that increasing environmental policy stringency as a whole promotes renewable energy consumption in these 27 developed economies. Moreover, policymakers are able, through market- and non-market-based environmental policy instruments, to increase the use of renewable energy. However, not all of these instruments are effective in achieving this goal. The results indicate that R&D subsidies and trading schemes have a positive and significant impact on REC, while taxes, feed-in tariffs and emission standards do not have a significant effect. Furthermore, R&D subsidies are more effective than trading schemes in stimulating the use of clean energy. These findings proved to be robust across the three alternative panel techniques used.
Keywords: environmental policy stringency, renewable energy consumption, two-step system-GMM estimation, linear dynamic panel data model
Procedia PDF Downloads 181
872 Simulation with Uncertainties of Active Controlled Vibration Isolation System for Astronaut’s Exercise Platform
Authors: Shield B. Lin, Ziraguen O. Williams
Abstract:
In a task to assist NASA in analyzing the dynamic forces caused by operational countermeasures of an astronaut's exercise platform impacting the spacecraft, an active proportional-integral-derivative controller commanding a linear actuator is proposed in a vibration isolation system to regulate the movement of the exercise platform. Computer simulation shows promising results: most exciter forces can be reduced or even eliminated. This paper emphasizes parameter uncertainties, parameter variations and exciter force variations. Drift and variations of system parameters in the vibration isolation system for the astronaut's exercise platform are analyzed. An active control scheme is applied with the goals of reducing the platform displacement and minimizing the force transmitted to the spacecraft structure. The controller must be robust enough to accommodate the wide variations of system parameters and exciter forces. Computer simulation of the vibration isolation system was performed via MATLAB/Simulink and Trick. The simulation results demonstrate the achievement of force reduction with small platform displacement under wide ranges of variation in system parameters.
Keywords: control, counterweight, isolation, vibration
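To make the control loop concrete, here is a minimal sketch (in Python rather than the MATLAB/Simulink and Trick environments used in the paper) of a PID-controlled single-degree-of-freedom platform under a sinusoidal exciter force; the mass, stiffness, gains and disturbance are illustrative assumptions, not NASA's values.

```python
# Hedged sketch: PID control of a mass-spring-damper platform under a
# sinusoidal exciter force. All parameter values are illustrative.
import numpy as np

m, k, c = 100.0, 2000.0, 50.0          # platform mass (kg), stiffness, damping
kp, ki, kd = 5.0e4, 1.0e4, 5.0e3       # assumed PID gains
dt, t_end = 1.0e-3, 10.0

x, v, integ, prev_err = 0.0, 0.0, 0.0, 0.0
x_hist, f_trans = [], []

for step in range(int(t_end / dt)):
    t = step * dt
    f_exciter = 200.0 * np.sin(2.0 * np.pi * 1.5 * t)   # exercise disturbance (N)

    # PID on platform displacement (setpoint = 0), commanding the linear actuator.
    err = 0.0 - x
    integ += err * dt
    deriv = (err - prev_err) / dt
    prev_err = err
    f_actuator = kp * err + ki * integ + kd * deriv

    # Platform dynamics integrated with semi-implicit Euler.
    a = (f_exciter + f_actuator - c * v - k * x) / m
    v += a * dt
    x += v * dt

    x_hist.append(x)
    f_trans.append(k * x + c * v)        # force passed into the spacecraft structure

print("max displacement (m):", max(abs(xx) for xx in x_hist))
print("max transmitted force (N):", max(abs(f) for f in f_trans))
```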
Procedia PDF Downloads 146
871 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances
Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels
Abstract:
The prediction models for the United States Medical Licensing Examination (USMLE) Step 1 and Step 2 performances were constructed with a Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models that accurately identify the most important predictors and yield valid range estimates of the Step 1 and Step 2 scores. The simulation modeling approach was deemed an effective way of predicting student performance on licensure examinations. Sensitivity analysis (a/k/a what-if analysis) in the simulation models was also used to predict how the Step 1 and Step 2 scores are affected by changes in the National Board of Medical Examiners (NBME) basic science subject board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and the Step 1 score were significant predictors of Step 2 performance. Hence, institutions could screen qualified student applicants for interviews and document the effectiveness of their basic science education program based on the simulation results.
Keywords: prediction model, sensitivity analysis, simulation method, USMLE
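The flavor of such a simulation can be sketched as follows; the coefficients, input distributions, and the five-point NBME shift are hypothetical placeholders rather than the study's fitted values.

```python
# Hedged sketch: Monte Carlo simulation around a fitted linear regression to
# produce a range estimate for an exam score, plus a simple what-if shift.
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Hypothetical fitted regression: Step 1 ~ intercept + MCAT + NBME subject score.
beta = np.array([60.0, 1.8, 0.9])        # assumed coefficients
resid_sd = 8.0                            # assumed residual standard deviation

# Input uncertainty for one applicant (sensitivity / what-if inputs).
mcat = rng.normal(10.0, 1.0, n_draws)     # e.g., MCAT Verbal Reasoning
nbme = rng.normal(75.0, 5.0, n_draws)     # e.g., NBME basic science subject score

X = np.column_stack([np.ones(n_draws), mcat, nbme])
step1 = X @ beta + rng.normal(0.0, resid_sd, n_draws)

lo, hi = np.percentile(step1, [2.5, 97.5])
print(f"simulated Step 1 mean: {step1.mean():.1f}")
print(f"95% range estimate: {lo:.1f} to {hi:.1f}")

# Simple sensitivity: shift the NBME score up by 5 points and compare.
step1_shift = X @ beta + 0.9 * 5.0 + rng.normal(0.0, resid_sd, n_draws)
print(f"mean change from +5 NBME points: {step1_shift.mean() - step1.mean():.1f}")
```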
Procedia PDF Downloads 339
870 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading
Authors: Peter Shi
Abstract:
Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of the conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for "low" and "high" are precisely derived using a max-min operation on Bayes's error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics. The amalgamation represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the "buy-low" and "sell-high" algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market
Procedia PDF Downloads 72
869 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor
Authors: Jinseon Song, Yongwan Park
Abstract:
In this paper, we propose a method that estimates a user's position from a single camera based on a pre-built image database. Previous positioning approaches, such as GPS (Global Positioning System) and RF (Radio Frequency), calculate distance from the arrival time of a signal. However, these methods have a weakness: their error range becomes large under signal interference. A camera sensor can instead be used to estimate position, but a single camera has difficulty obtaining relative position data, and a stereo camera has difficulty providing real-time position data because of the large amount of image data. First, in this research, we build an image database of the space in which the positioning service is to be provided with a single camera. Next, we judge similarity through image matching between the database images and the image transmitted from the user. Finally, we determine the user's position from the position of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can determine not only the user's position but also the direction.
Keywords: positioning, distance, camera, features, SURF (Speeded-Up Robust Features), database, estimation
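The database-matching step can be illustrated with a short sketch; since SURF (named in the keywords) requires the opencv-contrib build, ORB is used here as a freely available substitute, and the database images and poses are hypothetical.

```python
# Hedged sketch: find the database image most similar to the user's frame and
# return its stored position/heading. All file names and poses are placeholders.
import cv2

# Hypothetical database: image path -> (x, y, heading) of the capture location.
database = {
    "db_corridor_01.png": (0.0, 0.0, 90.0),
    "db_corridor_02.png": (2.0, 0.0, 90.0),
    "db_lobby_01.png": (5.0, 3.0, 180.0),
}

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

query = cv2.imread("user_frame.png", cv2.IMREAD_GRAYSCALE)
q_kp, q_desc = orb.detectAndCompute(query, None)

best_path, best_score = None, -1
for path in database:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    kp, desc = orb.detectAndCompute(img, None)
    if desc is None or q_desc is None:
        continue
    matches = matcher.match(q_desc, desc)
    # Similarity score: number of sufficiently good matches.
    good = [m for m in matches if m.distance < 40]
    if len(good) > best_score:
        best_path, best_score = path, len(good)

# The user's position/direction is taken from the most similar database image.
print("best match:", best_path, "with", best_score, "good matches")
print("estimated (x, y, heading):", database[best_path])
```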
Procedia PDF Downloads 350
868 Real Time Implementation of Efficient DFIG-Variable Speed Wind Turbine Control
Authors: Fayssal Amrane, Azeddine Chaiba, Bruno Francois
Abstract:
In this paper, a design and experimental study based on Direct Power Control (DPC) of a DFIG is proposed for stand-alone mode in a Variable Speed Wind Energy Conversion System (VS-WECS). The proposed IDPC method is based on robust IP (Integral-Proportional) controllers that control the Rotor Side Converter (RSC) by means of the rotor current d-q axis components (Ird* and Irq*) of the Doubly Fed Induction Generator (DFIG) through an AC-DC-AC converter. The implementation is realized using a dSPACE DS1103 card under sub- and super-synchronous operation (i.e., below and above the synchronous speed of 1500 rpm). Finally, experimental results demonstrate that the proposed control using IP controllers provides improved dynamic responses, and the decoupled control of the wind-turbine-driven DFIG achieves high performance (good reference tracking, short response time and low power error) despite sudden variations in wind speed and rotor reference currents.
Keywords: Direct Power Control (DPC), Doubly Fed Induction Generator (DFIG), Wind Energy Conversion System (WECS), experimental study
Procedia PDF Downloads 126
867 Soil Micromorphological Analysis from the Hinterland of the Pharaonic Town, Sai Island, Sudan
Authors: Sayantani Neogi, Sean Taylor, Julia Budka
Abstract:
This paper presents the results of investigations of soil/sediment sequences associated with the New Kingdom town at Sai Island, Sudan. During the course of this study, geoarchaeological surveys were undertaken in the vicinity of this Pharaonic town on the island, and soil block samples for soil micromorphological analysis were collected accordingly. The intention was to better understand the archaeological site in its environmental context and the nature of the land surface prior to the establishment of the settlement. Soil micromorphology, a very powerful geoarchaeological methodology, is concerned with the description, measurement and interpretation of soil components and pedological features at a microscopic scale. Since soil profiles are archives of their own history, soil micromorphology investigates the environmental and cultural signatures preserved within buried soils and sediments. A study of the thin sections from these soils/sediments has provided robust data, offering interesting insights into the various nuances of this site, for example, the nature of the topography and the environmental conditions at the time the Pharaonic site was established. These geoarchaeological evaluations indicate a varied hidden landscape context for this Pharaonic settlement, which points to a symbiotic relationship with the Nilotic environmental system.
Keywords: geoarchaeology, New Kingdom, Nilotic environment, soil micromorphology
Procedia PDF Downloads 264
866 Attention-Based ResNet for Breast Cancer Classification
Authors: Abebe Mulugojam Negash, Yongbin Yu, Ekong Favour, Bekalu Nigus Dawit, Molla Woretaw Teshome, Aynalem Birtukan Yirga
Abstract:
Breast cancer remains a significant health concern, necessitating advancements in diagnostic methodologies. Addressing this, our paper confronts the notable challenges in breast cancer classification, particularly the imbalance in datasets and the constraints on the accuracy and interpretability of prevailing deep learning approaches. We propose an attention-based residual neural network (ResNet), which effectively combines the robust features of ResNet with an advanced attention mechanism. Enhanced through strategic data augmentation and positive weight adjustments, this approach specifically targets the issue of data imbalance. The proposed model is tested on the BreakHis dataset and achieves accuracies of 99.00%, 99.04%, 98.67%, and 98.08% at different magnifications (40X, 100X, 200X, and 400X), respectively. We evaluated the performance using different evaluation metrics, such as precision, recall, and F1-score, and made comparisons with other state-of-the-art methods. Our experiments demonstrate that the proposed model outperforms existing approaches, achieving higher accuracy in breast cancer classification.
Keywords: residual neural network, attention mechanism, positive weight, data augmentation
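A minimal sketch of what such an attention-augmented ResNet with a positive class weight might look like is given below; the squeeze-and-excitation placement, backbone choice, and weight value are assumptions for illustration, not the paper's exact architecture.

```python
# Hedged sketch: ResNet backbone + channel attention + weighted binary loss.
import torch
import torch.nn as nn
from torchvision import models

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation: reweight channels of the backbone feature map."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                       # x: (N, C, H, W)
        w = x.mean(dim=(2, 3))                  # squeeze: global average pool
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)
        return x * w                            # excite: channel reweighting

class AttentionResNet(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.attention = ChannelAttention(512)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(512, 1)     # benign vs malignant logit

    def forward(self, x):
        x = self.attention(self.features(x))
        return self.classifier(self.pool(x).flatten(1))

model = AttentionResNet()
# Positive weight to counter class imbalance (ratio is an assumed placeholder).
criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([2.3]))
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([[1.0], [0.0], [1.0], [0.0]])
loss = criterion(model(images), labels)
print("demo loss:", float(loss))
```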
Procedia PDF Downloads 101
865 The Truism of the True and Fair View of Auditor’s Report
Authors: Ofuan James Ilaboya, Okhae J. Ibhadode
Abstract:
The objective of this paper is to theoretically examine the truism of the "true and fair view" in the context of financial reporting. The paper examines concepts such as true, fair, and the true and fair view, the problems of the true and fair view, its origin and history, and a review of attributes and key issues relating to it. The methodological approach adopted in this paper is library-based research, focusing on the review of relevant and related extant literature. The findings based on this review suggest that the true and fair concept in the financial reporting environment is contentious. The study concludes that, given the circumstances chronicled in this paper, it is evident that the truism of the true and fair view in the auditor's opinion is under serious threat. The way forward may be for the auditor to certify the accuracy and correctness of the financial statements. While the position being canvassed here may help to substantially bridge the age-long expectation gap, it may also require an upward review of the current audit fee structure in order to operationalize the onerous task of certifying the accuracy and correctness of the financial statements. This position is contentious and will require robust consideration, which is not within the purview of the present review.
Keywords: fiduciary duty, financial statement, true and correct, true and fair
Procedia PDF Downloads 135