Search results for: CCR Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16793

6203 Scale, Technique and Composition Effects of CO2 Emissions under Trade Liberalization of EGS: A CGE Evaluation for Argentina

Authors: M. Priscila Ramos, Omar O. Chisari, Juan Pablo Vila Martínez

Abstract:

Current literature on the trade liberalization of environmental goods and services (EGS) raises doubts about the extent of the triple win-win for trade, development and the environment. However, much of this literature does not consider the possibility that such an agreement carries technological transmission, either through trade or through foreign direct investment. This paper presents a computable general equilibrium model calibrated for Argentina, in which alternative technologies (one dirty and one clean, according to carbon emissions) are available to produce the same goods. In this context, the trade liberalization of EGS makes it possible to increase GDP and trade, reduce unemployment and improve household welfare. However, capital mobility appears to be the key assumption for jointly reaching the environmental target, whereby the positive scale effect generated by the increase in trade is offset by the change in the composition of production (composition and technique effects through the use of the clean alternative technology) and of consumption (composition effect through substitution of relatively less-polluting imported goods).

Keywords: CGE modeling, CO2 emissions, composition effect, scale effect, technique effect, trade liberalization of EGS

Procedia PDF Downloads 380
6202 Prediction of the Thermal Parameters of a High-Temperature Metallurgical Reactor Using Inverse Heat Transfer

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents an inverse analysis for predicting the thermal conductivities and the heat flux of a high-temperature metallurgical reactor simultaneously. Once these thermal parameters are predicted, the time-varying thickness of the protective phase-change bank that covers the inside surface of the brick walls of a metallurgical reactor can be calculated. The enthalpy method is used to solve the melting/solidification process of the protective bank. The inverse model rests on the Levenberg-Marquardt Method (LMM) combined with the Broyden method (BM). A statistical analysis for the thermal parameter estimation is carried out. The effect of the position of the temperature sensors, total number of measurements and measurement noise on the accuracy of inverse predictions is investigated. Recommendations are made concerning the location of temperature sensors.
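For illustration only, the sketch below shows the kind of Levenberg-Marquardt least-squares step such an inverse analysis relies on, using scipy on a deliberately simple synthetic forward model. The one-dimensional conduction model, sensor depths, noise level and parameter values are assumptions for demonstration, not the authors' reactor model, and the Broyden combination is not reproduced here.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical 1-D steady forward model: wall temperature at sensor depth x
# for unknown thermal conductivity k (W/m.K) and inner heat flux q (W/m2).
def forward_model(params, x, t_outer=300.0):
    k, q = params
    return t_outer + q * x / k            # T(x) = T_outer + q*x/k (illustrative)

def residuals(params, x, t_measured):
    return forward_model(params, x) - t_measured

x_sensors = np.array([0.05, 0.10, 0.20, 0.30])            # assumed sensor depths (m)
true_k, true_q = 2.5, 4000.0
t_meas = forward_model((true_k, true_q), x_sensors)
t_meas += np.random.normal(0.0, 0.5, size=t_meas.shape)   # simulated measurement noise

# Levenberg-Marquardt (method='lm') iteratively refines the initial parameter guess.
fit = least_squares(residuals, x0=[1.0, 1000.0], method='lm',
                    args=(x_sensors, t_meas))
print("estimated k, q:", fit.x)
```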

Keywords: inverse heat transfer, phase change, metallurgical reactor, Levenberg–Marquardt method, Broyden method, bank thickness

Procedia PDF Downloads 334
6201 Large-Scale Electroencephalogram Biometrics through Contrastive Learning

Authors: Mostafa ‘Neo’ Mohsenvand, Mohammad Rasool Izadi, Pattie Maes

Abstract:

EEG-based biometrics (user identification) has been explored on small datasets of no more than 157 subjects. Here we show that the accuracy of modern supervised methods falls rapidly as the number of users increases to a few thousand. Moreover, supervised methods require a large amount of labeled data for training which limits their applications in real-world scenarios where acquiring data for training should not take more than a few minutes. We show that using contrastive learning for pre-training, it is possible to maintain high accuracy on a dataset of 2130 subjects while only using a fraction of labels. We compare 5 different self-supervised tasks for pre-training of the encoder where our proposed method achieves the accuracy of 96.4%, improving the baseline supervised models by 22.75% and the competing self-supervised model by 3.93%. We also study the effects of the length of the signal and the number of channels on the accuracy of the user-identification models. Our results reveal that signals from temporal and frontal channels contain more identifying features compared to other channels.
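As a purely illustrative sketch of the contrastive pre-training idea, the snippet below implements an NT-Xent (SimCLR-style) loss over paired embeddings of the same EEG windows in PyTorch. The batch size, embedding dimension and temperature are placeholders; the authors' actual encoder and proposed self-supervised task are not reproduced here.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """Contrastive loss over two views (batch, dim) of the same EEG windows."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                  # (2B, dim)
    sim = z @ z.t() / temperature                   # scaled cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim.masked_fill_(mask, float('-inf'))           # exclude self-similarity
    # the positive for sample i is its other view, located at index i +/- n
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage with random tensors standing in for encoder outputs.
z_view1 = torch.randn(8, 128)
z_view2 = torch.randn(8, 128)
print(nt_xent_loss(z_view1, z_view2).item())
```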

Keywords: brainprint, contrastive learning, electroencephalogram, self-supervised learning, user identification

Procedia PDF Downloads 157
6200 Buck Boost Inverter to Improve the Efficiency and Performance of E-Motor by Reducing the Influence of Voltage Sag of Battery on the Performance of E-Motor

Authors: Shefeen Maliyakkal, Pranav Satheesh, Steve Simon, Sharath Kuruppath

Abstract:

This paper investigates the impact of battery voltage sag on the performance and efficiency of the E-motor in electric cars. The terminal voltage of the battery decreases with the state of charge (SoC). This results in a downward shift of the torque-speed curve of the E-motor and increased copper losses in the E-motor. By introducing a buck-boost inverter between the battery and the E-motor, an additional degree of freedom is achieved. By boosting the AC voltage, the dependency of the E-motor's performance on voltage sag is eliminated. A strategy is also proposed for operating the buck-boost inverter so as to minimize copper and iron losses in the E-motor and maximize efficiency. A MATLAB-SIMULINK model of the E-drive was used to obtain simulation results. The temperature rise in the E-motor was reduced by 14% for a 10% increase in AC voltage. The results also show that a 20% increase in AC voltage can improve the running torque and maximum torque of the E-motor by 44%. Hence, it is concluded that using a buck-boost inverter in the E-drive significantly improves both the performance and the efficiency of the E-motor.
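As a back-of-the-envelope illustration of why boosting the AC voltage reduces copper losses: at constant output power the phase current scales inversely with the applied voltage, so the I²R loss falls with the square of the boost ratio. The power level, phase resistance and unity power factor below are assumed values, not taken from the authors' MATLAB-SIMULINK model.

```python
# Illustrative calculation: constant power demand, copper loss ~ I^2 * R.
P_OUT = 30e3        # assumed motor power demand (W)
R_PHASE = 0.02      # assumed per-phase resistance (ohm)

def copper_loss(v_line, power=P_OUT, r=R_PHASE):
    i = power / (3 ** 0.5 * v_line)     # line current of a 3-phase machine, unity pf assumed
    return 3 * i ** 2 * r

base = copper_loss(300.0)               # sagged, battery-limited AC voltage
boosted = copper_loss(300.0 * 1.2)      # 20 % boost from the buck-boost inverter
print(f"copper loss reduction: {100 * (1 - boosted / base):.1f} %")   # ~30.6 %
```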

Keywords: buck-boost, E-motor, battery, voltage sag

Procedia PDF Downloads 399
6199 Role of Agricultural Journalism in Diffusion of Farming Technologies

Authors: Muhammad Luqman, Mujahid Karim

Abstract:

Agricultural journalism is considered an effective tool for the diffusion of agricultural technologies among members of farming communities. Various forms of agricultural journalism are used by different organizations in order to address community problems and provide solutions to them. The present study was conducted to analyze the role of agricultural journalism in the dissemination of agricultural information. The universe of the study was district Sargodha, from which a sample of 100 respondents was collected through a validated and pre-tested questionnaire. Statistical analysis of the collected data was done with the help of SPSS. It was concluded that the majority (64.6%) of the respondents were middle-aged (31-50 years); a high literacy rate (73.23%) above middle-level education was also indicated, and most (78.3%) of the respondents were engaged in farming. Among the various forms of agricultural journalism, radio/TV/FM was used by 99.4% of the respondents, mobile phones by 96%, magazines/newspapers/periodicals by 66.4% and social media by 60.9%. Regarding the major areas focused on by agricultural journalism, "helping farmers to enhance their productivity" ranked highest, with a mean of 3.98 out of 5.00. The regression model relating farmers' education to the various forms of agricultural journalism used was found to be significant.

Keywords: agricultural information, journalism, farming community, technology diffusion and adoption

Procedia PDF Downloads 195
6198 A Comprehensive Study of Camouflaged Object Detection Using Deep Learning

Authors: Khalak Bin Khair, Saqib Jahir, Mohammed Ibrahim, Fahad Bin, Debajyoti Karmaker

Abstract:

Object detection is a computer technology that deals with searching through digital images and videos for occurrences of semantic elements of a particular class. It is associated with image processing and computer vision. On top of object detection, we detect camouflaged objects within an image using deep learning techniques. Deep learning is a subset of machine learning that is essentially a neural network with three or more layers. Over 6,500 images that possess camouflage properties were gathered from various internet sources and divided into 4 categories to compare the results. Those images were labeled and then trained and tested using the VGG16 architecture in a Jupyter notebook on the TensorFlow platform. The architecture was further customized using transfer learning. Transfer learning provides methods for transferring information from one or more source tasks to increase learning in a related target task. The purpose of these transfer learning methodologies is to aid the evolution of machine learning to the point where it is as efficient as human learning.
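For illustration, a minimal tf.keras sketch of VGG16-based transfer learning for a 4-class camouflage dataset is shown below. The directory name, image size, classifier head and training settings are assumptions for demonstration, not the authors' exact configuration.

```python
import tensorflow as tf

# Pre-trained VGG16 backbone without its original classifier head.
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False                       # transfer learning: freeze convolutional layers

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(4, activation="softmax"),   # 4 camouflage categories
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical labelled image folders, one sub-directory per category.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "camouflage_images/train", image_size=(224, 224), batch_size=32)
model.fit(train_ds, epochs=5)
```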

Keywords: deep learning, transfer learning, TensorFlow, camouflage, object detection, architecture, accuracy, model, VGG16

Procedia PDF Downloads 149
6197 The Economic Implications of Cryptocurrency and Its Potential to Disrupt Traditional Financial Systems as a Store of Value

Authors: G. L. Rithika, Arvind B. S., Akash R., Ananda Vinayak, Hema M. S.

Abstract:

Cryptocurrencies were first launched in 2009 and have since become a widely held asset class. Cryptocurrencies represent a completely distinct, decentralized model for money. They also contribute to the elimination of currency monopolies and the liberation of money from centralized control. The fact that no government agency can determine a coin's value or flow is what cryptocurrency advocates believe makes them safe and secure. The aim of this paper is to analyze the economic implications of cryptocurrency and how it could disrupt traditional financial systems. The paper analyzes the growth of cryptocurrency over the years and the potential threats cryptocurrency poses to financial systems. Our analysis shows that although the DeFi design, like the traditional financial system, may have the ability to lower transaction costs, there are multiple layers where rents can build up because of endogenous competition limitations. The permissionless and anonymous design of DeFi poses issues for ensuring tax compliance, anti-money-laundering laws and regulations, and preventing financial misconduct.

Keywords: cryptocurrencies, bitcoin, blockchain technology, traditional financial systems, decentralisation, regulatory framework

Procedia PDF Downloads 50
6196 Eco-Friendly Synthesis of Carbon Quantum Dots as an Effective Adsorbent

Authors: Hebat‑Allah S. Tohamy, Mohamed El‑Sakhawy, Samir Kamel

Abstract:

Fluorescent carbon quantum dots (CQDs) were prepared by an economical, green, single-step procedure based on microwave heating of urea with sugarcane bagasse (SCB), cellulose (C), or carboxymethyl cellulose (CMC). The prepared CQDs were characterized using a series of spectroscopic techniques; they had small size, strong absorption in the UV, and excitation-wavelength-dependent fluorescence. The prepared CQDs were used for Pb(II) adsorption from aqueous solution. The removal efficiency percentages (R%) were 99.16, 96.36, and 98.48 for QCMC, QC, and QSCB, respectively. The findings validated the efficiency of CQDs synthesized from CMC, cellulose, and SCB as excellent materials for further use in the environmental fields of wastewater pollution detection, adsorption, and chemical sensing. The kinetic and isotherm studies found that all CQD isotherms fit the Langmuir model better than the Freundlich and Temkin models. According to R², the pseudo-second-order model fits the adsorption on QCMC, while the pseudo-first-order model fits QC and QSCB.
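As an illustration of the isotherm comparison described above, the sketch below fits the Langmuir model qe = qmax·KL·Ce/(1 + KL·Ce) to equilibrium data with scipy and reports R². The data points are hypothetical, not the paper's measurements; the same pattern applies to the Freundlich and Temkin forms.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * kl * ce / (1.0 + kl * ce)

# Hypothetical equilibrium data: Ce (mg/L) vs adsorbed amount qe (mg/g).
ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([18.0, 35.0, 52.0, 68.0, 80.0, 88.0])

popt, _ = curve_fit(langmuir, ce, qe, p0=[100.0, 0.05])
qe_pred = langmuir(ce, *popt)
r2 = 1.0 - np.sum((qe - qe_pred) ** 2) / np.sum((qe - qe.mean()) ** 2)
print(f"qmax = {popt[0]:.1f} mg/g, KL = {popt[1]:.3f} L/mg, R^2 = {r2:.3f}")
```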

Keywords: carbon quantum dots, graphene quantum dots, fluorescence, quantum yield, water treatment, agricultural wastes

Procedia PDF Downloads 132
6195 Role of GM1 in the Interaction between Amyloid Prefibrillar Oligomers of Salmon Calcitonin and Model Membranes

Authors: Cristiano Giordani, Marco Diociaiuti, Cecilia Bombelli, Laura Zanetti-Polzi, Marcello Belfiore, Raoul Fioravanti, Gianfranco Macchia

Abstract:

We investigated the induced functional effects by evaluating Ca2+ influx in liposomes and cell viability in HT22-DIFF neurons. Only solutions rich in unstructured prefibrillar oligomers (PFOs) were able, in the presence of monosialoganglioside GM1, to induce Ca2+ influx and were also neurotoxic, suggesting a correlation between the two phenomena. Thus, in the presence of GM1, we investigated the protein conformation and the liposome modification due to the interaction. Circular dichroism showed that GM1 fostered the formation of β-structures, and Energy Filtered-Transmission Electron Microscopy showed that PFOs formed “amyloid channels” as reported for Aβ. We speculate that electrostatic forces occurring between the positive PFOs and the negative GM1 drive the initial binding, while the hydrophobic profile of the flexible PFO is responsible for the subsequent pore formation. Conversely, the rigid β-structured mature fibers (MFs) and protofibers (PFs) were unable to induce membrane damage and Ca2+ influx.

Keywords: amyloid proteins, neurotoxicity, lipid-rafts, GM1

Procedia PDF Downloads 189
6194 Performance of Neural Networks vs. Radial Basis Functions When Forming a Metamodel for Residential Buildings

Authors: Philip Symonds, Jon Taylor, Zaid Chalabi, Michael Davies

Abstract:

With the world climate projected to warm and major cities in developing countries becoming increasingly populated and polluted, governments are tasked with the problem of overheating and air quality in residential buildings. This paper presents the development of an adaptable model of these risks. Simulations are performed using the EnergyPlus building physics software. An accurate metamodel is formed by randomly sampling building input parameters and training on the outputs of EnergyPlus simulations. Metamodels are used to vastly reduce the amount of computation time required when performing optimisation and sensitivity analyses. Neural Networks (NNs) are compared to a Radial Basis Function (RBF) algorithm when forming a metamodel. These techniques were implemented using the PyBrain and scikit-learn python libraries, respectively. NNs are shown to perform around 15% better than RBFs when estimating overheating and air pollution metrics modelled by EnergyPlus.
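As a purely illustrative sketch of the metamodel comparison, the snippet below trains a neural-network regressor and an RBF-kernel surrogate on randomly sampled inputs and compares their R² on held-out data. Here scikit-learn's KernelRidge with an RBF kernel stands in for the RBF network, and the synthetic data stand in for EnergyPlus inputs and outputs; none of this reproduces the PyBrain/scikit-learn setup or the 15% figure reported by the authors.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 8))                 # sampled building input parameters (placeholder)
y = X[:, 0] * 3 + np.sin(4 * X[:, 1]) + rng.normal(0, 0.1, 500)   # stand-in simulation output

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                  random_state=0).fit(X_tr, y_tr)
rbf = KernelRidge(kernel="rbf", alpha=1e-3, gamma=2.0).fit(X_tr, y_tr)

print("NN  R^2:", r2_score(y_te, nn.predict(X_te)))
print("RBF R^2:", r2_score(y_te, rbf.predict(X_te)))
```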

Keywords: neural networks, radial basis functions, metamodelling, python machine learning libraries

Procedia PDF Downloads 447
6193 The Behavior of Dam Foundation Reinforced by Stone Columns: Case Study of Kissir Dam-Jijel

Authors: Toufik Karech, Abderahmen Benseghir, Tayeb Bouzid

Abstract:

This work presents a 2D numerical simulation of an earth dam to assess the behavior of its foundation after treatment by stone columns. This treatment aims to improve the bearing capacity, increase the mechanical properties of the soil, accelerate consolidation, reduce settlements and eliminate the liquefaction phenomenon in case of seismic excitation. For the evaluation of the pore pressures, the position of the phreatic line and the flow network were defined, and a seepage analysis was performed with the software MIDAS Soil Works. The consolidation calculation is performed through a simulation of the actual construction stages of the dam. These analyses were performed using the Mohr-Coulomb soil model, and the results are compared with the actual measurements of settlement gauges installed in the dam. An analysis of the bearing capacity was conducted to show the role of stone columns in improving the bearing capacity of the foundation.

Keywords: earth dam, dam foundation, numerical simulation, stone columns, seepage analysis, consolidation, bearing capacity

Procedia PDF Downloads 190
6192 Annotation Ontology for Semantic Web Development

Authors: Hadeel Al Obaidy, Amani Al Heela

Abstract:

The main purpose of this paper is to examine the concept of the semantic web and the role that ontology and semantic annotation play in the development of semantic web services. The paper focuses on semantic web infrastructure, illustrating how ontology and annotation work together to provide the learning capabilities for building content semantically. To improve software productivity and quality, the paper applies approaches, notations and techniques offered by software engineering. It proposes a conceptual model to develop semantic web services for the infrastructure of a web information retrieval system for digital libraries. The developed system uses ontology and annotation to build a knowledge-based system that defines and links the meaning of web content in order to retrieve information for users' queries. The results are made more relevant through keyword and ontology rule expansion, which more accurately satisfies the requested information. The level of result accuracy is enhanced because the semantically analyzed query works with the conceptual architecture of the proposed system.

Keywords: semantic web services, software engineering, semantic library, knowledge representation, ontology

Procedia PDF Downloads 173
6191 Iterative Design Process for Development and Virtual Commissioning of Plant Control Software

Authors: Thorsten Prante, Robert Schöch, Ruth Fleisch, Vaheh Khachatouri, Alexander Walch

Abstract:

The development of industrial plant control software is a complex and often very expensive task. One of the core problems is that a lot of the implementation and adaptation work can only be done after the plant hardware has been installed. In this paper, we present our approach to virtually developing and validating plant-level control software of production plants. This way, plant control software can be virtually commissioned before actual ramp-up of a plant, reducing actual commissioning costs and time. Technically, this is achieved by linking the actual plant-wide process control software (often called plant server) and an elaborate virtual plant model together to form an emulation system. Method-wise, we are suggesting a four-step iterative process with well-defined increments and time frame. Our work is based on practical experiences from planning to commissioning and start-up of several cut-to-size plants.

Keywords: iterative system design, virtual plant engineering, plant control software, simulation and emulation, virtual commissioning

Procedia PDF Downloads 490
6190 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data

Authors: Hyun-Woo Cho

Abstract:

It is important to utilize timely and intelligent production monitoring and diagnosis of industrial processes with respect to quality and safety issues. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on the active use of a nonlinear statistical technique combined with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance to existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the use of the filtering step improved the identification results for complicated processes with massive data sets.

Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring

Procedia PDF Downloads 243
6189 Influence of Water Hardness on Column Adsorption of Paracetamol by Biomass of Babassu Coconut Shell

Authors: O. M. Couto Junior, I. Matos, I. M. Fonseca, P. A. Arroyo, E. A. Silva, M. A. S. D. Barros

Abstract:

This study investigated the adsorption of paracetamol from aqueous solutions on fixed beds of activated carbon from babassu coconut shell. The effect of several operating conditions on the shape of the breakthrough curves was investigated, and the proposed model was successfully validated against literature data and the obtained experimental data. As the initial paracetamol concentration increases from 20 to 50 mg/L, the breakthrough time, tb, decreases from 18.00 to 10.50 hours. The unused bed length, HUNB, at the breakthrough point is in the range of 1.62 to 2.81 for initial paracetamol concentrations of 20 to 50 mg/L. The presence of Ca2+ and Mg2+, which is responsible for increasing the hardness of the water, significantly affects the adsorption kinetics and lowers the removal efficiency of paracetamol adsorption on activated carbons. The axial dispersion coefficient, DL, was constant for the concentrated feed solution, but this parameter takes different values for deionized and hard water. The mass transfer coefficient, Ks, increased with feed concentration.
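For illustration, the sketch below shows one common way of estimating the length of unused bed from breakthrough data, LUB = L(1 − tb/ts), where tb is the breakthrough time and ts the stoichiometric (saturation) time. The bed length and saturation times are hypothetical values chosen for demonstration, not the paper's column parameters.

```python
# Unused bed length from breakthrough data (illustrative numbers only).
def unused_bed_length(bed_length, t_break, t_stoich):
    """LUB = L * (1 - tb/ts): portion of the bed not yet saturated at breakthrough."""
    return bed_length * (1.0 - t_break / t_stoich)

L_BED = 10.0        # assumed bed length (cm)
for c0, tb, ts in [(20, 18.0, 22.0), (50, 10.5, 16.0)]:   # hypothetical (mg/L, h, h)
    print(f"C0 = {c0} mg/L -> LUB = {unused_bed_length(L_BED, tb, ts):.2f} cm")
```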

Keywords: paracetamol, adsorption, water hardness, activated carbon

Procedia PDF Downloads 320
6188 A Review of BIM Applications for Heritage and Historic Buildings: Challenges and Solutions

Authors: Reza Yadollahi, Arash Hejazi, Dante Savasta

Abstract:

Building Information Modeling (BIM) is growing fast in construction projects around the world. Considering BIM's weaknesses when applied to existing heritage and historical buildings, it is critical to facilitate BIM application for such structures. One of the pieces of information needed to build a model in BIM is the material and its characteristics. A material library is essential to speed up the entry of project information. To save time and prevent cost overruns, a BIM object material library should be provided. However, the lack of information and documents for historical buildings is typically a challenge in renovation and retrofitting projects. Because case documents for historic buildings are scarce, importing data is a time-consuming task, which can be improved by creating BIM libraries. Based on previous research, this paper reviews the complexities and challenges in BIM modeling for heritage, historic, and architectural buildings. Through identifying the strengths and weaknesses of standard BIM systems, recommendations are provided to enhance the modeling platform.

Keywords: building information modeling, historic buildings, heritage buildings, material library

Procedia PDF Downloads 117
6187 Evaluation and Analysis of Light Emitting Diode Distribution in an Indoor Visible Light Communication

Authors: Olawale J. Olaluyi, Ayodele S. Oluwole, O. Akinsanmi, Johnson O. Adeogo

Abstract:

Visible light communication (VLC) is considered a cutting-edge technology for data transmission and illumination, since it uses less energy than radio frequency (RF) technology and has large bandwidth, extended lifespan, and high security. An irregular distribution of the small base stations (the LED arrays) in a room causes shadowed areas and lowers the minimum signal-to-noise ratio (SNR) and received power. In order to maximize the received power distribution and the SNR at the center of the room for an indoor VLC system, this work offers an innovative model for the placement of eight LED arrays. We investigated the arrangement of the LED arrays with regard to received power so as to fill the open space in the center of the room. According to the simulation findings, the proposed LED array distribution saved 36.2% of the transmitted power. Aside from that, the entire room was evenly covered. This leads to an increase in both received power and SNR.
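As a purely illustrative sketch of the link-budget calculation behind such received-power maps, the snippet below evaluates the standard Lambertian line-of-sight model for a downward-facing LED and an upward-facing photodiode. The room layout (four LEDs rather than eight), transmit power, detector area and half-power angle are assumptions for demonstration, not the paper's configuration.

```python
import numpy as np

def los_received_power(p_tx, led_pos, rx_pos, area=1e-4, half_angle_deg=60.0):
    """Lambertian line-of-sight received power for one LED at led_pos and a receiver at rx_pos."""
    m = -np.log(2) / np.log(np.cos(np.radians(half_angle_deg)))   # Lambertian order
    d_vec = rx_pos - led_pos
    d = np.linalg.norm(d_vec)
    cos_ang = abs(d_vec[2]) / d            # LED points straight down, receiver faces up
    # cos^m(irradiance angle) * cos(incidence angle) collapses to cos^(m+1) for this geometry
    return p_tx * (m + 1) * area / (2 * np.pi * d ** 2) * cos_ang ** (m + 1)

# Illustrative layout: four ceiling LEDs, receiver on a desk at the room centre.
leds = np.array([[1.25, 1.25, 3.0], [1.25, 3.75, 3.0],
                 [3.75, 1.25, 3.0], [3.75, 3.75, 3.0]])
receiver = np.array([2.5, 2.5, 0.85])
total = sum(los_received_power(20e-3, led, receiver) for led in leds)
print(f"received optical power at room centre: {total * 1e6:.2f} uW")
```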

Keywords: visible light communication (VLC), light emitting diodes (LED), optical power distribution, signal-to-noise ratio (SNR)

Procedia PDF Downloads 89
6186 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume and broadly varied data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purpose of developing their business. Therefore, the increasingly decentralized data-management environment relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of this technique is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 432
6185 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity

Authors: Mujtaba Roshan, John A. Schormans

Abstract:

Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in the retention of customers. QoE is known to be affected by the Quality of Service (QoS) factors packet loss probability (PLP), delay and delay jitter caused by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear and may vary from application to application. We use the network emulator Netem as the basis for experimentation and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on video-on-demand, we discovered that the reported QoE may differ widely for users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP to achieve the same QoE than is required by the most widely studied age group of users. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order-of-magnitude decrease in PLP, and found that (almost always) a 3-fold increase in link capacity was required.
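For intuition, the widely used Mathis approximation for bottlenecked TCP throughput, B ≈ C·MSS/(RTT·√p), shows why an order-of-magnitude reduction in PLP translates into roughly a √10 ≈ 3.2-fold capacity requirement, consistent with the 3-fold figure above. The sketch below is illustrative, with assumed MSS, RTT and PLP values rather than the paper's bottleneck model parameters.

```python
import math

def tcp_throughput(plp, mss_bits=1460 * 8, rtt=0.05, c=math.sqrt(3 / 2)):
    """Mathis model: achievable TCP throughput (bit/s) at packet loss probability plp."""
    return mss_bits * c / (rtt * math.sqrt(plp))

plp_old, plp_new = 1e-3, 1e-4          # order-of-magnitude PLP reduction (assumed values)
ratio = tcp_throughput(plp_new) / tcp_throughput(plp_old)
print(f"capacity scaling needed: ~{ratio:.2f}x")   # ~3.16x, i.e. roughly a 3-fold increase
```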

Keywords: network capacity, packet loss probability, quality of experience, quality of service

Procedia PDF Downloads 273
6184 A Study on Good Governance: Its Elements, Models, and Goals

Authors: Ehsan Daryadel, Hamid Shakeri

Abstract:

Good governance is considered one of the necessary prerequisites for the promotion of sustainable development programs in countries. A theoretical model of good governance seeks to establish the best methods for the administration and management of the country in question. The importance of maintaining a balance between the needs of present and future generations through sustainable development has caused a change in the methods of management and of providing services for citizens, which is now regarded as the most efficient and effective way of administering countries. This method is based on democratic, equity-seeking sustainable development, which tries to involve all actors in this area and to be accountable to all citizens' needs. Meanwhile, it should be noted that good governance is a prerequisite for sustainable development. In the present study, efforts have been made to present the concepts, definitions, purposes and indices of good governance with a descriptive-analytical method.

Keywords: accountability, efficiency and effectiveness, good governance, rule of law, transparency

Procedia PDF Downloads 303
6183 Model Studies on Use of Coal Mine Waste and Modified Clay Soil as Fill Material for Embankments and Foundations

Authors: K. Suresh, M. Padmavathi, N. Darga Kumar

Abstract:

The objective of this study is to investigate the significance of coal mine waste and improved clay soil when used as fill material for the construction of embankments. To determine the bearing capacities of coal mine waste and improved clay soil, PLAXIS 2D software is used in addition to laboratory tests to make the analysis simpler. Conclusions are drawn from the bearing capacities obtained for the different cases. Load-carrying capacities are determined for coal mine waste, for clay, and for layered fills in which clay (H2) is at the bottom and coal mine waste (H1) is on top, with three different height ratios (H = 0.25H1 + 0.75H2, H = 0.5H1 + 0.5H2, H = 0.75H1 + 0.25H2). In addition, the bearing capacity of improved clay soil (clay replaced with 10% CMW, 30% CMW and 50% CMW, together with Polycom) is also determined. The safe height of embankment that can be constructed with the improved clay for different slopes (1:1, 1.5:1 and 2:1) is also determined using PLAXIS 2D software by limiting the factor of safety to 1.5.

Keywords: cohesion, angle of shearing resistance, elastic modulus, coefficient of consolidation, coal mine waste

Procedia PDF Downloads 15
6182 Using the M-Learning to Support Learning of the Concept of the Derivative

Authors: Elena F. Ruiz, Marina Vicario, Chadwick Carreto, Rubén Peredo

Abstract:

One of the main obstacles in Mexico’s engineering programs is math comprehension, especially of the derivative concept. For this reason, we present a case study that relates mobile computing and classroom learning at the “Escuela Superior de Cómputo”, based on the educational model of the Instituto Politécnico Nacional (competence-based work and problem solving), in which we propose apps and activities to teach the concept of the derivative. M-Learning is emphasized as one of its lines, as the objective is the use of mobile devices running an app that uses components such as sensors, screen, camera and processing power in classroom work. In this paper, we employed Augmented Reality (ARRoC), based on the good results this technology has had in the field of learning. This proposal was developed using a qualitative research methodology supported by quantitative research. The methodological instruments used in this proposal are observation, questionnaires, interviews and evaluations. We obtained positive results, with a 40% increase using M-Learning compared with the 20% increase obtained using traditional means.

Keywords: augmented reality, classroom learning, educational research, mobile computing

Procedia PDF Downloads 360
6181 Effect of Two Cooking Methods on Kinetics of Polyphenol Content, Flavonoid Content and Color of a Tunisian Meal: Molokheiya (Corchorus olitorius)

Authors: S. Njoumi, L. Ben Haj Said, M. J. Amiot, S. Bellagha

Abstract:

The main objective of this research was to establish the kinetics of variation of total polyphenol content (TPC) and total flavonoid content (TFC) in Tunisian Corchorus olitorius powder and in a traditional home-cooked meal (Molokheiya) when using stewing and stir-frying as cooking methods, and also to compare the effect of these two common cooking practices on water content, TPC, TFC and color. The L*, a* and b* coordinate values of the Molokheiya varied from 24.955±0.039 to 21.301±0.036, from -1.556±0.048 to 0.23±0.026 and from 5.675±0.052 to 6.313±0.103 when using stewing, and from 21.328±0.025 to 20.56±0.021, from -1.093±0.011 to 0.121±0.007 and from 5.708±0.020 to 6.263±0.007 when using stir-frying, respectively. TPC and TFC increased during cooking. TPC of Molokheiya varied from 29.852±0.866 mg GAE/100 g to 220.416±0.519 mg GAE/100 g after 150 min of stewing and from 25.257±0.259 mg GAE/100 g to 208.897±0.173 mg GAE/100 g after 150 min of stir-frying. TFC of Molokheiya varied from 48.229±1.47 mg QE/100 g to 843.802±1.841 mg QE/100 g when using stewing and from 37.031±0.368 mg QE/100 g to 775.312±0.736 mg QE/100 g when using stir-frying. Kinetics followed similar curves in all cases but resulted in different final TPC and TFC. The shape of the kinetics curves suggests zero-order kinetics. The mathematical relations and the numerical approach used to model the kinetics of polyphenol and flavonoid contents in Molokheiya are described.
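As a minimal illustration of the zero-order kinetic law suggested above, C(t) = C0 + k·t reduces to a straight-line fit of content versus cooking time. The TPC readings below are illustrative placeholders spanning roughly the reported range, not the measured data.

```python
import numpy as np

# Hypothetical TPC readings (mg GAE/100 g) at cooking times (min) during stewing.
t = np.array([0, 30, 60, 90, 120, 150], dtype=float)
tpc = np.array([30.0, 68.0, 105.0, 144.0, 182.0, 220.0])

# Zero-order kinetics: C(t) = C0 + k*t, so a linear fit recovers k and C0.
k, c0 = np.polyfit(t, tpc, 1)
print(f"k = {k:.2f} mg GAE/100 g per min, C0 = {c0:.1f} mg GAE/100 g")
```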

Keywords: Corchorus olitorius, Molokheiya, phenolic compounds, kinetic

Procedia PDF Downloads 355
6180 Development of Tensile Stress-Strain Relationship for High-Strength Steel Fiber Reinforced Concrete

Authors: H. A. Alguhi, W. A. Elsaigh

Abstract:

This paper provides a tensile stress-strain (σ-ε) relationship for High-Strength Steel Fiber Reinforced Concrete (HSFRC). The load-deflection (P-δ) behavior of HSFRC beams tested under four-point flexural load was used with inverse analysis to calculate the tensile σ-ε relationship for the tested concrete grades (70 and 90 MPa) containing 60 kg/m3 (0.76%) of hook-end steel fibers. A first estimate of the tensile σ-ε relationship is obtained using RILEM TC 162-TDF and other methods available in the literature that are frequently used for determining the tensile σ-ε relationship of Normal-Strength Concrete (NSC). The Non-Linear Finite Element Analysis (NLFEA) package ABAQUS® is used to model the beams' P-δ behavior. The results show that an element-size-dependent tensile σ-ε relationship for HSFRC can be successfully generated and adopted for further analyses involving HSFRC structures.

Keywords: tensile stress-strain, flexural response, high strength concrete, steel fibers, non-linear finite element analysis

Procedia PDF Downloads 360
6179 Modelling of Powered Roof Supports Work

Authors: Marcin Michalak

Abstract:

Due to the increasing effort to protect our natural environment, a change in the structure of energy resources can be observed: an increasing share of renewable energy sources. In many countries traditional underground coal mining is losing its significance, but there are still countries, like Poland or Germany, in which coal-based technologies have the greatest share of total energy production. This makes it necessary to limit the costs and negative effects of underground coal mining. The longwall complex is an essential part of underground coal mining. The safety and effectiveness of the work are strongly dependent on the diagnostic state of the powered roof supports. Building a useful and reliable diagnostic system requires a lot of data. As data covering all possible operating conditions cannot easily be acquired, it is important to be able to generate the desired artificial working characteristics. In this paper, a new approach to modelling the leg pressure in a single unit of a powered roof support is presented. The model is the result of an analysis of typical working cycles.

Keywords: machine modelling, underground mining, coal mining, structure

Procedia PDF Downloads 368
6178 Review of the Software Used for 3D Volumetric Reconstruction of the Liver

Authors: P. Strakos, M. Jaros, T. Karasek, T. Kozubek, P. Vavra, T. Jonszta

Abstract:

In medical imaging, segmentation of different areas of the human body, such as bones, organs and tissues, is an important issue. Image segmentation allows isolating the object of interest for further processing, which can lead, for example, to 3D model reconstruction of whole organs. The difficulty of this procedure varies from trivial for bones to quite difficult for organs like the liver. The liver is considered one of the most difficult human organs to segment, mainly because of its complexity, shape variability and proximity to other organs and tissues. Due to these facts, substantial user effort usually has to be applied to obtain satisfactory image segmentation results. The image segmentation process then deteriorates from automatic or semi-automatic to a fairly manual one. In this paper, an overview of selected available software applications that can handle semi-automatic image segmentation with subsequent 3D volume reconstruction of the human liver is presented. The applications are evaluated based on the segmentation results for several consecutive DICOM images covering the abdominal area of the human body.

Keywords: image segmentation, semi-automatic, software, 3D volumetric reconstruction

Procedia PDF Downloads 290
6177 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using a stochastic differential equation. Taking the daily closing prices from developed and emerging stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise varying from a Normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which suits our estimation algorithm to large data sets because the algorithm has good convergence properties.
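For illustration, the sketch below simulates an Ornstein-Uhlenbeck log-volatility path with its exact discretization and recovers the parameters by conditional maximum likelihood, which for the OU process reduces to an AR(1) regression. The parameter values and sample size are assumptions for demonstration, not estimates from the paper's market data.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma, dt, n = 3.0, -1.0, 0.5, 1 / 252, 5000   # illustrative OU parameters

# Exact discretization of dX = theta*(mu - X)dt + sigma*dW.
x = np.empty(n)
x[0] = mu
a = np.exp(-theta * dt)
sd = sigma * np.sqrt((1 - a ** 2) / (2 * theta))
for i in range(1, n):
    x[i] = mu + (x[i - 1] - mu) * a + sd * rng.standard_normal()

# Conditional MLE reduces to the AR(1) regression x[t+1] = c0 + c1*x[t] + e.
c1, c0 = np.polyfit(x[:-1], x[1:], 1)
resid = x[1:] - (c0 + c1 * x[:-1])
theta_hat = -np.log(c1) / dt
mu_hat = c0 / (1 - c1)
sigma_hat = resid.std(ddof=2) * np.sqrt(2 * theta_hat / (1 - c1 ** 2))
print("theta, mu, sigma estimates:", theta_hat, mu_hat, sigma_hat)
```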

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 242
6176 Sparse-View CT Reconstruction Based on Nonconvex L1 − L2 Regularizations

Authors: Ali Pour Yazdanpanah, Farideh Foroozandeh Shahraki, Emma Regentova

Abstract:

Reconstruction from sparse-view projections is one of the important problems in computed tomography (CT), where obtaining a large number of projections is not always available or feasible. Traditionally, convex regularizers have been exploited to improve the reconstruction quality in sparse-view CT, and the convex constraint in those problems leads to an easy optimization process. However, convex regularizers often result in a biased approximation and inaccurate reconstruction in CT problems. Here, we present a nonconvex, Lipschitz continuous and non-smooth regularization model. The CT reconstruction is formulated as a nonconvex constrained L1 − L2 minimization problem and solved through a difference-of-convex algorithm and the alternating direction method of multipliers, which generates better results than L0 or L1 regularizers in CT reconstruction. We compare our method with previously reported high-performance methods that use convex regularizers such as TV, wavelet, curvelet, and curvelet+TV (CTV) on test phantom images. The results show that there are benefits in using the nonconvex regularizer in sparse-view CT reconstruction.
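As a toy sketch of the difference-of-convex splitting for min ½‖Ax−b‖² + λ(‖x‖₁ − ‖x‖₂), each outer iteration below linearizes the concave term −λ‖x‖₂ at the current iterate and solves the resulting convex L1 subproblem. A small random matrix stands in for the CT projection operator, and a proximal-gradient (ISTA) inner loop stands in for the paper's ADMM solver; this is illustrative, not the authors' implementation.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dca_l1_minus_l2(A, b, lam=0.1, outer=20, inner=200):
    """DCA for 0.5*||Ax-b||^2 + lam*(||x||_1 - ||x||_2) with an ISTA inner solver."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L for the smooth part
    for _ in range(outer):
        # Linearize the concave term -lam*||x||_2 at the current iterate.
        nrm = np.linalg.norm(x)
        w = lam * x / nrm if nrm > 0 else np.zeros_like(x)
        for _ in range(inner):                      # ISTA on the convex subproblem
            grad = A.T @ (A @ x - b) - w
            x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy compressed-sensing stand-in for a sparse-view CT system.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 10, replace=False)] = rng.standard_normal(10)
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_rec = dca_l1_minus_l2(A, b, lam=0.05)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```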

Keywords: computed tomography, non-convex, sparse-view reconstruction, L1-L2 minimization, difference of convex functions

Procedia PDF Downloads 316
6175 Effect of Kenaf Fibres on Starch-Grafted-Polypropylene Biopolymer Properties

Authors: Amel Hamma, Allesandro Pegoretti

Abstract:

Kenaf fibres with two aspect ratios were melt compounded with two types of biopolymers, namely starch-grafted polypropylenes, and the blends were then compression molded to form plates 1 mm thick. The results showed that processing induced a variation in fibre length, which was quantified by optical microscopy observations. The Young's modulus, stress at break and impact resistance of the starch-grafted polypropylenes were remarkably improved by the kenaf fibres for both matrices and showed the best values when G906PJ was used as the matrix. These results attest to the good interfacial bonding between the matrix and the fibres, even in the absence of any interfacial modification. The Vicat softening point and storage moduli were also improved due to the reinforcing effect of the fibres. Moreover, short-term tensile creep tests have proven that kenaf fibres remarkably improve the creep stability of the composites. The creep behavior of the investigated materials was successfully modeled by the four-parameter Burgers model.
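For illustration, the sketch below fits the four-parameter Burgers creep model, ε(t) = σ0[1/E1 + t/η1 + (1 − exp(−E2·t/η2))/E2], to creep data with scipy. The stress level, parameter values and synthetic strain data are illustrative assumptions, not the authors' measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

SIGMA0 = 5.0   # assumed constant creep stress (MPa)

def burgers_strain(t, e1, eta1, e2, eta2):
    """Four-parameter Burgers model creep strain under constant stress SIGMA0."""
    return SIGMA0 * (1 / e1 + t / eta1 + (1 - np.exp(-e2 * t / eta2)) / e2)

# Hypothetical short-term creep data: time (min) and measured strain (-).
t = np.linspace(0, 60, 13)
strain = burgers_strain(t, 900.0, 6.0e4, 2500.0, 4.0e4) + np.random.normal(0, 2e-4, t.size)

popt, _ = curve_fit(burgers_strain, t, strain,
                    p0=[1000.0, 5e4, 2000.0, 5e4], maxfev=10000)
print("E1, eta1, E2, eta2:", popt)
```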

Keywords: creep behaviour, kenaf fibres, mechanical properties, starch-grafted-polypropylene

Procedia PDF Downloads 232
6174 Application of Discrete-Event Simulation in Health Technology Assessment: A Cost-Effectiveness Analysis of Alzheimer’s Disease Treatment Using Real-World Evidence in Thailand

Authors: Khachen Kongpakwattana, Nathorn Chaiyakunapruk

Abstract:

Background: Decision-analytic models for Alzheimer’s disease (AD) have been advanced to discrete-event simulation (DES), in which individual-level modelling of disease progression across continuous severity spectra and incorporation of key parameters such as treatment persistence into the model become feasible. This study aimed to apply DES to perform a cost-effectiveness analysis of treatment for AD in Thailand. Methods: A dataset of Thai patients with AD, representing unique demographic and clinical characteristics, was bootstrapped to generate a baseline cohort of patients. Each patient was cloned and assigned to donepezil, galantamine, rivastigmine, memantine or no treatment. Throughout the simulation period, the model randomly assigned each patient to discrete events including hospital visits, treatment discontinuation and death. Correlated changes in cognitive and behavioral status over time were developed using patient-level data. Treatment effects were obtained from the most recent network meta-analysis. Treatment persistence, mortality and predictive equations for functional status, costs (Thai baht (THB) in 2017) and quality-adjusted life years (QALY) were derived from country-specific real-world data. The time horizon was 10 years, with a discount rate of 3% per annum. Cost-effectiveness was evaluated based on the willingness-to-pay (WTP) threshold of 160,000 THB/QALY gained (4,994 US$/QALY gained) in Thailand. Results: Under a societal perspective, only the prescription of donepezil to AD patients at all disease-severity levels was found to be cost-effective. Compared to untreated patients, although the patients receiving donepezil incurred discounted additional costs of 2,161 THB, they experienced a discounted gain in QALY of 0.021, resulting in an incremental cost-effectiveness ratio (ICER) of 138,524 THB/QALY (4,062 US$/QALY). Moreover, providing early treatment with donepezil to mild AD patients further reduced the ICER to 61,652 THB/QALY (1,808 US$/QALY). However, the dominance of donepezil appeared to wane when delayed treatment was given to a subgroup of moderate and severe AD patients [ICER: 284,388 THB/QALY (8,340 US$/QALY)]. Introducing a treatment stopping rule (when the Mini-Mental State Exam (MMSE) score drops below 10) for a mild AD cohort did not deteriorate the cost-effectiveness of donepezil at the current treatment persistence level. On the other hand, none of the AD medications was cost-effective when considered under a healthcare perspective. Conclusions: The DES greatly enhances the real-world representativeness of decision-analytic models for AD. Under a societal perspective, treatment with donepezil improves patients' quality of life and is considered cost-effective when used to treat AD patients at all disease-severity levels in Thailand. The optimal treatment benefits are observed when donepezil is prescribed from the early course of AD. With healthcare budget constraints in Thailand, the implementation of donepezil coverage may be most feasible when coverage starts with mild AD patients, along with the stopping rule introduced.
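To illustrate the general shape of a patient-level discrete-event comparison and the resulting ICER decision rule, the sketch below draws event times (treatment discontinuation, death) per simulated patient, accrues discounted costs and QALYs, and compares the treated and untreated cohorts against a WTP threshold. All event-time distributions, costs and utilities are hypothetical placeholders, not the Thai real-world inputs, so the printed ICER will not match the study's results.

```python
import math
import random

random.seed(0)
DISCOUNT, HORIZON, WTP = 0.03, 10.0, 160_000   # 3 %/yr, 10-yr horizon, THB/QALY threshold

def discounted(value_per_year, start, stop, rate=DISCOUNT):
    """Closed-form continuously discounted accrual of value_per_year over [start, stop] years."""
    if stop <= start:
        return 0.0
    return value_per_year * (math.exp(-rate * start) - math.exp(-rate * stop)) / rate

def simulate_patient(treated):
    # Hypothetical event-time distributions (years).
    t_death = random.expovariate(1 / (6.0 if treated else 5.5))   # assumed survival
    t_stop = random.expovariate(1 / 2.0) if treated else 0.0      # assumed treatment persistence
    end = min(t_death, HORIZON)
    utility = 0.60 if treated else 0.55                           # assumed mean utility
    qaly = discounted(utility, 0.0, end)
    cost = discounted(90_000, 0.0, end)                           # assumed background care cost (THB/yr)
    if treated:
        cost += discounted(20_000, 0.0, min(end, t_stop))         # assumed drug cost while persistent
    return cost, qaly

N = 5000
treated = [simulate_patient(True) for _ in range(N)]
control = [simulate_patient(False) for _ in range(N)]
d_cost = sum(c for c, _ in treated) / N - sum(c for c, _ in control) / N
d_qaly = sum(q for _, q in treated) / N - sum(q for _, q in control) / N
icer = d_cost / d_qaly
verdict = "cost-effective" if icer < WTP else "not cost-effective"
print(f"ICER = {icer:,.0f} THB/QALY ({verdict} at the {WTP:,} THB/QALY threshold)")
```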

Keywords: Alzheimer's disease, cost-effectiveness analysis, discrete event simulation, health technology assessment

Procedia PDF Downloads 129