Search results for: step leaching
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3130

2140 The Effect of Corporate Governance on Earnings Management: When Firms Report Increasing Earnings

Authors: Su-Ping Liu, Yue Tian, Yifan Shen

Abstract:

This study investigates the effect of corporate governance on earnings management when firms have reported a long stream of earnings increases (hereafter referred to as earnings beaters). We expect that high-quality corporate governance decreases the probability of income-increasing earnings management. We employ transparent tools to capture firms’ opportunistic management behavior, specifically the repurchase of stock. In addition, we use corporate governance proxies to measure the degree of corporate governance, including board size, board independence, CEO duality, and the frequency of board meetings. The results hold after controlling for variables suggested in prior literature. We expect this simple technique, a firm’s degree of corporate governance, to serve as an inexpensive first step in detecting earnings management.

Keywords: corporate governance, earnings management, earnings patterns, stock repurchase

Procedia PDF Downloads 177
2139 Efficient Subgoal Discovery for Hierarchical Reinforcement Learning Using Local Computations

Authors: Adrian Millea

Abstract:

In hierarchical reinforcement learning, one of the main issues is the discovery of subgoal states, or options (policies that reach subgoal states), by partitioning the environment in a meaningful way. This partitioning usually requires an expensive global clustering operation or an eigendecomposition of the Laplacian of the state graph. We propose a local solution to this problem, much more efficient than algorithms using global information, which successfully discovers subgoal states by computing a simple function, which we call heterogeneity, for each state as a function of its neighbors. Moreover, we construct a value function that uses the difference in heterogeneity from one step to the next as the reward, so that we can explore the state space much more efficiently than, say, epsilon-greedy exploration. The same principle can then be applied at higher levels of the hierarchy, where the states are the subgoals discovered at the level below.
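
The idea of a purely local, neighborhood-based score can be sketched in a few lines of Python. The graph below and the exact definition of heterogeneity (fraction of neighbors whose degree differs from the state's own) are illustrative assumptions for this sketch, not the authors' formulation:

```python
# Illustrative sketch: flag subgoal candidates in a state graph using only
# local information. The heterogeneity definition here (fraction of neighbors
# whose degree differs from the state's own) is an assumed stand-in.

def heterogeneity(state, graph):
    """Score a state by how much its neighborhood differs from itself."""
    neighbors = graph[state]
    if not neighbors:
        return 0.0
    own_degree = len(neighbors)
    differing = sum(1 for n in neighbors if len(graph[n]) != own_degree)
    return differing / own_degree

# Two tightly connected regions joined by a single "doorway" state 'd',
# the classic example of a subgoal in hierarchical RL.
graph = {
    'a': ['b', 'c', 'd'], 'b': ['a', 'c', 'd'], 'c': ['a', 'b', 'd'],
    'd': ['a', 'b', 'c', 'e', 'f', 'g'],
    'e': ['f', 'g', 'd'], 'f': ['e', 'g', 'd'], 'g': ['e', 'f', 'd'],
}

scores = {s: heterogeneity(s, graph) for s in graph}
# Every neighbor of the doorway has a different degree than it does, so the
# doorway receives the maximal score and is flagged as a subgoal candidate.
candidate = max(scores, key=scores.get)
```

Note that each score touches only a state's immediate neighbors, which is the locality property the abstract contrasts with global clustering or eigendecomposition.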

Keywords: exploration, hierarchical reinforcement learning, locality, options, value functions

Procedia PDF Downloads 171
2138 From Two-Way to Multi-Way: A Comparative Study for Map-Reduce Join Algorithms

Authors: Marwa Hussien Mohamed, Mohamed Helmy Khafagy

Abstract:

Map-Reduce is a programming model widely used to extract valuable information from enormous volumes of data, and it is designed to support heterogeneous datasets. Apache Hadoop’s Map-Reduce implementation is used extensively to uncover hidden patterns in applications such as data mining and SQL-style processing. The most important operation for data analysis is the join, but the Map-Reduce framework does not support join algorithms directly. This paper explains and compares two-way and multi-way Map-Reduce join algorithms; we also implement the MR join algorithms and report the performance of each phase. Our experimental results show that, among the two-way algorithms, the map-side join and map-merge join take the longest time because of the preprocessing step of sorting the data, while among the multi-way algorithms, the reduce-side cascade join takes the longest.
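
A two-way reduce-side (repartition) join, one of the algorithm families compared in the paper, can be sketched in plain Python by simulating the map, shuffle, and reduce phases; the dataset contents and source tags below are invented for illustration:

```python
from collections import defaultdict
from itertools import product

# Simulated two-way reduce-side join: the map phase tags each record with its
# source relation, the shuffle groups records by join key (Hadoop does this
# between phases), and the reduce phase pairs records that share a key.

orders = [(1, 'book'), (2, 'pen'), (1, 'lamp')]        # (customer_id, item)
customers = [(1, 'Alice'), (2, 'Bob'), (3, 'Carol')]   # (customer_id, name)

def map_phase(records, tag):
    # Emit (join_key, (source_tag, payload)) pairs.
    return [(key, (tag, payload)) for key, payload in records]

def shuffle(mapped):
    groups = defaultdict(list)
    for key, tagged in mapped:
        groups[key].append(tagged)
    return groups

def reduce_phase(groups):
    # Within each key group, cross-pair the records from the two relations.
    joined = []
    for key, tagged in groups.items():
        left = [p for t, p in tagged if t == 'O']
        right = [p for t, p in tagged if t == 'C']
        joined.extend((key, l, r) for l, r in product(left, right))
    return joined

mapped = map_phase(orders, 'O') + map_phase(customers, 'C')
result = sorted(reduce_phase(shuffle(mapped)))
# result: [(1, 'book', 'Alice'), (1, 'lamp', 'Alice'), (2, 'pen', 'Bob')]
```

In a real cluster the shuffle is the expensive phase here; the map-side variants discussed in the abstract avoid it at the cost of the pre-sorting step the authors measure.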

Keywords: Hadoop, MapReduce, multi-way join, two-way join, Ubuntu

Procedia PDF Downloads 487
2137 Energy Consumption Models for Electric Vehicles: Survey and Proposal of a More Realistic Model

Authors: I. Sagaama, A. Kechiche, W. Trojet, F. Kamoun

Abstract:

Replacing combustion-engine vehicles with electric vehicles (EVs) has become a major trend in recent years due to their potential benefits, but battery autonomy and the charging process are still big issues for this kind of vehicle. Reducing the energy consumption of electric vehicles therefore becomes a necessity, and much research targets introducing recent information and communication technologies into EVs in order to provide energy-reduction services. Evaluating realistic scenarios remains a major challenge. In this paper, we first survey the different energy consumption models proposed in the literature, then present a comparative study of these models, and finally extend previous work to propose an accurate and realistic energy model for calculating the instantaneous power consumption of electric vehicles.
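
Models of this family typically sum rolling, aerodynamic, grade, and inertial resistances and multiply by speed. The sketch below uses that textbook longitudinal-dynamics formulation with invented parameter values; it is not the model proposed in the paper:

```python
import math

def instantaneous_power(v, a, grade_rad=0.0, mass=1500.0, g=9.81,
                        c_rr=0.01, rho=1.2, cd=0.3, area=2.2, eta=0.9):
    """Instantaneous traction power (W) from a textbook longitudinal model.

    v: speed (m/s), a: acceleration (m/s^2), grade_rad: road slope (rad).
    All default parameter values are illustrative, not measured data.
    """
    f_roll = c_rr * mass * g * math.cos(grade_rad)   # rolling resistance
    f_aero = 0.5 * rho * cd * area * v ** 2          # aerodynamic drag
    f_grade = mass * g * math.sin(grade_rad)         # grade resistance
    f_inertia = mass * a                             # acceleration force
    traction = (f_roll + f_aero + f_grade + f_inertia) * v
    # Divide by drivetrain efficiency when driving; when traction power is
    # negative, regenerative braking recovers only a fraction of it.
    return traction / eta if traction > 0 else traction * eta

# Cruising at 20 m/s on level ground with no acceleration:
p = instantaneous_power(v=20.0, a=0.0)
# Braking makes the traction power negative (energy flows back to the battery):
p_regen = instantaneous_power(v=20.0, a=-2.0)
```

Surveyed models mainly differ in which of these force terms they keep and in how they treat efficiency and regeneration, which is what the comparative study in the paper examines.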

Keywords: electric vehicle, vehicular networks, energy models, traffic simulation

Procedia PDF Downloads 370
2136 Subsidiary Strategy and Importance of Standards: Re-Interpreting the Integration-Responsiveness Framework

Authors: Jo-Ann Müller

Abstract:

The integration-responsiveness (IR) framework presents four distinct internationalization strategies, which differ depending on the extent of pressure the company faces for local responsiveness and global integration. This study applies the framework to standards by examining differences in the relative importance of three types of standards depending on the role the subsidiary plays within the corporate group. Hypotheses are tested empirically in a two-stage procedure. First, the subsidiaries are grouped by performing a cluster analysis. In the second step, the relationship between cluster affiliation and subsidiary strategy is tested using multinomial probit estimation. While the level of local responsiveness of a firm relates to the relative importance of national and international formal standards, the degree of vertical integration is associated with the application of internal company standards.
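
The first stage of the procedure can be illustrated with a minimal k-means sketch in plain Python, clustering subsidiaries by the relative importance (on an invented 0-1 scale) they assign to national, international, and internal company standards. The data, cluster count, and deterministic initialization are assumptions for this sketch, and the second-stage multinomial probit is not reproduced:

```python
# Stage 1 sketch: cluster subsidiaries by the importance profile they assign
# to (national, international, internal company) standards. Data invented.

subsidiaries = [
    (0.90, 0.80, 0.10), (0.80, 0.90, 0.20), (0.85, 0.75, 0.15),  # external focus
    (0.20, 0.10, 0.90), (0.10, 0.20, 0.85), (0.15, 0.25, 0.80),  # internal focus
]

def kmeans(points, k=2, iters=20):
    centers = [points[0], points[-1]]  # deterministic init for the sketch
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute centers as cluster means (keep old center if empty).
        new_centers = []
        for i, c in enumerate(clusters):
            if c:
                new_centers.append(tuple(sum(d) / len(c) for d in zip(*c)))
            else:
                new_centers.append(centers[i])
        centers = new_centers
    return centers, clusters

centers, clusters = kmeans(subsidiaries)
# The two recovered clusters separate the externally and internally oriented
# profiles; cluster membership would then feed the probit stage.
```

In the paper's setting the cluster label then becomes the dependent or explanatory variable in the multinomial probit relating cluster affiliation to subsidiary strategy.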

Keywords: FDI, firm-level data, standards, subsidiary strategy

Procedia PDF Downloads 286
2135 From Social Equity to Spatial Equity in Urban Space: Precedent Study Approach

Authors: Dorsa Pourmojib, Marc J. Boutin

Abstract:

Urban space is used every day by a diverse range of urban dwellers, each with different expectations. In this space, opportunities and resources are not distributed equitably among urban dwellers, despite the importance of inclusivity, and some marginalized groups, including people with low incomes, immigrants from diverse cultures, various age groups, and those with special needs, may not be considered at all. To this end, this research aims to enhance social equity in urban space by bridging the gap between social equity and spatial equity in the urban context. This gap in the urban design knowledge base may exist for several reasons: a lack of studies on the relationship between social equity and spatial equity in urban open space; a lack of practical design strategies for promoting social equity in urban open space; a lack of proper site analysis, in terms of both context and users, when designing new urban open spaces or developing existing ones; a lack of researchers who are also designers; and, finally, the priorities of city policies in addressing such issues, since doing so consumes time, money, and energy. The main objective of this project is to address this gap by exploring the relationship between social equity and spatial equity in urban open space. Answering the main question of this research is a promising step to this end: 'What are the considerations for providing social equity through the design of urban elements that offer spatial equity?' Answering it requires addressing several secondary questions: How can the characteristics of social equity be translated into spatial equity? What are the diverse users' needs, and which of them are not considered on a given site? Which specific elements of the site should be designed in order to promote social equity? What is the current situation of social and spatial equity on the proposed site? To answer these questions and achieve the proposed objectives, a three-step methodology has been implemented. First, a comprehensive research framework based on the available literature is presented. Then, three different urban spaces are analyzed as precedent studies in terms of the key research questions: Naqsh-e Jahan Square (Iran), Superkilen Park (Denmark), and Campo Dei Fiori (Italy), including a gap analysis of the current and proposed situations of these sites. Finally, by combining the design considerations extracted from the precedent studies with the literature review, practical design strategies are introduced as the result of this research. The presented guidelines enable designers to create socially equitable urban spaces. To conclude, this research proposes a spatial approach to social inclusion and equity in urban space by presenting a practical framework and criteria for translating social equity into spatial equity in urban areas.

Keywords: inclusive urban design, social equity, social inclusion, spatial equity

Procedia PDF Downloads 143
2134 Proximal Method of Solving Split System of Minimization Problem

Authors: Anteneh Getachew Gebrie, Rabian Wangkeeree

Abstract:

The purpose of this paper is to introduce an iterative algorithm for solving a split system of minimization problems, given as the task of finding a common minimizer point of a finite family of proper, lower semicontinuous convex functions whose image under a bounded linear operator is also a common minimizer point of another finite family of proper, lower semicontinuous convex functions. We obtain strong convergence of the sequence generated by our algorithm under suitable conditions on the parameters. The iterative schemes are developed with a way of selecting the step sizes such that information about the operator norm is not necessary. Some applications and a numerical experiment are given to analyze the efficiency of our algorithm.
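
The role of a proximal operator and of a step-size rule that needs no operator-norm (Lipschitz constant) information can be illustrated on a scalar toy problem, minimizing f(x) + g(x) with f(x) = (x - 3)² and g(x) = 2|x|. The toy problem and the backtracking rule are illustrative only, not the algorithm of the paper:

```python
def prox_abs(v, t, lam=2.0):
    """Proximal operator of t * lam * |x|: soft-thresholding."""
    thr = t * lam
    if v > thr:
        return v - thr
    if v < -thr:
        return v + thr
    return 0.0

def f(x):            # smooth part
    return (x - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x - 3.0)

def proximal_gradient(x=10.0, t=1.0, beta=0.5, iters=100):
    """Proximal gradient with a backtracking line search on the step size,
    so no Lipschitz constant needs to be known a priori."""
    for _ in range(iters):
        while True:
            x_new = prox_abs(x - t * grad_f(x), t)
            # Sufficient-decrease check for the smooth part f.
            if f(x_new) <= (f(x) + grad_f(x) * (x_new - x)
                            + (x_new - x) ** 2 / (2 * t)):
                break
            t *= beta  # shrink the step and retry
        x = x_new
    return x

x_star = proximal_gradient()
# The optimality condition 2(x - 3) + 2*sign(x) = 0 gives x* = 2 for x > 0.
```

The backtracking loop plays the role of the paper's operator-norm-free step-size selection: the step shrinks automatically until a sufficient-decrease inequality holds.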

Keywords: Hilbert Space, minimization problems, Moreau-Yosida approximate, split feasibility problem

Procedia PDF Downloads 144
2133 Design and Manufacture of Non-Contact Moving Load for Experimental Analysis of Beams

Authors: Firooz Bakhtiari-Nejad, Hamidreza Rostami, Meysam Mirzaee, Mona Zandbaf

Abstract:

Dynamic tests are an important step in the design of engineering structures, because they allow the accuracy of predictions from theoretical-numerical procedures to be assessed. In experimental tests of moving loads, one of the major research topics, the load is typically modeled as a simple moving mass or a small vehicle. This paper deals with the applicability of a Non-Contact Moving Load (NML) for vibration analysis. For this purpose, an experimental set-up is designed to generate different types of NML, including constant and harmonic loads. The proposed method relies on pressurized air, which is useful especially when dealing with fragile or sensitive structures. To demonstrate the performance of this system, the set-up is employed for a modal analysis of a beam and for detecting a crack in the beam. The obtained results indicate that the experimental set-up for NML can be an attractive alternative for moving-load problems.

Keywords: experimental analysis, moving load, non-contact excitation, materials engineering

Procedia PDF Downloads 465
2132 Superlyophobic Surfaces for Increased Heat Transfer during Condensation of CO₂

Authors: Ingrid Snustad, Asmund Ervik, Anders Austegard, Amy Brunsvold, Jianying He, Zhiliang Zhang

Abstract:

CO₂ capture, transport and storage (CCS) is essential to mitigate global anthropogenic CO₂ emissions. To make CCS a widely implemented technology in, e.g., the power sector, the reduction of costs is crucial, and every part of the CCS chain must contribute. By increasing the heat transfer efficiency during liquefaction of CO₂, which is a necessary step in, e.g., ship transportation, the costs associated with the process are reduced. Heat transfer rates during dropwise condensation are up to one order of magnitude higher than during filmwise condensation. Dropwise condensation usually occurs on a non-wetting (superlyophobic) surface: the vapour condenses in discrete droplets, and the non-wetting nature of the surface reduces the adhesion forces and results in shedding of condensed droplets. This, again, results in fresh nucleation sites for further droplet condensation, effectively increasing the liquefaction efficiency. In addition, the droplets themselves have a smaller heat transfer resistance than a liquid film, resulting in increased heat transfer rates from vapour to solid. Surface tension is a crucial parameter for dropwise condensation, due to its impact on the solid-liquid contact angle: a low surface tension usually results in a low contact angle and, in turn, spreading of the condensed liquid on the surface. CO₂ has a very low surface tension compared to water, although at the temperatures and pressures relevant for CO₂ condensation it is comparable to that of organic compounds such as pentane. Dropwise condensation of CO₂ is nevertheless a completely new field of research, so knowledge of several important parameters, such as contact angle and drop size distribution, must be gained in order to understand the nature of the condensation. A new setup has been built to measure these relevant parameters. Its main parts are a pressure chamber, in which the condensation occurs, and a high-speed camera.
The process of CO₂ condensation is visually monitored, and one can determine the contact angle, contact angle hysteresis and, hence, the surface adhesion of the liquid. CO₂ condensation on different surfaces, e.g. copper, aluminium and stainless steel, can be analysed. The experimental setup is built for accurate measurements of the temperature difference between the surface and the condensing vapour and for accurate pressure measurements in the vapour; the temperature is measured directly underneath the condensing surface. The next step of the project will be to fabricate nanostructured surfaces for inducing superlyophobicity. Roughness is a key feature in achieving contact angles above 150° (the limit for superlyophobicity), and controlled, periodic roughness on the nanoscale is beneficial. Surfaces that are non-wetting towards organic non-polar liquids are candidate surface structures for dropwise condensation of CO₂.

Keywords: CCS, dropwise condensation, low surface tension liquid, superlyophobic surfaces

Procedia PDF Downloads 278
2131 'CardioCare': A Cutting-Edge Fusion of IoT and Machine Learning to Bridge the Gap in Cardiovascular Risk Management

Authors: Arpit Patil, Atharav Bhagwat, Rajas Bhope, Pramod Bide

Abstract:

This research integrates IoT and ML to predict heart failure risks, utilizing the Framingham dataset. IoT devices gather real-time physiological data, focusing on heart rate dynamics, while ML, specifically a Random Forest, predicts heart failure. Rigorous feature selection enhances accuracy, achieving a prediction accuracy above 90%. This amalgamation marks a transformative step in proactive healthcare, highlighting the critical role of early detection in cardiovascular risk mitigation. Challenges persist, necessitating continual refinement for improved predictive capabilities.
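
The prediction stage can be illustrated with a miniature bagged ensemble in plain Python: bootstrapped decision stumps voting on a binary "at risk" label. The synthetic features, the risk rule behind them, and the stump-based trees are invented stand-ins for the Framingham features and the full Random Forest used in the paper:

```python
import random

# Toy stand-in for the pipeline: an ensemble of bootstrapped decision stumps
# voting on a binary label. Data and the underlying risk rule are invented.

rng = random.Random(42)
points = [(rng.uniform(50, 110), rng.uniform(100, 180)) for _ in range(200)]
# Invented rule: "at risk" when both resting HR and systolic BP are high.
data = [(p, 1 if p[0] > 85 and p[1] > 140 else 0) for p in points]

def fit_stump(sample):
    """Pick the (feature, threshold) pair with the fewest training errors."""
    best = None
    for f in (0, 1):
        for x, _ in sample:
            thr = x[f]
            err = sum((x2[f] > thr) != y2 for x2, y2 in sample)
            if best is None or err < best[0]:
                best = (err, f, thr)
    _, f, thr = best
    return lambda x, f=f, thr=thr: int(x[f] > thr)

def fit_forest(train, n_trees=25):
    stumps = []
    for _ in range(n_trees):
        sample = [rng.choice(train) for _ in train]  # bootstrap sample
        stumps.append(fit_stump(sample))
    # Majority vote over the ensemble.
    return lambda x: int(sum(s(x) for s in stumps) > len(stumps) / 2)

train, test = data[:150], data[150:]
predict = fit_forest(train)
accuracy = sum(predict(x) == y for x, y in test) / len(test)
```

A production version would use full decision trees and real clinical features; the bootstrap-plus-vote structure is the part this sketch shares with a Random Forest.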

Keywords: cardiovascular diseases, internet of things, machine learning, cardiac risk assessment, heart failure prediction, early detection, cardio data analysis

Procedia PDF Downloads 12
2130 End-of-Life Vehicle Framework in Bumper Development Process

Authors: Majid Davoodi Makinejad, Reza Ghaeli

Abstract:

Developing sustainable and environment-friendly products has become a major concern in the car manufacturing industry. The new ‘End of Life Vehicle’ (ELV) legislation has increased the design complexity of the bumper system, e.g. design for disassembly, design for remanufacturing, and design for recycling. ELV processing involves dismantling, shredding, and landfill. The bumper is designed to prevent physical damage and reduce aerodynamic drag force, as well as being aesthetically pleasing to the consumer. Design for dismantling is the first step of the ELV approach in the bumper system. This study focused on the analysis of ELV value in redesign solutions of the bumper system in comparison with the conventional concept. It provides a guideline addressing the critical considerations in the material, manufacturing, and joining methods of bumper components, to gain advantages in easy dismounting, separation, and recycling.

Keywords: sustainable development, environmental friendly, bumper system, end of life vehicle

Procedia PDF Downloads 385
2129 Quantitative Structure Activity Relationship Model for Predicting the Aromatase Inhibition Activity of 1,2,3-Triazole Derivatives

Authors: M. Ouassaf, S. Belaidi

Abstract:

Aromatase is an estrogen biosynthetic enzyme belonging to the cytochrome P450 family that catalyzes the rate-limiting step in the conversion of androgens to estrogens and is relevant to the promotion of tumor cell growth. A set of thirty 1,2,3-triazole derivatives was used in a quantitative structure-activity relationship (QSAR) study using multiple linear regression (MLR). We divided the data into training and test sets. The results showed a good predictive ability of the MLR model: the model was statistically robust internally (R² = 0.982), and its predictability was tested by several parameters, including external criteria (R²pred = 0.851, CCC = 0.946). The knowledge gained in this study should provide relevant information on the origins of aromatase inhibitory activity and, therefore, facilitate our ongoing quest for aromatase inhibitors with robust properties.
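
The core of such a QSAR model, a multiple linear regression from molecular descriptors to activity plus an R² check, can be sketched in plain Python. The descriptor values and coefficients below are invented for illustration; the paper's thirty-compound dataset is not reproduced:

```python
def fit_mlr(X, y):
    """Least-squares fit of y = b0 + b1*x1 + ... via the normal equations,
    solved with Gaussian elimination (no external libraries)."""
    rows = [[1.0] + list(x) for x in X]            # add intercept column
    p = len(rows[0])
    # Normal equations A b = c, with A = X^T X and c = X^T y.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for i in range(p):                             # forward elimination
        piv = max(range(i, p), key=lambda k: abs(A[k][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for k in range(i + 1, p):
            m = A[k][i] / A[i][i]
            for j in range(i, p):
                A[k][j] -= m * A[i][j]
            c[k] -= m * c[i]
    b = [0.0] * p                                  # back substitution
    for i in reversed(range(p)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    return b

def r_squared(X, y, b):
    preds = [b[0] + sum(bi * xi for bi, xi in zip(b[1:], x)) for x in X]
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, preds))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Invented descriptors (e.g. logP, polar surface area); activities follow
# y = 1 + 2*x1 - 0.5*x2 exactly, so the fit should recover R^2 ~ 1.
X = [(0.5, 10.0), (1.0, 20.0), (1.5, 15.0), (2.0, 30.0), (2.5, 25.0)]
y = [1 + 2 * x1 - 0.5 * x2 for x1, x2 in X]
coefs = fit_mlr(X, y)
r2 = r_squared(X, y, coefs)
```

A real QSAR workflow would add the external validation the abstract mentions, computing R²pred and the concordance correlation coefficient on a held-out test set rather than on the training data.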

Keywords: aromatase inhibitors, QSAR, MLR, 1,2,3-triazole

Procedia PDF Downloads 115
2128 A Step Towards Automating the Synthesis of a Scene Script

Authors: Americo Pereira, Ricardo Carvalho, Pedro Carvalho, Luis Corte-Real

Abstract:

Generating 3D content is a task mostly done by hand. It requires specific knowledge, not only of how to use the tools for the task but also of the fundamentals of a 3D environment. In this work, we show that automatic generation of content from a scene script can be achieved by leveraging existing tools, so that non-experts can easily engage in 3D content generation without spending vast amounts of time exploring and learning specific tools. This proposal carries several benefits, including flexible scene synthesis with different levels of detail. Our preliminary results show that the automatically generated content is comparable to content generated by users with little experience in 3D modeling, while vastly reducing the time required for generation, and it supports flexible scenarios for scene visualization.

Keywords: 3D virtualization, multimedia, scene script, synthesis

Procedia PDF Downloads 267
2127 Improving an Automotive Bumper Structure for Pedestrian Protection

Authors: Mohammad Hassan Shojaeefard, Abolfazl Khalkhali, Khashayar Ghadirinejad

Abstract:

In the present study, first, a three-dimensional finite element model of the lower legform impactor according to the pedestrian protection regulation EC 78/2009 is developed. The FE model of the lower legform impactor is then validated against static and dynamic tests using three main criteria: bending angle, shear displacement, and upper tibia acceleration. In the second step, the validated impactor is employed to evaluate the bumper of a B-class automobile against the pedestrian protection criteria defined in the EC regulation. Finally, based on these investigations, an improved design for the bumper is presented and compared with the base design. Results show that a very good improvement in meeting the pedestrian protection criteria is achieved.

Keywords: pedestrian protection, legform impactor, automotive bumper, finite element method

Procedia PDF Downloads 253
2126 Proposed Alternative System for Existing Traffic Signal System

Authors: Alluri Swaroopa, L. V. N. Prasad

Abstract:

Along with fast urbanization worldwide, traffic control has become a big issue in urban construction, and having an efficient and reliable traffic control system is crucial to macro-traffic control. A traffic signal manages conflicting requirements by allocating different sets of mutually compatible traffic movements during distinct time intervals. Many approaches have been proposed to solve this discrete stochastic problem. Recognizing the need to minimize right-of-way impacts while efficiently handling the anticipated high traffic volumes, the proposed alternative system gives an effective design. This model allows for increased traffic capacity and reduces delays by eliminating a step in maneuvering through the freeway interchange. The concept proposed in this paper involves the construction of bridges and ramps at the intersection of four roads to control vehicular congestion and to prevent traffic breakdown.

Keywords: bridges, junctions, ramps, urban traffic control

Procedia PDF Downloads 554
2125 Bio-Oil Compounds Sorption Enhanced Steam Reforming

Authors: Esther Acha, Jose Cambra, De Chen

Abstract:

Hydrogen is considered an important energy vector for the 21st century. Nowadays, there are some difficulties in implementing the hydrogen economy, and one of them is the high purity required for hydrogen. This energy vector is still mainly produced from fuels, from which hydrogen is obtained as a component of a mixture containing other gases, such as CO, CO₂ and H₂O. A forthcoming sustainable pathway for hydrogen is steam reforming of bio-oils derived from biomass, e.g. via fast pyrolysis. Bio-oils are a mixture of acids, alcohols, aldehydes, esters, ketones, sugars, phenols, guaiacols, syringols, furans, multi-functional compounds and up to 30 wt% water. The sorption enhanced steam reforming (SESR) process is attracting a great deal of attention because it combines hydrogen production and CO₂ separation. In the SESR process, carbon dioxide is captured by an in situ sorbent, which shifts the reversible reforming and water-gas shift reactions to the product side, beyond their conventional thermodynamic limits, giving rise to higher hydrogen production and lower cost. The hydrogen-containing mixture has been obtained from the SESR of bio-oil type compounds. Different types of catalysts have been tested, all containing Ni at around 30 wt%. Two samples were prepared by the wet impregnation technique over conventional (gamma alumina) and non-conventional (olivine) supports, and a third catalyst was prepared over a hydrotalcite-like material (HT). The sorbent employed is a commercial dolomite. The activity tests were performed in a bench-scale plant (PID Eng&Tech), using a stainless steel fixed bed reactor. The catalysts were reduced in situ in the reactor before the activity tests. The effluent stream was cooled down, the condensed liquid was collected and weighed, and the gas phase was analysed online by a micro-GC.
The hydrogen yield and process behavior were analysed both without the sorbent (traditional SR, where a second purification step is needed but which operates in steady state) and with the SESR (where the purification step could be avoided but which operates in batch mode). The influence of the support type and preparation method on the produced hydrogen yield is observed. Additionally, the stability of the catalysts is critical, because the SESR process requires sorption-desorption steps: the produced hydrogen yield and hydrogen purity have to be high and also stable, even after several sorption-desorption cycles. The prepared catalysts were characterized by different techniques to determine the physicochemical properties of the fresh-reduced and used (after the activity tests) materials. The characterization results, together with the activity results, show the influence of the catalyst preparation method and calcination temperature and help explain the observed yields and conversions.

Keywords: CO2 sorbent, enhanced steam reforming, hydrogen

Procedia PDF Downloads 579
2124 Extraction of Rice Bran Protein Using Enzymes and Polysaccharide Precipitation

Authors: Sudarat Jiamyangyuen, Tipawan Thongsook, Riantong Singanusong, Chanida Saengtubtim

Abstract:

Rice is a staple food as well as an exported commodity of Thailand. Rice bran, a 10.5% constituent of the rice grain, is a by-product of the rice milling process. Rice bran is normally used as a raw material for rice bran oil production or sold as feed at a low price. Therefore, this study aimed to increase the value of the defatted rice bran obtained after extracting rice bran oil. Conventionally, the protein in defatted rice bran is extracted using alkaline extraction and acid precipitation, which results in a reduction of nutritious components in the rice bran. Rice bran protein concentrate is suitable for those who are allergic to proteins from other sources, e.g. milk or wheat. In addition to its hypoallergenic property, rice bran protein also contains a good quantity of lysine; thus, it may act as a suitable ingredient for infant food formulations while adding variety to the restricted diets of children with food allergies. The objectives of this study were to compare the properties of rice bran protein concentrate (RBPC) extracted from defatted rice bran using enzymes, together with a precipitation step using polysaccharides (alginate and carrageenan), to those of a control sample extracted using the conventional method. The results showed that extraction of protein from rice bran using enzymes exhibited higher protein recovery than alkaline extraction. Extraction using alcalase 2% (v/w) at 50 °C and pH 9.5 gave the highest protein content (2.44%) and yield (32.09%) in the extracted solution compared to the other enzymes. Rice bran protein concentrate powder prepared with a precipitation step using alginate (protein in solution : alginate = 1:0.006) exhibited the highest protein content (27.55%) and yield (6.62%); precipitation using alginate was better than acid precipitation. RBPC extracted with alkaline (ALK) or the enzyme alcalase (ALC) and then precipitated with alginate (AL) (samples RBP-ALK-AL and RBP-ALC-AL) yielded precipitation rates of 75% and 91.30%, respectively.
Therefore, protein precipitation using alginate was selected. The amino acid profiles of the control sample and the sample precipitated with alginate, compared to casein and soy protein isolate, showed that the control sample had the highest content among all samples. A functional property study of RBP showed that the highest nitrogen solubility occurred at pH 8-10. There was no statistically significant difference in emulsion capacity or emulsion stability between the control and the alginate-precipitated sample; however, the control sample showed higher foaming capacity and lower foam stability than the sample precipitated with alginate. The finding was successful in terms of minimizing the chemicals used in the extraction and precipitation steps of rice bran protein concentrate preparation. This research produces a value-added product in which the protein content (28%) is double the original amount (14%) contained in rice bran, which could be beneficial as an addition to food products, e.g. a healthy drink with high protein and fiber. In addition, basic knowledge of the functional properties of rice bran protein concentrate was obtained, which can be used to select appropriate applications of this value-added product from rice bran.

Keywords: alginate, carrageenan, rice bran, rice bran protein

Procedia PDF Downloads 295
2123 Study of Nano Clay Composites Based on PET

Authors: F. Zouai, F. Z. Benabid, S. Bouhelal, D. Benachoura

Abstract:

PET/clay nanocomposites have been successfully prepared in one step by reactive melt extrusion. The PET was first mixed in the melt state with different amounts of functionalized clay, and it was observed that the composition PET/4 wt% clay showed total exfoliation. This completely exfoliated composition, called nPET, was used to prepare new nPET nanocomposites in the same mixing batch. The nPET was compared to neat PET. The nanocomposites were characterized by different techniques, differential scanning calorimetry (DSC) and wide-angle X-ray scattering (WAXS), and the micro- and nanostructure/property relationships were investigated. From the different WAXS patterns, it is seen that all samples are in the amorphous phase. In addition, the nPET blends present lower Tc values and higher Tm values than the corresponding neat PET. The present study allowed establishing good correlations between the different measured properties.

Keywords: PET, montmorillonite, nanocomposites, exfoliation, reactive melt-mixing

Procedia PDF Downloads 403
2122 Study on Network-Based Technology for Detecting Potentially Malicious Websites

Authors: Byung-Ik Kim, Hong-Koo Kang, Tae-Jin Lee, Hae-Ryong Park

Abstract:

Cyber attacks against specific enterprises or countries have been increasing recently. Such attacks against specific targets are called advanced persistent threats (APT), and they are giving rise to serious social problems. The malicious behaviors of APT attacks mostly affect websites and penetrate enterprise networks to perform malevolent acts. Although many enterprises invest heavily in security to defend against such APT threats, they recognize the APT attacks only after the latter are already in action. This paper discusses the characteristics of APT attacks at each step, as well as the strengths and weaknesses of existing malicious code detection technologies, to check their suitability for detecting APT attacks. It then proposes a network-based malicious behavior detection algorithm to protect enterprise or national networks.

Keywords: Advanced Persistent Threat (APT), malware, network security, network packet, exploit kits

Procedia PDF Downloads 366
2121 Blue Hydrogen Production Via Catalytic Aquathermolysis Coupled with Direct Carbon Dioxide Capture Via Adsorption

Authors: Sherif Fakher

Abstract:

Hydrogen has been gaining a lot of global attention as an uprising contributor in the energy sector. Labeled as an energy carrier, hydrogen is used in many industries and can be used to generate electricity via fuel cells. Blue hydrogen involves the production of hydrogen from hydrocarbons using different processes that emit CO₂. However, the CO₂ is captured and stored. Hence, very little environmental damage occurs during the hydrogen production process. This research investigates the ability to use different catalysts for the production of hydrogen from different hydrocarbon sources, including coal, oil, and gas, using a two-step Aquathermolysis reaction. The research presents the results of experiments conducted to evaluate different catalysts and also highlights the main advantages of this process over other blue hydrogen production methods, including methane steam reforming, autothermal reforming, and oxidation. Two methods of hydrogen generation were investigated including partial oxidation and aquathermolysis. For those two reactions, the reaction kinetics, thermodynamics, and medium were all investigated. Following this, experiments were conducted to test the hydrogen generation potential from both methods. The porous media tested were sandstone, ash, and prozzolanic material. The spent oils used were spent motor oil and spent vegetable oil from cooking. Experiments were conducted at temperatures up to 250 C and pressures up to 3000 psi. Based on the experimental results, mathematical models were developed to predict the hydrogen generation potential at higher thermodynamic conditions. Since both partial oxidation and aquathermolysis require relatively high temperatures to undergo, it was important to devise a method by which these high temperatures can be generated at a low cost. This was done by investigating two factors, including the porous media used and the reliance on the spent oil. 
Of all the porous media tested, ash had the highest thermal conductivity. The second step was the partial combustion of part of the spent oil to generate the heat needed to reach the high temperatures, which reduced the cost of heat generation significantly. For the partial oxidation reaction, the spent oil was burned in a limited oxygen concentration to generate carbon monoxide. The main drawback of this process is the need for burning, which generates other harmful, environmentally damaging gases. Aquathermolysis does not rely on burning, which makes it the cleaner alternative; however, it needs much higher temperatures to run the reaction. When the hydrogen generation potential of both methods was compared using gas chromatography, aquathermolysis generated 23% more hydrogen than partial oxidation from the same volume of spent oil. This research introduces the concept of using spent oil for hydrogen production, a promising way to produce a clean source of energy from a waste product. It can also reduce the reliance on freshwater for hydrogen generation, freeing freshwater for other, more important applications.
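
The temperature extrapolation described above can be pictured with an Arrhenius-type rate model fitted to laboratory measurements; the sketch below is illustrative only (the rate values, temperatures, and two-point fitting approach are assumptions, not the paper's actual model or data).

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def fit_arrhenius(T1, k1, T2, k2):
    """Fit k = A * exp(-Ea / (R*T)) through two (temperature, rate) points.
    Temperatures in kelvin; returns pre-exponential A and activation energy Ea."""
    Ea = R * math.log(k1 / k2) / (1.0 / T2 - 1.0 / T1)
    A = k1 * math.exp(Ea / (R * T1))
    return A, Ea

def predict(A, Ea, T):
    """Hydrogen generation rate predicted at temperature T (kelvin)."""
    return A * math.exp(-Ea / (R * T))

# Illustrative placeholder rates at two lab temperatures (not measured data).
A, Ea = fit_arrhenius(473.15, 0.10, 523.15, 0.35)
k_hot = predict(A, Ea, 573.15)  # extrapolate to a higher temperature
```

Fitting through two points and extrapolating is the simplest possible form of such a model; a real study would regress over many measurements.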

Keywords: blue hydrogen production, catalytic aquathermolysis, direct carbon dioxide capture, CCUS

Procedia PDF Downloads 31
2120 Smart Model with the DEMATEL and ANFIS Multistage to Assess the Value of the Brand

Authors: Hamed Saremi

Abstract:

One of the challenges manufacturing and service companies face in offering a product or service is making the brand recognizable to consumers in target markets: most competitors operate with similar capacities, while the constant threat of damaging internal and external forces can stall a brand's rise and even drive companies toward bankruptcy. This paper identifies and analyzes effective indicators of brand equity and presents an intelligent model to prevent possible damage. Indicators of brand equity were identified through a literature study and, according to expert opinions, the set of indicators was analyzed with the DEMATEL technique; a multi-step Adaptive Neuro-Fuzzy Inference System (ANFIS) was then used to design a multi-stage intelligent system for the assessment of brand equity.
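
The DEMATEL step named above has a standard core computation: normalize the expert direct-influence matrix and derive the total-relation matrix T = N(I − N)⁻¹, from which each indicator's prominence (D + R) and relation (D − R) follow. A minimal sketch with a hypothetical 3-indicator influence matrix (the scores are invented, not the paper's expert data):

```python
import numpy as np

def dematel(direct):
    """Core DEMATEL computation on a direct-influence matrix.
    Returns prominence (D + R) and relation (D - R) per indicator."""
    direct = np.asarray(direct, dtype=float)
    # Normalize by the largest row/column sum so the series (I - N)^-1 converges.
    s = max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
    N = direct / s
    # Total-relation matrix: T = N (I - N)^-1
    T = N @ np.linalg.inv(np.eye(len(N)) - N)
    D = T.sum(axis=1)  # total influence an indicator exerts
    R = T.sum(axis=0)  # total influence an indicator receives
    return D + R, D - R

# Hypothetical expert scores on a 0-4 scale for three brand-equity indicators.
influence = [[0, 3, 2],
             [1, 0, 3],
             [2, 1, 0]]
prominence, relation = dematel(influence)
```

Indicators with positive relation values would be treated as causes and fed forward to the ANFIS stage.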

Keywords: ANFIS, DEMATEL, brand, cosmetic product, brand value

Procedia PDF Downloads 410
2119 Effect of Impurities in the Chlorination Process of TiO2

Authors: Seok Hong Min, Tae Kwon Ha

Abstract:

With the increasing interest in Ti alloys, the extraction of Ti from its typical ore, TiO2, has long been, and will remain, an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed from high-grade TiO2 feedstock. The purity of TiCl4 after chlorination depends on the quality of the titanium feedstock. Since impurities in the crude TiCl4 are carried over into the final products, a purification process is required; it includes fractional distillation and chemical treatment, depending on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, the first step in the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermodynamic software.
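
A thermodynamic screening of the kind FactSage performs ultimately compares Gibbs free energies of candidate reactions; a minimal sketch of that criterion for the carbochlorination reaction is shown below (the ΔH and ΔS values are illustrative placeholders, not FactSage output or literature data).

```python
def gibbs_energy(dH, dS, T):
    """Gibbs free energy of reaction, dG = dH - T*dS.
    dH in kJ/mol, dS in kJ/(mol*K), T in kelvin; dG < 0 means favourable."""
    return dH - T * dS

# Carbochlorination: TiO2 + 2 Cl2 + C -> TiCl4 + CO2.
# The values below are illustrative placeholders only.
dH, dS = -218.0, 0.040
T = 1200.0  # a fluidized-bed chlorination temperature, K (assumed)
dG = gibbs_energy(dH, dS, T)
```

In a real analysis the same comparison would be run for each impurity chloride to judge which impurities report to the TiCl4 stream.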

Keywords: rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage

Procedia PDF Downloads 308
2118 Towards a Resources Provisioning for Dynamic Workflows in the Cloud

Authors: Fairouz Fakhfakh, Hatem Hadj Kacem, Ahmed Hadj Kacem

Abstract:

Cloud computing offers a new service provisioning model for workflow applications, thanks to its elasticity and pay-per-use model. However, it presents various challenges that need to be addressed for it to be utilized efficiently. The resources provisioning problem for workflow applications has been widely studied; nevertheless, existing works do not consider changes to workflow instances while they are executing, a capability that has become a major requirement for dealing with unusual situations and evolution. This paper presents a first step towards resources provisioning for dynamic workflows. We propose a provisioning algorithm that minimizes the overall workflow execution cost while meeting a deadline constraint, and we then extend it to support the dynamic addition of tasks. Experimental results show that our heuristic achieves a significant reduction in resources cost by using a consolidation process.
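
As one way to picture a deadline-constrained, cost-minimizing provisioning heuristic of the kind described, the sketch below greedily assigns each task the cheapest VM type that keeps an optimistic makespan bound within the deadline. The VM types, prices, and sequential single-machine schedule are hypothetical simplifications, not the paper's algorithm:

```python
# Hypothetical VM catalogue: (name, speedup factor, cost per hour).
VM_TYPES = [("small", 1.0, 0.10), ("medium", 2.0, 0.25), ("large", 4.0, 0.60)]

def provision(tasks, deadline):
    """Greedy sketch: for each task (base runtime in hours), pick the
    cheapest VM type such that, even running all remaining work on the
    fastest type, the deadline can still be met.
    Returns (assignments, total cost) or None if infeasible."""
    fastest = max(speed for _, speed, _ in VM_TYPES)
    elapsed, cost, plan = 0.0, 0.0, []
    remaining = sum(tasks)  # base runtime still to schedule
    for base in tasks:
        remaining -= base
        chosen = None
        for name, speed, price in sorted(VM_TYPES, key=lambda v: v[2]):
            runtime = base / speed
            # Optimistic bound: all remaining work on the fastest type.
            if elapsed + runtime + remaining / fastest <= deadline:
                chosen = (name, runtime, runtime * price)
                break
        if chosen is None:
            return None  # the deadline cannot be met
        plan.append(chosen[0])
        elapsed += chosen[1]
        cost += chosen[2]
    return plan, cost

result = provision([2.0, 4.0], deadline=6.0)
```

Dynamic task addition would re-run the same decision for each newly arrived task against the remaining deadline budget.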

Keywords: cloud computing, resources provisioning, dynamic workflow, workflow applications

Procedia PDF Downloads 295
2117 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), owing to the richness of detail it provides, is the gold-standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, well beyond the limits of normal aging; brain volume quantification therefore becomes an essential task for subsequent analysis of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to extract important information manually. This manual analysis is prone to error and time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors with quantitative analyses for disease diagnosis and monitoring. The purpose of this work was therefore to evaluate the brain volume in MRI scans of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was brain extraction by skull stripping from the original image: the algorithm registers a grayscale atlas image to the grayscale patient image, and the associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (the edge of the brain-skull border, with dedicated expansion, curvature, and advection terms).
In the second step, the brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimetres (cc). We observed an average brain volume of 1469.5 cc and conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
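
The second step, counting mask voxels and converting to cc, can be sketched as follows (the mask shape and voxel dimensions are illustrative, not the study's acquisition parameters):

```python
import numpy as np

def brain_volume_cc(mask, voxel_dims_mm):
    """Brain volume from a binary segmentation mask: count the voxels
    inside the mask and multiply by the volume of one voxel.
    voxel_dims_mm: (dx, dy, dz) in millimetres; 1 cc = 1000 mm^3."""
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    return mask.astype(bool).sum() * voxel_mm3 / 1000.0

# Illustrative example: a 30-slice volume with 1 x 1 x 5 mm voxels.
mask = np.zeros((30, 256, 256), dtype=np.uint8)
mask[5:25, 64:192, 64:192] = 1  # toy "brain" region from the skull stripper
volume = brain_volume_cc(mask, (1.0, 1.0, 5.0))
```

In practice the voxel dimensions are read from the DICOM or NIfTI header of each scan rather than hard-coded.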

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 146
2116 Stress Variation of Underground Building Structure during Top-Down Construction

Authors: Soo-yeon Seo, Seol-ki Kim, Su-jin Jung

Abstract:

In the construction of a building, it is necessary to minimize the construction period and to secure enough work space for stacking materials during construction, especially in city areas. To this end, various top-down construction methods have been developed and are widely used in Korea. This paper investigates, through an analytical approach, the stress variation in the underground structure of a building constructed with SPS (Strut as Permanent System), a top-down method used in Korea. Various earth pressure distributions related to ground conditions were considered in the structural analysis of an example structure at each step of the excavation. The analysis showed that the largest member forces in the beams occurred when the ground was medium sandy soil, and a stress concentration was found in the corner areas.

Keywords: construction of building, top-down construction method, earth pressure distribution, member force, stress concentration

Procedia PDF Downloads 307
2115 Self-Supervised Learning for Hate-Speech Identification

Authors: Shrabani Ghosh

Abstract:

Automatic offensive language detection in social media has become a pressing task in today's NLP. Manual offensive language detection is tedious and laborious, so automatic methods based on machine learning are the only practical alternative. Previous works have performed sentiment analysis over social media in supervised, semi-supervised, and unsupervised manners. Domain adaptation in a semi-supervised setting has also been explored in NLP, where the source domain and the target domain differ: the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers such as BERT and RoBERTa can be further pre-trained on a masked language modeling (MLM) task in an unsupervised manner and then fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free-speech platform described as hosting extremism in varying degrees. In the domain adaptation process, Twitter data is used as the source domain and Gab data as the target domain. The performance of domain adaptation also depends on cross-domain similarity. Different distance measures, such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL, have been used to estimate domain similarity; in-domain distances should be small, while between-domain distances are expected to be large. Previous findings show that a pretrained masked language model fine-tuned on a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, while its out-of-domain accuracy on Gab data drops to 56.53%. Recently, self-supervised learning has attracted considerable attention, as it is applicable when labeled data are scarce.
A few works have already applied self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs, and self-supervised attention learning shows better performance because it exploits extracted context words during training. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. The framework initially classifies the Gab dataset in an attention-based self-supervised manner; in the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier, as well as with optimized outcomes obtained from different optimization techniques.
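
Of the domain-similarity measures listed above, MMD is straightforward to sketch: with an RBF kernel, the (biased) squared-MMD estimate is the sum of the mean within-domain kernel values minus twice the cross-domain mean. A minimal sketch on toy embeddings (the dimensions, sample sizes, and kernel width are arbitrary choices, not the paper's setup):

```python
import numpy as np

def mmd_rbf(X, Y, gamma=0.1):
    """Biased squared Maximum Mean Discrepancy between samples X and Y
    with an RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    def kernel(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    return kernel(X, X).mean() + kernel(Y, Y).mean() - 2 * kernel(X, Y).mean()

# Toy "document embeddings": two draws from one distribution vs. a shifted one.
rng = np.random.default_rng(0)
same_a = rng.normal(0, 1, (100, 8))
same_b = rng.normal(0, 1, (100, 8))
shifted = rng.normal(3, 1, (100, 8))
in_domain = mmd_rbf(same_a, same_b)     # expected small
cross_domain = mmd_rbf(same_a, shifted)  # expected large
```

In the domain adaptation setting, X and Y would be sentence embeddings of Twitter and Gab posts respectively.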

Keywords: attention learning, language model, offensive language detection, self-supervised learning

Procedia PDF Downloads 106
2114 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows

Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican

Abstract:

This paper outlines the design of a simulator that allows clinical workflows through a pathology laboratory to be optimised and improves the laboratory's efficiency in the processing, testing, and analysis of specimens. Pathologists often have difficulty pinpointing and anticipating issues in the clinical workflow until tests are running late or in error, and it can be difficult to identify the cause or predict issues that may arise. For example, they often have no indication of how many samples will be delivered to the laboratory on a given day or at a given hour. If scenarios could be modelled using past information and known variables, pathology laboratories could initiate resource preparations, e.g. the printing of specimen labels or the activation of a sufficient number of technicians. This would expedite the clinical workload and clinical processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e. the clinical tests being ordered, the specimens arriving, the current tests being performed, results being validated, and reports being issued. The simulator depicts the movement of specimens through this process, as well as the number of specimens at each stage, using an animated flow diagram that is updated in real time. A traffic-light colour-coding system indicates the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow), allowing pathologists to see clearly where there are issues and bottlenecks in the process. Graphs also indicate the status of specimens at each stage of the process; for example, a graph could show the percentage of specimen tests that are on time, potentially late, running late, and in error.
Clicking on potentially late samples displays more detailed information about those samples, the tests that still need to be performed on them, and their urgency level, allowing any issues to be resolved quickly; in the case of potentially late samples, this could help ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory, and JavaScript will be used to program the logic, animate the movement of samples through each of the stages, and generate the status graphs in real time, with the live information extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. 'Bots' would control the flow of specimens through each step of the process. Like existing software-agent technologies, these bots would be configurable in order to simulate different situations that may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at that step of the process, for example, validating test results.
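
The traffic-light colour coding described above amounts to thresholding a per-stage load measure; the sketch below illustrates the idea in Python, although the simulator itself is described as a JavaScript web application, and the thresholds, stage names, and queue figures are invented for illustration:

```python
def flow_status(waiting, capacity_per_hour):
    """Traffic-light coding for one workflow stage: green for normal flow,
    orange for slow flow, red for critical flow. The backlog thresholds
    (0.5 h and 1.0 h) are illustrative, not the simulator's actual values."""
    load = waiting / capacity_per_hour  # hours of backlog at this stage
    if load < 0.5:
        return "green"
    if load < 1.0:
        return "orange"
    return "red"

# Hypothetical queue lengths and hourly capacities per stage.
statuses = {stage: flow_status(queue, cap)
            for stage, (queue, cap) in {
                "reception": (10, 40),   # 10 specimens waiting, 40/hour
                "testing": (35, 40),
                "validation": (60, 40),
            }.items()}
```

In the real simulator the queue lengths would come live from the Oracle database rather than a static dictionary.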

Keywords: laboratory-process, optimization, pathology, computer simulation, workflow

Procedia PDF Downloads 286
2113 Comparative Analysis of Spectral Estimation Methods for Brain-Computer Interfaces

Authors: Rafik Djemili, Hocine Bourouba, M. C. Amara Korba

Abstract:

In this paper, we present a method for classifying EEG signals for brain-computer interfaces (BCI). The EEG signals are first processed by spectral estimation methods to derive reliable features before the classification step. The spectral estimation methods used are the standard periodogram and the periodogram calculated by the Welch method; both are compared with logarithm of band power (logBP) features. In the proposed method, we apply Linear Discriminant Analysis (LDA) followed by a Support Vector Machine (SVM). The classification accuracy reaches as high as 85%, which demonstrates the effectiveness of spectral methods for classifying EEG signals in BCI.
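
The Welch feature-extraction step can be sketched with a small numpy-only implementation (in practice scipy.signal.welch would typically be used; the toy 12 Hz signal, sampling rate, and band edges below are illustrative, and the LDA/SVM classification step is omitted):

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Welch PSD estimate: average the periodograms of Hann-windowed,
    50%-overlapping segments of the signal x sampled at fs Hz."""
    step = nperseg // 2
    win = np.hanning(nperseg)
    norm = fs * (win ** 2).sum()
    segs = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    psd = np.mean([np.abs(np.fft.rfft(win * s)) ** 2 / norm for s in segs], axis=0)
    psd[1:-1] *= 2  # one-sided spectrum: double all but DC and Nyquist bins
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    return freqs, psd

def log_band_power(freqs, psd, band):
    """Log of summed PSD bins inside [lo, hi) -- a simple logBP-style feature."""
    lo, hi = band
    sel = (freqs >= lo) & (freqs < hi)
    return np.log(psd[sel].sum())

# Toy EEG-like signal: a 12 Hz (mu-band) sinusoid plus noise, fs = 128 Hz.
rng = np.random.default_rng(1)
fs = 128
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 12 * t) + 0.1 * rng.normal(size=t.size)
freqs, psd = welch_psd(x, fs)
mu = log_band_power(freqs, psd, (8, 13))    # feature fed to LDA/SVM
beta = log_band_power(freqs, psd, (13, 30))
```

Band-power features like `mu` and `beta`, computed per channel and trial, would then form the input vectors for the LDA projection and SVM classifier.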

Keywords: brain-computer interface, motor imagery, electroencephalogram, linear discriminant analysis, support vector machine

Procedia PDF Downloads 499
2112 DEA-Based Variable Structure Position Control of DC Servo Motor

Authors: Ladan Maijama’a, Jibril D. Jiya, Ejike C. Anene

Abstract:

This paper presents Differential Evolution Algorithm (DEA) based Variable Structure Position Control (VSPC) of a laboratory DC servomotor (LDCSM). The DEA is employed for the optimal tuning of the Variable Structure Control (VSC) parameters for position control of a DC servomotor. The VSC builds on the techniques of Sliding Mode Control (SMC), which offers small overshoot, improved step-response characteristics, faster dynamic response, adaptability to plant parameter variations, and suppression of the influence of disturbances and uncertainties on system behavior. Simulations of the DEA-based adjustment of the VSC parameters were performed on the Matlab R2010a platform and yielded better dynamic performance than the untuned VSC design.
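
The DEA tuning loop can be sketched generically: DE/rand/1/bin mutation with scaled difference vectors, binomial crossover, and greedy selection over candidate gain vectors. The cost function below is a toy quadratic surrogate, not the paper's servomotor model:

```python
import numpy as np

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin sketch: mutate with a scaled difference vector,
    apply binomial crossover, keep the trial only if it improves the cost."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fitness = np.array([cost(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Three distinct individuals other than i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # guarantee at least one gene
            trial = np.where(cross, mutant, pop[i])
            f = cost(trial)
            if f < fitness[i]:  # greedy selection
                pop[i], fitness[i] = trial, f
    best = fitness.argmin()
    return pop[best], fitness[best]

# Hypothetical VSC-tuning surrogate penalising deviation of a gain pair
# from (2.0, 0.5) -- a stand-in, not the actual step-response cost.
surrogate = lambda g: (g[0] - 2.0) ** 2 + (g[1] - 0.5) ** 2
gains, best_cost = differential_evolution(surrogate, [(0, 5), (0, 2)])
```

In the paper's setting the cost would instead run a closed-loop simulation of the servomotor and score overshoot and settling time.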

Keywords: differential evolution algorithm, laboratory DC servomotor, sliding mode control, variable structure control

Procedia PDF Downloads 415
2111 Contribution to Energy Management in Hybrid Energy Systems Based on Agents Coordination

Authors: Djamel Saba, Fatima Zohra Laallam, Brahim Berbaoui

Abstract:

This paper presents a contribution to the design of a multi-agent energy management system for a hybrid energy system (SEH). The multi-agent-based energy-coordination management system (MA-ECMS) relies mainly on coordination between agents: the agents share tasks and exchange information through communication protocols to achieve the main goal. This intelligent system can fully manage consumption and production, or simply make proposals for the actions it considers best. The initial step is to present the system to be modeled in order to understand its details as fully as possible; in our case, this means implementing a system that simulates an energy management control process.
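
One balancing step that an agent coordination round might implement can be sketched as follows; this is an illustrative supply-demand-battery rule, not the paper's MA-ECMS design (all quantities are hypothetical):

```python
def coordinate(production, consumption, battery, capacity):
    """One coordination step for a hybrid-energy-system sketch: surplus
    charges the battery, deficit discharges it, and any remaining
    shortfall is reported back to the agents as a proposal for action.
    Returns (new battery level, exported power, unmet shortfall), all in kW."""
    balance = production - consumption
    if balance >= 0:
        charged = min(balance, capacity - battery)
        return battery + charged, balance - charged, 0.0
    discharged = min(-balance, battery)
    return battery - discharged, 0.0, -balance - discharged

# Illustrative messages from a PV production agent and a load agent (kW).
level, export, shortfall = coordinate(production=5.0, consumption=3.0,
                                      battery=1.0, capacity=2.0)
```

In a full MA-ECMS, each term would arrive as a message from a dedicated agent (source, storage, load), and the coordinator's proposal would be sent back over the same protocol.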

Keywords: communications protocols, control process, energy management, hybrid energy system, modelization, multi-agents system, simulation

Procedia PDF Downloads 333