Search results for: mining process line
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17928

16608 Comparing the Contribution of General Vocabulary Knowledge and Academic Vocabulary Knowledge to Learners' Academic Achievement

Authors: Reem Alsager, James Milton

Abstract:

Coxhead’s (2000) Academic Word List (AWL) is believed to be essential for students pursuing higher education, helps differentiate English for Academic Purposes (EAP) from General English as a course of study, and is thought to be important for comprehending English academic texts. The AWL has been described as an infrequent, discrete set of vocabulary items separate from general language. On the other hand, it has long been known that general vocabulary knowledge is a good predictor of academic achievement. This study is an attempt to measure and compare the contribution of academic vocabulary knowledge and general vocabulary knowledge to learners’ GPA, to examine which type of knowledge is the better predictor of academic achievement, and to investigate whether the AWL, as a specialised list of infrequent words, relates to the frequency effect. The participants comprised 44 international postgraduate students at Swansea University, all from the School of Management and following taught MSc (Master of Science) programmes. The study employed the Academic Vocabulary Size Test (AVST) and the XK_Lex vocabulary size test. The findings indicate that the AWL is a list based on word frequency rather than a discrete and unique word list, and that the AWL performs the same function as general vocabulary, with tests of each found to measure largely the same quality of knowledge. The findings also suggest that the contribution that AWL knowledge makes to academic success is not sufficient and that general vocabulary knowledge is better at predicting academic achievement. Furthermore, the contribution that academic vocabulary knowledge adds above that of general vocabulary knowledge when the two are combined is remarkably small. These results are in line with the argument that the development of general vocabulary size is an essential quality for academic success and that acquiring the words of the AWL forms part of this process. The AWL by itself does not provide sufficient coverage, and is probably not specialised enough, for knowledge of this list to influence this general process. It can be concluded that the AWL epitomizes only a fraction of the words that are actually needed for academic success in English and that knowledge of academic vocabulary combined with general vocabulary knowledge above the most frequent 3,000 words is what matters most to ultimate academic success.
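
To make the comparison concrete, the sketch below shows one conventional way to compare the two predictors: nested ordinary least squares regressions of GPA on each vocabulary score and on both together, reading off the incremental R². This is an illustration of the kind of analysis described, not the authors' code; the data file and column names are hypothetical.

```python
# Illustrative sketch (not the authors' code): comparing how well general
# vocabulary size (XK_Lex) and academic vocabulary size (AVST) predict GPA.
# The CSV file and all column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("vocab_scores.csv")  # hypothetical columns: xk_lex, avst, gpa

for predictors in (["xk_lex"], ["avst"], ["xk_lex", "avst"]):
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df["gpa"], X).fit()
    # R-squared shows how much GPA variance each knowledge type explains;
    # the increment of the combined model over xk_lex alone mirrors the
    # small added contribution of AWL knowledge reported in the abstract.
    print(predictors, round(model.rsquared, 3))
```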

Keywords: academic achievement, academic vocabulary, general vocabulary, vocabulary size

Procedia PDF Downloads 216
16607 The Impact of Business Process Reengineering on Company Performance through TQM and Enterprise Resource Planning Implementation in Manufacturing Companies in East Java, Indonesia

Authors: Widjojo Suprapto, Zeplin Jiwa Husada Tarigan, Sautma Ronni Basana

Abstract:

Business process reengineering can be conducted through procedure rationalization across all related departments in a company so that all data and business processes are connected. Changes to any business process are used to set working standards, which in turn affect the implementation of ERP and company performance. After collecting and processing data from 77 manufacturing companies, it was found that BPR (Business Process Reengineering) has no direct impact on either the implementation of ERP (Enterprise Resource Planning) in the companies or manufacturing performance; however, it influences the implementation of TQM. The implementation of TQM directly influences the implementation of ERP but does not directly influence company performance. The implementation of ERP gives a significant increase in the work performance of the manufacturing companies in East Java.

Keywords: enterprise resources planning, business process reengineering, TQM, company performance

Procedia PDF Downloads 201
16606 The Curse of Natural Resources: An Empirical Analysis Applied to the Case of Copper Mining in Zambia

Authors: Chomba Kalunga

Abstract:

Many developing countries have a rich endowment of natural resources, yet amidst that wealth, living standards remain poor. At the same time, international copper prices have surged over the last twenty years. This paper presents findings on the causal economic impact of copper mining on living standards in Zambia, a sub-Saharan African country endowed with vast copper deposits, using household data from 1996 to 2010 and exploiting an episode in which copper prices on the international market were rising. Using an instrumental variable approach and controlling for constituency-level and microeconomic factors, the results show a significant impact of copper production on living standards. After splitting the constituencies into those close to and those far away from the nearest mine, the results document that constituencies close to the mines benefited significantly from the increase in copper production compared to their counterparts, through increased levels of employment. Finally, the results are not consistent with the natural resource curse hypothesis; the findings show a positive causal relationship between the presence of natural resources and socioeconomic outcomes in less developed countries, particularly for constituencies close to the mines in Zambia. Some key policy implications follow from these findings. The finding that increased copper production led to an increase in employment suggests that, in Zambia’s context, policies that promote local employment may be more beneficial to residents, meaning that government policies can help improve living standards, and the government needs to work towards making this impact more substantial.
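
The instrumental variable strategy described above can be sketched as a manual two-stage least squares: world copper prices instrument local copper production, whose predicted values then explain living standards. The sketch below is a hedged illustration only; the data file, column names, and instrument details are assumptions, and a production analysis would also correct the second-stage standard errors.

```python
# Hedged sketch of the instrumental-variable idea described above, done as
# manual two-stage least squares with statsmodels. The data file and all
# column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("zambia_households.csv")

# Stage 1: predict local copper production from the world copper price
# (a price shock plausibly exogenous to any single constituency).
stage1 = sm.OLS(df["copper_output"],
                sm.add_constant(df[["world_copper_price"]])).fit()
df["copper_output_hat"] = stage1.fittedvalues

# Stage 2: regress living standards on the instrumented production.
stage2 = sm.OLS(df["living_standard"],
                sm.add_constant(df[["copper_output_hat"]])).fit()
print(stage2.params)  # note: 2SLS standard errors need correction in practice
```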

Keywords: copper prices, local development, mining, natural resources

Procedia PDF Downloads 207
16605 Development of a Thermodynamic Model for Ladle Metallurgy Steel Making Processes Using FactSage and Its Macro Facility

Authors: Prasenjit Singha, Ajay Kumar Shukla

Abstract:

To produce high-quality steel in larger volumes, dynamic control of composition and temperature throughout the process is essential. In this paper, we developed a mass transfer model based on thermodynamics to simulate the ladle metallurgy steel-making process using FactSage and its macro facility. The overall heat and mass transfer processes consist of one equilibrium chamber, two non-equilibrium chambers, and one adiabatic reactor. The flow of material, as well as heat transfer, occurs across the interconnected unit chambers and the reactor. We used the macro programming facility of the FactSage™ software to build the thermochemical model of the secondary steel-making process. In our model, we varied the oxygen content during the process and studied its effect on the composition of the final hot metal and slag. The model has been validated against plant data for the steel composition, which is similar to that of the industrial ladle metallurgy steel-making process. The resulting composition profile serves as a guiding tool to optimize the process of ladle metallurgy in steel-making industries.

Keywords: desulphurization, degassing, FactSage, reactor

Procedia PDF Downloads 207
16604 Microstructural and Mechanical Characterization of a 16MND5 Steel Manufactured by Innovative WAAM SAW Process

Authors: F. Villaret, I. Jacot, Y. Shen, Z. Kong, T. Xu, Y. Wang, D. Lu

Abstract:

Wire Arc Additive Manufacturing (WAAM) allows the rapid production of large, homogeneous parts with complex geometry. In the nuclear field, however, parts can reach dimensions of ten to a hundred tons. In this case, the usual WAAM TIG or CMT processes do not have sufficient deposition rates to allow the manufacture of parts of such dimensions within a reasonable time. The submerged arc welding (SAW) process allows much higher deposition rates. Although there are very few references to this process for additive manufacturing in the literature, it has long been used for the welding and coating of nuclear power plant vessels, so it is well known and mastered as a welding process. This study proposes to evaluate the SAW process as an additive manufacturing technique, taking as an example a low-alloy steel of type 16MND5. In a first step, a parametric study allowed the evaluation of the effect of the different parameters and of the deposition rate on the geometry of the beads and their microstructure. Larger parts were also fabricated and characterized by metallography and mechanical tests (tensile, impact, toughness). The effect of different heat treatments on the microstructure is also studied.

Keywords: WAAM, low alloy steel, submerged arc, characterization

Procedia PDF Downloads 79
16603 Data-Driven Decision Making: A Reference Model for Organizational, Educational and Competency-Based Learning Systems

Authors: Emanuel Koseos

Abstract:

Data-Driven Decision Making (DDDM) refers to making decisions based on historical data in order to inform practice, develop strategies, and implement policies that benefit organizational settings. In educational technology, DDDM facilitates the implementation of differential educational learning approaches such as Educational Data Mining (EDM) and Competency-Based Education (CBE), which commonly target university classrooms. There is a current need for DDDM models applied to middle and secondary schools, arising from the concern to assess the needs, progress, and performance of students and educators with respect to regional standards, policies, and the evolution of curricula. To address these concerns, we propose a DDDM reference model developed using educational key process initiatives as inputs to a machine learning framework implemented with statistical software (SAS, R) to provide a best-practices, low-complexity, and automated approach for educators at their regional level. We assessed the efficiency of the model over a six-year period using data from 45 schools and grades K-12 in the Langley, BC, Canada regional school district. We concluded that the model has wider applicability, for example to business learning systems.
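
As a minimal illustration of the DDDM idea (written in Python here, whereas the authors used SAS and R), the sketch below trains a classifier on historical district records to flag students at risk of missing a regional standard. The data file and all column names are hypothetical.

```python
# Minimal DDDM illustration: learn from historical school indicators to flag
# students at risk against a regional standard. Columns are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("district_records.csv")  # hypothetical K-12 district export
X = df[["attendance", "prior_grade", "assessment_score"]]
y = df["meets_standard"]  # 1 if the student meets the regional standard

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```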

Keywords: competency-based learning, data-driven decision making, machine learning, secondary schools

Procedia PDF Downloads 167
16602 Process Integration: Mathematical Model for Contaminant Removal in Refinery Process Stream

Authors: Wasif Mughees, Malik Al-Ahmad

Abstract:

This research presents a graphical design analysis and a mathematical programming technique to determine the possible water allocation distribution that minimizes water usage in process units. The study involves mass and property integration in its core methodology. The Tehran Oil Refinery is studied to implement water pinch technology for the regeneration, reuse, and recycling of water streams. Process data are manipulated in terms of sources and sinks, which are specified by their properties. Sources are the streams to be allocated; sinks are the units that can accept the sources. Suspended solids (SS) content is taken as a single contaminant. The model reduces the amount of freshwater from 340 to 275 m³/h (19.1%). A redesign and reallocation of the water streams was developed. The graphical technique and the mathematical programming give consistent results, which confirms the mass transfer dependency of the water streams.
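
The mathematical programming step can be illustrated as a small linear program: allocate source streams to sinks so that freshwater use is minimized while each sink's suspended-solids limit is respected. The sketch below uses invented stream data and is only a toy version of the refinery model.

```python
# Toy water-allocation LP: route source streams to sinks to minimise
# freshwater use subject to a suspended-solids (SS) limit at every sink.
# All numbers are invented for illustration.
import numpy as np
from scipy.optimize import linprog

ss_src = np.array([0.0, 120.0, 250.0])   # mg/L: freshwater + 2 process sources
demand = np.array([100.0, 80.0])         # m3/h required by 2 sinks
ss_max = np.array([50.0, 150.0])         # mg/L allowed at each sink

n_src, n_snk = len(ss_src), len(demand)
c = np.array([1.0, 0.0, 0.0] * n_snk)    # minimise freshwater (source 0) only

A_eq, b_eq, A_ub, b_ub = [], [], [], []
for j in range(n_snk):
    row = np.zeros(n_src * n_snk)
    row[j * n_src:(j + 1) * n_src] = 1.0
    A_eq.append(row); b_eq.append(demand[j])          # flow balance per sink
    row = np.zeros(n_src * n_snk)
    row[j * n_src:(j + 1) * n_src] = ss_src - ss_max[j]  # SS limit, linearised
    A_ub.append(row); b_ub.append(0.0)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print("freshwater used:", res.x.reshape(n_snk, n_src)[:, 0].sum(), "m3/h")
```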

Keywords: minimization, water pinch, process integration, pollution prevention

Procedia PDF Downloads 314
16601 Approximating Maximum Speed on Road from Curvature Information of Bezier Curve

Authors: M. Yushalify Misro, Ahmad Ramli, Jamaludin M. Ali

Abstract:

Bezier curves have useful properties for the path generation problem; for instance, they can generate reference trajectories for vehicles that satisfy path constraints. Algorithms of this kind join cubic Bezier curve segments smoothly to generate the path. One of the useful properties of Bezier curves is curvature. In mathematics, curvature is the amount by which a geometric object deviates from being flat, or straight in the case of a line. A simple example is a circle, where the curvature at any point is equal to the reciprocal of its radius: the smaller the radius, the higher the curvature and the more sharply the vehicle needs to turn. In this study, we use Bezier curves to fit a highway-like curve. We use a particular approach to finding the best approximation so that the fit resembles the highway-like curve. We compute the curvature value by analytical differentiation of the Bezier curve and then compute the maximum driving speed using the curvature information obtained. Our work rests on some assumptions: first, that the Bezier curve estimates the real shape of the curve, which can be verified visually. Even though the fitting process does not interpolate the curve of interest exactly, we believe that the estimation of speed is acceptable. We verified our results against a manual calculation of the curvature from the map.
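
A minimal sketch of the curvature-to-speed computation is given below: the curvature of a cubic Bezier curve is evaluated from its analytical derivatives, and a maximum speed is derived from an assumed lateral-acceleration limit. The control points and the limit are illustrative, not values from the paper.

```python
# Curvature of a cubic Bezier by analytical differentiation, then a speed
# bound v_max = sqrt(a_lat / kappa). Control points and a_lat are assumed.
import numpy as np

P = np.array([[0, 0], [40, 5], [70, 40], [100, 50]], float)  # cubic Bezier

def bezier_derivatives(t, P):
    # First and second derivatives of a cubic Bezier at parameter t.
    d1 = 3*(1-t)**2*(P[1]-P[0]) + 6*(1-t)*t*(P[2]-P[1]) + 3*t**2*(P[3]-P[2])
    d2 = 6*(1-t)*(P[2] - 2*P[1] + P[0]) + 6*t*(P[3] - 2*P[2] + P[1])
    return d1, d2

a_lat = 2.0  # m/s^2, assumed maximum comfortable lateral acceleration
for t in np.linspace(0, 1, 5):
    d1, d2 = bezier_derivatives(t, P)
    kappa = abs(d1[0]*d2[1] - d1[1]*d2[0]) / np.linalg.norm(d1)**3
    v_max = np.sqrt(a_lat / kappa) if kappa > 0 else float("inf")
    print(f"t={t:.2f}  curvature={kappa:.5f}  v_max={v_max:.1f} m/s")
```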

Keywords: speed estimation, path constraints, reference trajectory, Bezier curve

Procedia PDF Downloads 370
16600 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered, because processes are typically analysed at the unit operation level, which limits the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is therefore the generation of numerous process alternatives based on phenomena and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is decomposed into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. For example, separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties or drawbacks of the current process or can enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the combinations that are meaningless; for example, phase change phenomena need the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e., it might perform reaction alone or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combining these options for the different functions in the process leads to the generation of a superstructure of process options. These process options, each formed by a list of phenomena per function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to obtain the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical or biochemical process because of its generic nature.
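
The combination-and-screening step can be sketched in a few lines of Python: enumerate subsets of a phenomena list, then discard combinations that violate co-presence rules such as "phase change requires energy transfer". The phenomena names and the second screening rule below are illustrative assumptions.

```python
# Minimal sketch of the combination-and-screening step: enumerate subsets of
# phenomena, then discard combinations that violate co-presence rules.
from itertools import combinations

phenomena = ["mixing", "reaction", "phase_change",
             "energy_transfer", "vap_liq_equilibrium"]

def feasible(combo):
    s = set(combo)
    if "phase_change" in s and "energy_transfer" not in s:
        return False          # phase change requires energy transfer
    if "vap_liq_equilibrium" in s and "phase_change" not in s:
        return False          # assumed screening rule, for illustration only
    return True

options = [c for r in range(1, len(phenomena) + 1)
           for c in combinations(phenomena, r) if feasible(c)]
print(len(options), "feasible combinations, e.g.", options[:3])
```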

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 226
16599 Portable Hands-Free Process Assistant for Gas Turbine Maintenance

Authors: Elisabeth Brandenburg, Robert Woll, Rainer Stark

Abstract:

This paper presents how smart glasses and voice commands can be used to improve the maintenance process of industrial gas turbines. It describes the process of inspecting a gas turbine’s combustion chamber and how it is currently performed using a set of paper-based documents. In order to improve this process, a portable hands-free process assistance system has been conceived. In the following, it is presented how the approach of user-centered design and the method of paper prototyping were successfully applied to design a user interface and a corresponding workflow model that describes the possible interaction patterns between the user and the interface. The evaluation of these results suggests that the assistance system could help users by rendering multiple manual activities obsolete, thus allowing them to work hands-free and to save time when generating protocols.

Keywords: paper prototyping, smart glasses, turbine maintenance, user centered design

Procedia PDF Downloads 315
16598 Conceptual Design of Panel Based Reinforced Concrete Floating Substructure for 10 MW Offshore Wind Turbine

Authors: M. Sohail Hasan, Wichuda Munbua, Chikako Fujiyama, Koichi Maekawa

Abstract:

During the past few years, offshore wind energy has become a key means to reduce carbon emissions. In most previous studies, the floaters in floating offshore wind turbines (FOWT) are made of steel; however, fatigue and corrosion are always major concerns for steel marine structures. Recently, researchers have been working on concrete floating substructures. In this paper, the conceptual design of a pre-cast panel-based, economical, and durable reinforced concrete floating substructure for a 10 MW offshore wind turbine is proposed. A new geometrical shape, i.e., a hexagon with hollow boxes inside, is proposed under static conditions. In designing the outer panels/side walls to resist hydrostatic forces, special consideration is given to durability by limiting the crack width to the permissible range under the service limit state. A comprehensive system is proposed for transferring the ultimate moment and shear due to strong wind at the connection between the steel tower and the concrete floating substructure. Moreover, a stable connection is also designed considering the fatigue of concrete and steel due to stress fluctuation from the mooring line. This conceptual design will be verified by subsequent dynamic analysis.

Keywords: crack width control, mooring line, reinforced concrete floater, steel tower

Procedia PDF Downloads 213
16597 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico

Authors: Ismene Ithai Bras-Ruiz

Abstract:

Learning Analytics (LA) is employed by universities not only as a tool but as a specialized field to support students and professors. However, not all academic programs apply LA with the same goal or use the same tools. In fact, LA is formed by five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can help not just to inform academic authorities about the situation of a program, but also to detect at-risk students, professors with needs, or general problems. At the highest level, Artificial Intelligence techniques are applied to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. Each academic program is expected to decide which field it wants to utilize on the basis of its academic interests, but also of its capacities in terms of professors, administrators, systems, logistics, data analysts, and academic goals. The Open and Distance University System (SUAYED in Spanish) of the National Autonomous University of Mexico (UNAM) has been working for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communication technologies (ICT). Today, UNAM has one of the largest networked higher education programs, with twenty-six academic programs in different faculties. This means that every faculty works with heterogeneous populations and academic problems. In this sense, every program has developed its own learning analytics techniques to address academic issues. In this context, an investigation was carried out to assess the application of LA in the academic programs of the different faculties. The premise of the study was that not all the faculties have utilized advanced LA techniques, and it is probable that they do not know which field of study is closest to their program goals. Consequently, not all the programs know about LA, but this does not mean that they do not work with LA in a veiled or less explicit sense. It is very important to know the degree of knowledge about LA for two reasons: 1) it allows us to appreciate the work of the administration to improve the quality of teaching, and 2) it shows whether it is possible to improve other LA techniques. For this purpose, three instruments were designed to determine experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (academic secretary, systems manager or data analyst, and program coordinator). The final report showed that almost all the programs work with basic statistical tools and techniques; this helps the administration only to know what is happening inside the academic program, but they are not ready to move up to the next level, that is, applying Artificial Intelligence or recommender systems to reach a personalized learning system. This situation is not related to knowledge of LA, but to the clarity of long-term goals.

Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise

Procedia PDF Downloads 124
16596 Optimizing Cell Culture Performance in an Ambr15 Microbioreactor Using Dynamic Flux Balance and Computational Fluid Dynamic Modelling

Authors: William Kelly, Sorelle Veigne, Xianhua Li, Zuyi Huang, Shyamsundar Subramanian, Eugene Schaefer

Abstract:

The ambr15™ bioreactor is a single-use microbioreactor for cell line development and process optimization. The ambr system offers fully automatic liquid handling with the possibility of fed-batch operation and automatic control of pH and oxygen delivery. With operating conditions for large-scale biopharmaceutical production properly scaled down, microbioreactors such as the ambr15™ can potentially be used to predict the effect of process changes such as modified media or different cell lines. In this study, gassing rates and dilution rates were varied for a semi-continuous cell culture system in the ambr15™ bioreactor. The corresponding changes in metabolite production and consumption, as well as in cell growth rate and therapeutic protein production, were measured. Conditions were identified in the ambr15™ bioreactor that produced metabolic shifts and specific metabolic and protein production rates also seen in the corresponding larger (5 liter) scale perfusion process. A dynamic flux balance (DFB) model was employed to understand and predict the metabolic changes observed. The DFB model predicted trends observed experimentally, including lower specific glucose consumption when CO₂ was maintained at higher levels (i.e., 100 mm Hg) in the broth. A computational fluid dynamic (CFD) model of the ambr15™ was also developed to understand the transfer of O₂ and CO₂ to the liquid. This CFD model predicted gas-liquid flow in the bioreactor using the ANSYS software. The two-phase flow equations were solved via an Eulerian method, with population balance equations tracking the size of the gas bubbles resulting from breakage and coalescence. Reasonable results were obtained in that the carbon dioxide mass transfer coefficient (kLa) and the air hold-up increased with higher gas flow rate. Volume-averaged kLa values at 500 RPM increased as the gas flow rate was doubled and matched experimentally determined values. These results form a solid basis for optimizing the ambr15™, using both CFD and flux balance modelling approaches together, for use in microscale simulations of larger-scale cell culture processes.
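
As a highly simplified stand-in for the dynamic flux balance idea, the sketch below advances biomass and glucose in time from a Monod-type uptake rate; a real DFB model would solve a linear program for the intracellular fluxes at each step, which a fixed yield replaces here. All parameter values are invented.

```python
# Simplified stand-in for dynamic flux balance: at each time step an uptake
# rate is computed from the current glucose level, then biomass and glucose
# are updated by explicit Euler integration. Parameters are invented.
Vmax, Km, Y = 0.5, 0.2, 0.4   # uptake kinetics and biomass yield (assumed)
X, S, dt = 0.1, 20.0, 0.1     # initial biomass, glucose, time step (h)

for _ in range(int(48 / dt)):          # simulate 48 h of culture
    v_glc = Vmax * S / (Km + S)        # specific glucose uptake (Monod form)
    mu = Y * v_glc                     # growth rate via a fixed yield
    X += mu * X * dt
    S = max(S - v_glc * X * dt, 0.0)
print(f"final biomass {X:.2f}, residual glucose {S:.2f}")
```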

Keywords: cell culture, computational fluid dynamics, dynamic flux balance analysis, microbioreactor

Procedia PDF Downloads 272
16595 SEM Image Classification Using CNN Architectures

Authors: Güzin Tirkeş, Özge Tekin, Kerem Kurtuluş, Y. Yekta Yurtseven, Murat Baran

Abstract:

A scanning electron microscope (SEM) is a type of electron microscope used mainly in nanoscience and nanotechnology. Automatic image recognition and classification are among the general areas of application concerning SEM. In line with these uses, the present paper proposes a deep learning algorithm that classifies SEM images into nine categories by means of an online application to simplify the process. The NFFA-EUROPE 100% SEM data set, containing approximately 21,000 images, was used to train and test the algorithm with an 80%/20% split. Validation was carried out using a separate data set obtained from the Middle East Technical University (METU) in Turkey. To increase the accuracy of the results, the Inception-ResNet-V2 model was used with a fine-tuning approach. By using a confusion matrix, it was observed that the coated-surface category has a negative effect on the accuracy of the results, since it overlaps with other categories in the data set, thereby confusing the model when detecting category-specific patterns. For this reason, the coated-surface category was removed from the training data set, increasing accuracy to up to 96.5%.
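
A hedged sketch of the fine-tuning setup named above is shown below: an ImageNet-pretrained Inception-ResNet-V2 backbone with a new softmax head (eight classes once the coated-surface category is dropped). The directory layout, image size, and hyperparameters are assumptions, not the authors' values.

```python
# Sketch of Inception-ResNet-V2 fine-tuning for SEM image classification.
# Paths, image size and hyperparameters are assumptions.
import tensorflow as tf

base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=(299, 299, 3))
base.trainable = True  # fine-tune the whole backbone

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(8, activation="softmax"),  # SEM categories
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

train_ds = tf.keras.utils.image_dataset_from_directory(
    "sem_images/train", image_size=(299, 299))  # hypothetical layout
model.fit(train_ds, epochs=5)
```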

Keywords: convolutional neural networks, deep learning, image classification, scanning electron microscope

Procedia PDF Downloads 118
16594 Component Comparison of Polyaluminum Chloride Produced from Various Methods

Authors: Wen Po Cheng, Chia Yun Chung, Ruey Fang Yu, Chao Feng Chen

Abstract:

The main objective of this research was to study the differences in aluminum hydrolytic products between two PACl preparation methods: the acidification of freshly formed amorphous Al(OH)₃ and the conventional alkalization of an aluminum chloride solution. According to the Ferron test and ²⁷Al NMR analysis of the two PACl preparation procedures, both the reaction rate constant (k) values and the Al₁₃ percentage of the acid addition process at high basicity values were lower than those of the alkaline addition process. The results suggest that the molecular structure and size distribution of the aluminum species in the two preparation methods differ significantly at high basicity values.

Keywords: polyaluminum chloride, Al13, amorphous aluminum hydroxide, Ferron test

Procedia PDF Downloads 368
16593 Determination of the Toxicity of a Lunar Dust Simulant on Human Alveolar Epithelial Cells and Macrophages in vitro

Authors: Agatha Bebbington, Terry Tetley, Kathryn Hadler

Abstract:

Background: Astronauts will set foot on the Moon later this decade and are at high risk of lunar dust inhalation. Freshly fractured lunar dust produces reactive oxygen species in solution, which are known to cause cellular damage and inflammation. Cytotoxicity and inflammatory mediator release were measured in pulmonary alveolar epithelial cells (cells that line the gas-exchange zone of the lung) exposed to a lunar dust simulant, LMS-1. It was hypothesised that freshly fractured LMS-1 would result in increased cytotoxicity and inflammatory mediator release, owing to the angular morphology and high reactivity of fractured particles. Methods: A human alveolar epithelial type 1-like cell line (TT1) and a human macrophage-like cell line (THP-1) were exposed to 0-200 μg/ml of unground, aged-ground, and freshly-ground LMS-1 (screened at <22 μm). Cell viability, cytotoxicity, and inflammatory mediator release (IL-6, IL-8) were assessed using MTT, LDH, and ELISA assays, respectively. LMS-1 particles were characterised for their size, surface area, and morphology before and after grinding. Results: Exposure to LMS-1 particles did not result in overt cytotoxicity in either TT1 epithelial cells or THP-1 macrophage-like cells. A dose-dependent increase in IL-8 release was observed in TT1 cells, whereas THP-1 cell exposure, even at low particle concentrations, resulted in increased IL-8 release. Both cytotoxic and pro-inflammatory responses were most marked, and significantly greater, in TT1 and THP-1 cells exposed to freshly fractured LMS-1. Discussion: LMS-1 is a novel lunar dust simulant; this is the first study to determine its toxicological effects on respiratory cells in vitro. The increased inflammatory response in TT1 and THP-1 cells exposed to ground LMS-1 suggests that small particle size, increased surface area, and angularity likely contribute to toxicity. Conclusions: Even low levels of exposure to LMS-1 could result in alveolar inflammation. This may have pathological consequences for astronauts exposed to lunar dust on future long-duration missions. Future research should test the effect of low-dose, intermittent lunar dust exposure on the respiratory system.

Keywords: lunar dust, LMS-1, lunar dust simulant, long-duration space travel, lunar dust toxicity

Procedia PDF Downloads 207
16592 Reliability Assessment for Tie Line Capacity Assistance of Power Systems Based on Multi-Agent System

Authors: Nadheer A. Shalash, Abu Zaharin Bin Ahmad

Abstract:

Technological developments in industry are increasingly related to interconnected system assistance and distribution networks. This is important in order to enable an electrical load to continue receiving power in the event of disconnection from the main power grid. This paper presents a method for the reliability assessment of interconnected power systems based on a multi-agent system. The multi-agent system consists of four agents. The first is the generator agent, used to connect the generator to the grid depending on the state of the reserve margin and the load demand. The second is a load agent, located at the load. The third is the so-called reserve margin agent, which limits the reserve margin to 0-25% depending on the load and the generator unit size. Finally, a reliability calculation agent computes the expected energy not supplied (EENS), the loss of load expectation (LOLE), and the effect of tie line capacity in order to determine risk levels. The Roy Billinton Test System (RBTS) is used to evaluate the reliability indices by means of the developed JADE package. The estimated reliability results for the interconnected power systems are presented in this paper. The overall reliability of the power system can be improved; thus, the system becomes better prepared for increasing demand, with the generation units operated in relation to the reliability indices.
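
The two reliability indices named above can be illustrated directly: build a capacity outage probability table from the generator forced outage rates, then accumulate LOLE and EENS over a load profile. The generator data and load profile below are invented, not RBTS values.

```python
# Illustrative LOLE/EENS calculation from a capacity outage probability
# table built from invented generator data (not the RBTS system).
from itertools import product

units = [(40, 0.03), (40, 0.03), (30, 0.02)]  # (capacity MW, forced outage rate)
daily_peaks = [70, 75, 80, 85, 90]            # MW, a tiny stand-in load profile

lole, eens = 0.0, 0.0
for load in daily_peaks:
    for states in product([0, 1], repeat=len(units)):  # 1 = unit available
        p, cap = 1.0, 0.0
        for (c, q), s in zip(units, states):
            p *= (1 - q) if s else q
            cap += c * s
        if cap < load:
            lole += p                      # expected days with load loss
            eens += p * (load - cap) * 24  # MWh not supplied that day
print(f"LOLE = {lole:.3f} days, EENS = {eens:.1f} MWh over the sample period")
```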

Keywords: reliability indices, load expectation, reserve margin, daily load, probability, multi-agent system

Procedia PDF Downloads 319
16591 Co-Gasification Process for Green and Blue Hydrogen Production: Innovative Process Development, Economic Analysis, and Exergy Assessment

Authors: Yousaf Ayub

Abstract:

A co-gasification process, which involves the utilization of both biomass and plastic waste, has been developed to enable the production of blue and green hydrogen. To support this endeavor, an Aspen Plus simulation model has been meticulously created, and a sustainability analysis is being conducted, focusing on economic viability, energy efficiency, advanced exergy considerations, and exergoeconomic evaluations. In terms of economic analysis, the process has demonstrated strong economic sustainability, as evidenced by an internal rate of return (IRR) of 8% at a process efficiency level of 70%. At present, the process has the potential to generate approximately 1100 kWh of electric power, with any excess electricity beyond the process requirements capable of being harnessed for green hydrogen production via an alkaline electrolysis cell (AEC). This surplus electricity translates to a potential daily hydrogen production of around 200 kg. The exergy analysis of the model highlights that the gasifier component exhibits the lowest exergy efficiency, resulting in the highest energy losses, amounting to approximately 40%. Additionally, the advanced exergy analysis findings pinpoint the gasifier as the primary source of exergy destruction, totaling around 9000 kW, with associated exergoeconomic costs amounting to 6500 $/h. Consequently, improving the gasifier's performance is a critical focal point for enhancing the overall sustainability of the process, encompassing energy, exergy, and economic considerations.
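
The internal rate of return quoted above can be reproduced in principle as the discount rate at which the project's net present value is zero; the sketch below finds it by bisection on a purely illustrative cash-flow series, not the project's actual figures.

```python
# IRR by bisection: find the rate at which NPV of the cash flows is zero.
# The cash-flow series is invented so that the IRR lands near the 8% quoted.
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-7):
    # simple bisection; assumes NPV changes sign exactly once on [lo, hi]
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

flows = [-5_000_000] + [585_000] * 15   # capex followed by 15 annual returns
print(f"IRR = {irr(flows):.1%}")        # ~8.0% for this illustrative series
```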

Keywords: blue hydrogen, green hydrogen, co-gasification, waste valorization, exergy analysis

Procedia PDF Downloads 53
16590 Permanent Reduction of Arc Flash Energy to Safe Limit on Line Side of 480 Volt Switchgear Incomer Breaker

Authors: Abid Khan

Abstract:

A recognized engineering challenge relates to protecting personnel from fatal arc flash incident energy on the line side of 480-volt switchgear incomer breakers during maintenance activities. The incident energy is typically high due to slow fault clearance, and it can be higher than the available personal protective equipment (PPE) ratings. A fault in this section of the switchgear is cleared by breakers or fuses in the upstream higher-voltage system (4160 volts or higher). The fault current reflected in the higher-voltage upstream system for a fault in the 480-volt switchgear is low, so the clearance time is slower and the incident energy, which grows with clearing time, is hence higher. Installing overcurrent protection on the 480-volt system upstream of the incomer breaker will operate fast enough to trip the upstream higher-voltage breaker when a fault develops at the incomer breaker. The slow clearance caused by the reduced fault current reflection in the upstream higher-voltage system is thereby eliminated. Since the fast overcurrent protection is permanently installed, it is always functional, does not require human intervention, and eliminates exposure to human error. It is installed at the location of the maintenance activities, and its operation can be locally monitored by craftsmen during maintenance.

Keywords: arc flash, mitigation, maintenance switch, energy level

Procedia PDF Downloads 188
16589 The Effects of Highly Active Antiretroviral Therapy (HAART) on the Expression of MUC1 and P65 in a Cervical Cancer Cell Line, HCS-2

Authors: K. R. Thabethe, G. A. Adefolaju, M. J. Hosie

Abstract:

Cervical cancer is the third most commonly diagnosed cancer globally, and it is one of three AIDS-defining malignancies. Highly active antiretroviral therapy (HAART) is a combination of three or more antiretroviral drugs and has been shown to play a significant role in reducing the incidence of some AIDS-defining malignancies, although its effect on cervical cancer is still unclear. The aim of this study was to investigate the relationship between cervical cancer and HAART. This was achieved by studying the expression of two signalling molecules expressed in cervical cancer, MUC1 and P65. Following the 24-hour treatment of a cervical cancer cell line, HCS-2, with drugs commonly used as part of HAART at their clinical plasma concentrations, real-time qPCR and immunofluorescence were used to study gene and protein expression. A one-way ANOVA followed by a Tukey-Kramer post hoc test was conducted on both sets of data using JMP 11 software. The drug classified as a protease inhibitor (PI), i.e., LPV/r, reduced MUC1 and P65 gene and protein expression more than the other drugs tested. PIs are known to play a significant role in cell death; the cells were therefore thought to be more susceptible to cell death following treatment with PIs. In conclusion, the drugs used, especially the PI, showed some anticancer effects by facilitating cell death through decreased gene and protein expression of MUC1 and P65, and they present promising agents for cancer treatment.

Keywords: cervical cancer, HAART, MUC1, P65

Procedia PDF Downloads 326
16588 Influence of Temperature and Precipitation Changes on Desertification

Authors: Kukuri Tavartkiladze, Nana Bolashvili

Abstract:

The purpose of this paper was to separate and study the part of the climate regime structure that directly affects the desertification process. A simple scheme was prepared for the assessment of the desertification process; surface air temperature and precipitation for the years 1936-2009 were analyzed. A map of the distribution of the Desertification Contributing Coefficient over the territory of Georgia was compiled. The simple scheme for identifying the intensity of the desertification-contributing process has been developed, and an illustrative example of its practical application to the territory of Georgia has been presented.

Keywords: aridity, climate change, desertification, precipitation

Procedia PDF Downloads 332
16587 Focus on the Sustainable Future of New Vernacular Architecture: Building "Vernacular Consciousness" in the New Era

Authors: Ji Min China

Abstract:

The 20th century was the century of globalization: developed transportation and the progress of information media turned the earth into a global village. Differences between regions are increasingly reduced, the phenomenon of "cultural convergence" has intensified, and regional characteristics and traditional culture have been eroded. In the field of architecture, having experienced the baptism of orderly, rational modernism, it is increasingly recognized that the universal international style, which set aside cultural differences and forced buildings to conform, has become outdated. At the same time, environmental issues have received more and more attention in the 21st century, and the concepts of sustainable development and sustainable building have been proposed. This has led architects at home and abroad to explore the possibilities of new vernacular architecture that reflects local cultural characteristics, a viable and diversified architectural tendency that has won domestic and foreign architects' favor. Against the background of the production and creative processes of new vernacular architecture at home and abroad, the author selects some outstanding examples for analysis and discussion and then reinterprets "new vernacular architecture" in China today. The paper pays particular attention to how to grasp the true meaning of the here-and-now "new vernacular" as well as the multiple dimensions of its future sustainability. This also means the paper offers a two-way, multi-dimensional understanding and exploration of the "new vernacular".

Keywords: new vernacular architecture, regional culture, multi dimension, sustainable

Procedia PDF Downloads 443
16586 Spectrofluorimetric Investigation of Copper (II), Cobalt (II), Calcium (II), and Ferric (III) Influence on the Ciprofloxacin Binding to Bovine Serum Albumin

Authors: Ahmed K. Youssef, Shawkat M. B. Aly

Abstract:

The interaction between ciprofloxacin (Cip) and bovine serum albumin (BSA) was investigated by UV-Visible absorption and fluorescence spectroscopy. The influence of Cu²⁺, Ca²⁺, Co²⁺, and Fe³⁺ on the Cip-BSA interaction was investigated. The quenching of the BSA fluorescence emission in the presence of ciprofloxacin, as well as the influence of the metal ions on the interaction, was analyzed using the Stern-Volmer equation. The Stern-Volmer quenching constant Kₛᵥ was calculated in the presence and absence of the metal ions at the physiological pH of 7.4 using phosphate buffer. The experimental results showed that the interaction is mainly static in nature and that the quenching rate constant decreases in the presence of the studied metal ions, with the exception of Cu²⁺. The decrease observed in the Kₛᵥ values in the presence of Co²⁺, Ca²⁺, and Fe³⁺ can be understood on the basis of competition between these metals and Cip when both are present in the BSA solution. Cu²⁺ induces the interaction between Cip and BSA at faster quenching rates, as inferred from the observed increase in the Kₛᵥ value. This allowed us to propose that copper(II) ions are directly involved in the process of Cip binding to BSA. The binding constant for Cip on BSA was determined, and the effect of the metal ions on it was examined as well; the binding constant values were in line with the Kₛᵥ values.
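
A Stern-Volmer analysis of the kind described can be sketched as a linear fit of F₀/F against quencher concentration, with Kₛᵥ read from the slope. The intensity values below are invented for illustration, not the measured data.

```python
# Stern-Volmer fit: F0/F = 1 + Ksv*[Q]; Ksv is the slope of F0/F vs [Q].
# Intensity values are made up for illustration.
import numpy as np

q = np.array([0.0, 2.0, 4.0, 6.0, 8.0]) * 1e-6    # [Cip], mol/L
f = np.array([100.0, 81.0, 68.0, 59.0, 52.0])     # BSA emission intensity

ratio = f[0] / f                                  # F0/F
slope, intercept = np.polyfit(q, ratio, 1)
print(f"Ksv = {slope:.2e} L/mol (intercept {intercept:.2f}, expected ~1)")
```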

Keywords: bovine serum albumin, ciprofloxacin, fluorescence, metal ions effect

Procedia PDF Downloads 387
16585 Comparison of Sourcing Process in Supply Chain Operation References Model and Business Information Systems

Authors: Batuhan Kocaoglu

Abstract:

Despite using powerful systems like ERP (Enterprise Resource Planning), companies still cannot easily benchmark their processes and measure their process performance based on predefined SCOR (Supply Chain Operation References) terms. The purpose of this research is to identify common and corresponding processes in order to present a conceptual model for modelling and measuring the purchasing process of an organization. The main steps of the research study are: a literature review of the 'procure to pay' process in ERP systems; a literature review of the 'sourcing' process in the SCOR model; and the development of a conceptual model integrating the 'sourcing' process of the SCOR model with the 'procure to pay' process of the ERP model. In this study, we examined the similarities and differences between these two models. The proposed framework is based on assumptions drawn from (1) the body of literature and (2) the authors’ experience of working in the field of enterprise and logistics information systems. The modelling framework provides a structured and systematic way to model and decompose the necessary information from conceptual representation to process element specification. This conceptual model will help organizations to build quality purchasing system measurement instruments and tools, and the proposed adaptation issues for ERP systems and the SCOR model will provide a more benchmarkable and worldwide-standard business process.

Keywords: SCOR, ERP, procure to pay, sourcing, reference model

Procedia PDF Downloads 356
16584 A Survey on Intelligent Techniques Based Modelling of Size Enlargement Process for Fine Materials

Authors: Mohammad Nadeem, Haider Banka, R. Venugopal

Abstract:

Granulation or agglomeration is a size enlargement process that transforms fine particulates into larger aggregates, since the fine size of available materials and minerals poses difficulties for their utilization. Though a long list of methods is available in the literature for modelling the granulation process to facilitate in-depth understanding and interpretation of the system, there is still scope for improvement using novel tools and techniques. Intelligent techniques, such as artificial neural networks, fuzzy logic, self-organizing maps, support vector machines, and others, have emerged as compelling alternatives for dealing with the imprecision and complex non-linearity of such systems. The present study reviews the applications of intelligent techniques in the modelling of size enlargement processes for fine materials.

Keywords: fine material, granulation, intelligent technique, modelling

Procedia PDF Downloads 367
16583 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception

Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu

Abstract:

Opinion mining (OM) is one of the natural language processing (NLP) problems that aims to determine the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM are usually collected from various social media platforms. In an era where social media has considerable control over companies’ futures, it is worth understanding social media and taking action accordingly. OM comes to the fore here as the scale of the discussion about companies increases and it becomes unfeasible to gauge opinion at the individual level. Thus, companies opt to automate this process by applying machine learning (ML) approaches to their data. For the last two decades, OM, or sentiment analysis (SA), has mainly been performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to bag-of-n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. The transfer learning paradigm, which has been commonly used in computer vision (CV) problems, has lately started to shape NLP approaches and language models (LM). This gave a sudden rise to the usage of pretrained language models (PTM), which contain language representations obtained by training on large datasets using self-supervised learning objectives. The PTMs are further fine-tuned on a specialized downstream task dataset to produce efficient models for various NLP tasks such as OM, named-entity recognition (NER), question answering (QA), and so forth. In this study, traditional and modern NLP approaches have been evaluated for OM using a sizable corpus belonging to a large private company, containing about 76,000 comments in Turkish: an SVM with a bag of n-grams, and two chosen pre-trained models, the multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT). The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. The BERT model is monolingual in our case and based on transformer neural networks. It uses masked language modelling and next-sentence prediction tasks that allow the bidirectional training of the transformers. During the training phase, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since the experiments showed that their contribution to model performance was insignificant, even though Turkish is a highly agglutinative and inflective language. The results show that the use of deep learning methods with pre-trained models and fine-tuning achieves about an 11% improvement over the SVM for OM. The BERT model achieved around 94% prediction accuracy, while the MUSE model achieved around 88% and the SVM around 83%. The MUSE multilingual model shows better results than the SVM but still performs worse than the monolingual BERT model.
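
The traditional baseline described above can be sketched in a few lines with scikit-learn: a linear SVM over TF-IDF bag-of-n-gram features. Since the 76,000-comment corpus is proprietary, the texts and labels below are placeholders.

```python
# Sketch of the traditional OM baseline: linear SVM over bag-of-n-gram
# (TF-IDF) features. The tiny corpus here is a placeholder only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["ürün harika", "kargo çok geç geldi", "fena değil"]   # placeholder
labels = ["positive", "negative", "neutral"]                   # placeholder

svm = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),  # word uni- and bi-grams
    LinearSVC(C=1.0),
)
svm.fit(texts, labels)
# With the real corpus one would report cross-validated accuracy here
# (the abstract cites ~83% for this baseline).
print(svm.predict(["teslimat hızlıydı"]))  # hypothetical new comment
```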

Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish

Procedia PDF Downloads 136
16582 Single-Molecule Optical Study of Cholesterol-Mediated Dimerization Process of EGFRs in Different Cell Lines

Authors: Chien Y. Lin, Jung Y. Huang, Leu-Wei Lo

Abstract:

A growing body of data reveals that membrane cholesterol molecules can alter the signaling pathways of living cells. However, the understanding of how membrane cholesterol modulates receptor proteins is still lacking. Single-molecule tracking can effectively probe the microscopic environments and thermal fluctuations of receptor proteins in a living cell. In this study, we apply single-molecule optical tracking to the ligand-induced dimerization process of EGFRs in the plasma membranes of two cancer cell lines (HeLa and A431) and one normal epithelial cell line (MCF12A). We tracked individual EGFRs as well as receptor pairs diffusing in a correlated manner in the plasma membranes of live cells. We developed an energetic model by integrating the generalized Langevin equation with the Cahn-Hilliard equation to help extract important information from single-molecule trajectories. From this study, we discovered that ligand-bound EGFRs move from non-raft areas into lipid raft domains. This ligand-induced motion is a common behavior in both cancer and normal cells. By manipulating the total amount of membrane cholesterol with methyl-β-cyclodextrin and the local concentration of membrane cholesterol with nystatin, we further found that the amount of cholesterol can affect the stability of EGFR dimers. The EGFR dimers in the plasma membrane of normal cells are more sensitive to local changes in cholesterol concentration than the EGFR dimers in the cancer cells. Our method successfully captures dynamic interactions of receptors at the single-molecule level and provides insight into the functional architecture of both the diffusing EGFR molecules and their local cellular environment.
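
A common first step when analysing such trajectories, shown below as a generic sketch rather than the authors' energetic model, is to compute the mean squared displacement (MSD): free diffusion gives an MSD linear in lag time, whereas confinement in raft domains makes it plateau. The trajectory here is synthetic.

```python
# Generic single-molecule trajectory analysis (not the authors' model):
# mean squared displacement (MSD) versus lag time for a synthetic track.
import numpy as np

def msd(track, max_lag):
    # track: (N, 2) array of x, y positions at uniform time steps
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag])**2, axis=1))
                     for lag in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(0, 0.05, size=(500, 2)), axis=0)  # random walk
curve = msd(track, 50)
# For pure 2-D Brownian motion MSD(t) = 4*D*t, so a linear fit over the
# first lags estimates the diffusion coefficient D.
D = np.polyfit(np.arange(1, 51), curve, 1)[0] / 4
print(f"estimated D = {D:.4f} (step-length^2 per frame)")
```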

Keywords: membrane proteins, single-molecule tracking, Cahn-Hilliard equation, EGFR dimers

Procedia PDF Downloads 408
16581 An Architectural Model for APT Detection

Authors: Nam-Uk Kim, Sung-Hwan Kim, Tai-Myoung Chung

Abstract:

Typical security management systems are not suitable for detecting APT attacks because they cannot draw the big picture from the trivial events reported by individual security solutions. Although SIEM solutions have a security analysis engine for this purpose, their security analysis mechanisms still need to be verified in the academic field. This paper proposes an architectural model for APT detection; we will continue to study correlation analysis mechanisms in future work.

Keywords: advanced persistent threat, anomaly detection, data mining

Procedia PDF Downloads 518
16580 Full-Scale 3D Simulation of the Electroslag Rapid Remelting Process

Authors: E. Karimi-Sibaki, A. Kharicha, M. Wu, A. Ludwig

Abstract:

The standard electroslag remelting (ESR) process can ideally control the solidification of an ingot and produce a homogeneous structure with minimal defects. However, the melt rate of the electrode is rather low, which makes the whole process uneconomical, especially for producing small ingot sizes. In contrast, continuous casting is an economical process for producing small ingots such as billets at high casting speed. Unfortunately, a deep liquid melt pool forms in the billet ingot during continuous casting, which leads to center porosity and segregation. As such, continuous casting is not suitable for producing segregation-prone alloys like tool steel or several superalloys. The electroslag rapid remelting (ESRR) process, on the other hand, combines the advantages of the traditional ESR and continuous casting processes to produce billets. In the ESRR process, a T-shaped mold is used, including a graphite ring that carries the major part of the current through the mold. There are only a few reports available in the literature discussing this topic. Research on the ESRR process is ongoing, aiming to improve the design of the T-shaped mold, to decrease the overall heat loss in the process, and to obtain a higher temperature at the metal meniscus. In the present study, a 3D model is proposed to investigate the electromagnetic, thermal, and flow fields in the whole process as well as the solidification of the billet ingot. We performed a fully coupled numerical simulation to explore the influence of the electromagnetically driven flow (MHD) on the thermal field in the slag and ingot. The main goal is to obtain a fundamental understanding of the formation of the melt pool of the solidifying billet ingot in the ESRR process.

Keywords: billet ingot, magnetohydrodynamics (MHD), numerical simulation, remelting, solidification, T-shaped mold

Procedia PDF Downloads 290
16579 Valorization of Waste and By-products for Protein Extraction and Functional Properties

Authors: Lorena Coelho, David Ramada, Catarina Nobre, Joaquim Gaião, Juliana Duarte

Abstract:

The development of processes that allow the valorization of waste and by-products generated by industry is crucial to promote symbiotic relationships between different sectors and is mandatory to "close the loop" in the circular economy paradigm. In recent years, by-products and waste from the agro-food and forestry sectors have attracted attention due to their potential applications and technical characteristics. The extraction of bio-based active compounds for reuse is in line with circular bioeconomy trends, combining the use of renewable resources with process circularity, aiming at waste reduction and encouraging reuse and recycling. Among the different types of bio-based materials being explored, protein fractions are becoming an attractive new raw material. Within this context, the BioTrace4Leather project, a collaboration between two technological centres, CeNTI and CTIC, and a leather tanning and finishing company, Curtumes Aveneda, aims to develop innovative and biologically sustainable solutions for the leather industry and to meet market circularity trends. Specifically, it aims at the valorization of waste and by-products from the tannery industry through protein extraction and the development of innovative and biologically sustainable materials. The results achieved show that keratin, gelatine, and collagen fractions can be successfully extracted from bovine hair and leather waste. These products could be reintegrated into the industrial manufacturing process to obtain innovative and functional textile and leather substrates. Acknowledgement: This work has been developed under the scope of BioTrace4Leather, a project co-funded by the Operational Program for Competitiveness and Internationalization (COMPETE) of PORTUGAL2020, through the European Regional Development Fund (ERDF), under grant agreement No. POCI-01-0247-FEDER-039867.

Keywords: leather by-products, circular economy, sustainability, protein fractions

Procedia PDF Downloads 148