Search results for: point process
18700 Reduction of Energy Consumption of Distillation Process by Recovering the Heat from Exit Streams
Authors: Apichit Svang-Ariyaskul, Thanapat Chaireongsirikul, Pawit Tangviroon
Abstract:
Distillation consumes an enormous quantity of energy. This work proposes a process to recover the energy from the exit streams of a train of three consecutive distillation columns. Several novel techniques exist to recover heat within the distillation system itself; however, they require a complex control system. This work proposes a simpler technique that exchanges heat between exit streams without interrupting the internal distillation process, which might otherwise cause serious control problems. The proposed process uses a heat exchanger network designed with pinch analysis to maximize process heat recovery. The test model is the distillation of butane, pentane, hexane, and heptane, a common mixture in petroleum refineries. The proposed process reduced the hot and cold utility consumption by 29% and 27%, respectively, which is considered significant. Therefore, recovering heat from the exit streams of a distillation process is shown to be effective for energy saving.
Keywords: distillation, heat exchanger network, pinch analysis, chemical engineering
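The abstract relies on pinch analysis to target heat recovery before designing the exchanger network. As a rough illustration of that targeting step, the sketch below applies the standard problem table (cascade) algorithm to a small set of hypothetical hot and cold streams; the stream data and the minimum approach temperature are placeholders, not the paper's column streams.

```python
# Problem table algorithm: target minimum hot/cold utilities before HEN design.
DT_MIN = 10.0  # minimum approach temperature, K (assumed)

# Each stream: (supply T, target T, heat capacity flow rate CP in kW/K)
hot_streams = [(180.0, 60.0, 3.0), (150.0, 30.0, 1.5)]   # streams cooling down
cold_streams = [(20.0, 135.0, 2.0), (80.0, 160.0, 4.0)]  # streams heating up

def shifted_temps():
    temps = set()
    for ts, tt, _ in hot_streams:
        temps.update([ts - DT_MIN / 2, tt - DT_MIN / 2])
    for ts, tt, _ in cold_streams:
        temps.update([ts + DT_MIN / 2, tt + DT_MIN / 2])
    return sorted(temps, reverse=True)

def interval_surplus(t_hi, t_lo):
    """Net heat surplus (hot CP minus cold CP) in one shifted interval."""
    cp_hot = sum(cp for ts, tt, cp in hot_streams
                 if ts - DT_MIN / 2 >= t_hi and tt - DT_MIN / 2 <= t_lo)
    cp_cold = sum(cp for ts, tt, cp in cold_streams
                  if tt + DT_MIN / 2 >= t_hi and ts + DT_MIN / 2 <= t_lo)
    return (cp_hot - cp_cold) * (t_hi - t_lo)

temps = shifted_temps()
surpluses = [interval_surplus(temps[i], temps[i + 1]) for i in range(len(temps) - 1)]

# Cascade the surpluses; the most negative value sets the minimum hot utility.
cascade, running = [], 0.0
for s in surpluses:
    running += s
    cascade.append(running)
q_hot_min = max(0.0, -min(cascade))
q_cold_min = q_hot_min + sum(surpluses)

print(f"Minimum hot utility:  {q_hot_min:.1f} kW")
print(f"Minimum cold utility: {q_cold_min:.1f} kW")
```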
Procedia PDF Downloads 369
18699 Clustering and Modelling Electricity Conductors from 3D Point Clouds in Complex Real-World Environments
Authors: Rahul Paul, Peter Mctaggart, Luke Skinner
Abstract:
Maintaining public safety and network reliability are the core objectives of all electricity distributors globally. For many electricity distributors, managing vegetation clearances from their above-ground assets (poles and conductors) is the most important and costly risk mitigation control employed to meet these objectives. Light Detection And Ranging (LiDAR) is widely used by utilities as a cost-effective method to inspect their spatially distributed assets at scale, often captured using high-powered LiDAR scanners attached to fixed-wing or rotary aircraft. The resulting 3D point cloud model is used by these utilities to perform engineering-grade measurements that guide the prioritisation of vegetation cutting programs. Advances in computer vision and machine-learning approaches are increasingly applied to increase automation and reduce inspection costs and time; however, real-world LiDAR capture variables (e.g., aircraft speed and height) create complexity, noise, and missing data, reducing the effectiveness of these approaches. This paper proposes a method for identifying each conductor from LiDAR data via clustering methods that can precisely reconstruct conductors in complex real-world configurations in the presence of high levels of noise. It fits 3D catenary models to the captured LiDAR data points of individual clusters using a least squares method. An iterative learning process is used to identify potential conductor models between pole pairs. The proposed method identifies the optimum parameters of the catenary function and then fits the LiDAR points to reconstruct the conductors.
Keywords: point cloud, LiDAR data, machine learning, computer vision, catenary curve, vegetation management, utility industry
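As a rough sketch of the least squares catenary fit described above, the code below fits a catenary to the points of a single clustered conductor span, assuming the cluster has already been extracted and rotated so that x runs along the span and z is height. The synthetic noisy points, parameter names, and initial guesses are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import least_squares

def catenary(x, a, x0, z0):
    """z = z0 + a*(cosh((x - x0)/a) - 1): sag curve of a suspended conductor."""
    return z0 + a * (np.cosh((x - x0) / a) - 1.0)

def fit_catenary(x, z):
    # Initial guesses: vertex near the lowest point, curvature parameter ~ span
    x0_guess, z0_guess = x[np.argmin(z)], z.min()
    a_guess = max(x.max() - x.min(), 1.0)

    def residuals(p):
        a, x0, z0 = p
        return catenary(x, a, x0, z0) - z

    result = least_squares(residuals, x0=[a_guess, x0_guess, z0_guess])
    return result.x

# Hypothetical noisy conductor points between two poles 80 m apart
rng = np.random.default_rng(0)
x = np.linspace(0.0, 80.0, 200)
z_true = catenary(x, a=300.0, x0=40.0, z0=12.0)
z_noisy = z_true + rng.normal(scale=0.05, size=x.size)

a, x0, z0 = fit_catenary(x, z_noisy)
print(f"fitted a={a:.1f} m, lowest point at x={x0:.1f} m, height {z0:.2f} m")
```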
Procedia PDF Downloads 99
18698 A Geographical Study of Vindhyanchal in Mirzapur City, U.P. India
Authors: Akhilendra Nath Tiwary
Abstract:
Vindhyanchal is a very famous pilgrimage and tourism site to the west of Mirzapur city in Uttar Pradesh State, India. The city, to its east, is a commercial center for cotton, metalware, and carpets. Among the Hindu population, it is believed that the primordial creative forces of the GOD and the power of the GODDESS form respective triangles which superimpose on each other as a hexagram at a point or node (Bindu (point) + Vasini (located), hence Vindhyavasini, the one located at a point/node). Mirzapur city has served as a natural connecting point between north and south India. Before the independence of India from Britain in 1947, it was a flourishing commercial center. Post-independence, the negligence of planning authorities and the nexus of bureaucrats and politicians started affecting its development. In the meantime, the emergence of new industrial cities such as Kanpur, Agra, and Moradabad, nearer to the capital city of Delhi, posed serious challenges to the development of this small city, as many commercial and business activities, along with the skilled workforce, started shifting to these new cities or to the relatively bigger neighboring cities of Varanasi in the east and Allahabad in the west. In the present paper, the significant causes, issues, and challenges in the development of Vindhyanchal are discussed from a geographical perspective. An attempt has been made to find ways to restore the lost glory of the city as a center of pilgrimage, tourism, and commerce.
Keywords: cultural node, pilgrimage, sacred, Vindhyan triangle, commercial centre
Procedia PDF Downloads 441
18697 Maximum Power Point Tracking for Small Scale Wind Turbine Using Multilayer Perceptron Neural Network Implementation without Mechanical Sensor
Authors: Piyangkun Kukutapan, Siridech Boonsang
Abstract:
The article proposes maximum power point tracking without a mechanical sensor using a Multilayer Perceptron Neural Network (MLPNN). The aim of the article is to reduce cost and complexity while retaining efficiency. The underlying idea is that a suitably chosen duty cycle extracts maximum power from the generator. The measured data from the DC generator, namely voltage (V), current (I), power (P), rate of change of power (dP), and rate of change of voltage (dV), are used as inputs to the MLPNN model. The output of this model is the duty cycle for driving the converter. The experiment was implemented using an Arduino Uno board, and the MLPNN-based MPPT was compared with Perturb and Observe (P&O) control. The experimental results show that the proposed MLPNN-based approach is more efficient than the P&O algorithm for this application.
Keywords: maximum power point tracking, multilayer perceptron neural network, optimal duty cycle, DC generator
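A minimal sketch of the idea of mapping the electrical measurements (V, I, P, dP, dV) to a converter duty cycle with a multilayer perceptron is shown below. The training data here are synthetic placeholders; in practice they would come from measurements or reference P&O runs, and the network size and target function are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Hypothetical training set: rows of [V, I, P, dP, dV] and a target duty cycle.
n = 500
V = rng.uniform(5.0, 40.0, n)
I = rng.uniform(0.1, 5.0, n)
P = V * I
dP = rng.normal(0.0, 2.0, n)
dV = rng.normal(0.0, 0.5, n)
X = np.column_stack([V, I, P, dP, dV])
# Placeholder "optimal" duty cycle, e.g. as recorded from a reference controller.
duty = np.clip(0.5 + 0.002 * dP * np.sign(dV) + 0.003 * (V - 22.0), 0.05, 0.95)

mlp = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=2000, random_state=0)
mlp.fit(X, duty)

# At run time the controller feeds the latest measurements into the network.
sample = np.array([[24.0, 2.1, 50.4, 1.2, 0.3]])
print(f"predicted duty cycle: {mlp.predict(sample)[0]:.3f}")
```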
Procedia PDF Downloads 325
18696 Improvement of Process Competitiveness Using Intelligent Reference Models
Authors: Julio Macedo
Abstract:
Several methodologies are now available to conceive improvements to a process so that it becomes competitive, for example total quality management, process reengineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are numerous and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent in the sense that, when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for that process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics
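To make the reference-model idea concrete, the sketch below iterates a small fuzzy cognitive map whose concepts stand for performance indexes and improvement actions, then measures the gap to the desired indexes as the objective. The weight matrix, concept names, update rule, and desired values are illustrative placeholders, not the trained models described in the abstract.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def run_fcm(weights, state, steps=50):
    """Iterate A(t+1) = f(W^T A(t) + A(t)), a common FCM update rule."""
    for _ in range(steps):
        new_state = sigmoid(weights.T @ state + state)
        if np.max(np.abs(new_state - state)) < 1e-6:
            break
        state = new_state
    return state

# Concepts (assumed): [lead time, defect rate, training effort, automation level]
W = np.array([
    [0.0, 0.3, 0.0, 0.0],
    [0.4, 0.0, 0.0, 0.0],
    [-0.5, -0.6, 0.0, 0.2],
    [-0.7, -0.3, 0.1, 0.0],
])
current = np.array([0.8, 0.7, 0.2, 0.3])   # current (poor) performance indexes
desired = np.array([0.3, 0.2, 0.6, 0.7])   # desired performance indexes

final = run_fcm(W, current)
gap = np.linalg.norm(final - desired)      # objective: minimize this gap
print("steady state:", np.round(final, 3), "gap to desired:", round(gap, 3))
```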
Procedia PDF Downloads 87
18695 Qualitative Analysis of Bituminous Mix Modified by Polypropylene and Impact Characteristics on Pavement Wearing Course
Authors: Jayisha Das Jaya, Nafis As Sami, Nazia Jahan, Tamanna Jerin, Mohammed Russedul Islam
Abstract:
This paper presents ongoing research analyzing a polypropylene-modified bituminous mix and its impact characteristics with respect to the original bitumen. Three dosages of polypropylene, varying from 1% to 3% of the weight of bitumen, were used to alter the bitumen's performance. A temperature of 170°C was maintained during the blending of polypropylene with bitumen. Blending was performed by a wet process, as it has certain advantages over the dry process. A rotation speed of approximately 210 rpm was set to prepare the blend in a mixer for 30 minutes, producing a homogeneous mixture. The blended mix shows a change in physical properties in comparison with the original bitumen. The modification shows that for each 1% increment of polypropylene, the softening point increases by 1 degree, penetration values decrease gradually to 55.6, 54, and 52.5, ductility values decrease gradually to 87, 76, and 63, and specific gravity remains the same. Marshall mix design was then performed with 60/70 penetration grade bitumen at contents varying from 4% to 6% in 0.5% intervals. Marshall stability and flow test results indicate an increase in stability and a decrease in flow.
Keywords: bitumen, Marshall, polypropylene, temperature
Procedia PDF Downloads 246
18694 The Impact of Governance Criteria in the Supplier Selection Process of Large German Companies
Authors: Christoph Köster
Abstract:
Supplier selection is one of the key challenges in supply chain management and can be considered a multi-criteria decision-making (MCDM) problem. Since the 1960s, it has evolved from considering only economic criteria, such as price, quality, and performance, to also including environmental and social criteria. Although the topic has received considerable attention from scholars and practitioners over the past decades, existing research has not considered governance criteria so far. This is, however, surprising, as ESG (environmental, social, and governance) criteria have gained considerable attention. In order to complement ESG criteria in the supplier selection process, this study investigates German DAX and MDAX companies and evaluates the impact of governance criteria along their supplier selection process. Moreover, it proposes a set of criteria for the respective process steps. Specifically, eleven criteria for the first process step and five criteria for the second process step are identified. This paper contributes to a better understanding of the supplier selection process by elucidating the relevance of governance criteria and providing a set of empirically developed governance criteria. These results can be applied by practitioners to complement the criteria set in the supplier selection process and thus balance economic, environmental, social, and governance targets.
Keywords: ESG, governance, sustainable supplier selection, sustainability
Procedia PDF Downloads 118
18693 Development of a Multi-Factorial Instrument for Accident Analysis Based on Systemic Methods
Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu
Abstract:
The present research is built on three major pillars. It commences with some considerations on accident investigation methods, pointing out both the defining aspects of and the differences between linear and non-linear analysis. The traditional linear focus of accident analysis describes accidents as a sequence of events, while the latest systemic models outline interdependencies between different factors and define the evolution of processes related to a specific (normal) situation. Linear and non-linear accident analysis methods have specific limitations, so the second point of interest is the aim of discovering the drawbacks of systemic models, which becomes a starting point for developing new directions to identify risks or data closer to the cause of incidents/accidents. Since communication represents a critical issue in the interaction of the human factor and has been proved to be the answer to problems caused by possible breakdowns in different communication procedures, the third pillar elaborates a new error-modeling instrument suitable for risk assessment/accident analysis.
Keywords: accident analysis, multi-factorial error modeling, risk, systemic methods
Procedia PDF Downloads 208
18692 Public Functions of Kazakh Modern Literature
Authors: Erkingul Soltanaeva, Omyrkhan Abdimanuly, Alua Temirbolat
Abstract:
In this article, the public and social functions of literature and art in the Republic of Kazakhstan were analyzed on the basis of formal and informal literary organizations. The external and internal, subjective and objective factors which influence the modern literary process were determined. The literary forces, their consolidation, and the types of organization in the art of the word were examined. The stages of the literary process, namely planning, organization, promotion, and evaluation, together with their leading forces and approaches, were analyzed. A sound attitude towards the language and mentality of society will influence the literary process. The Ministry of Culture, the Writers' Union of the Republic of Kazakhstan, and various non-governmental organizations hold different events to promote the literary process and to honour literary personalities across the entire territory of Kazakhstan. According to the cultural plans of different state administrations, a large programme was carried out to publish a literary encyclopedia and to promote and distribute books by the poets and writers of each region throughout the country. All of these official measures increase readers' interest in books, contribute to patriotic education, and improve the status of the native language. Materials from professional literary publications, such as the newspaper 'Kazakh literature', the magazine 'Zhuldyz', and the journal 'Zhalyn', published in the period 2013-2015, were identified on the basis of a statistical analysis of the topical issues and thematic fields of Kazakh literature, and their level of connection with public situations was defined. Creative freedom, relations between society and the individual, the state of literature, and its advantages and disadvantages were taken into consideration in the same articles. The level of these functions was determined through the public role of literature, its social features, and personal peculiarities. The stages of literature management, namely planning, organization, motivation, and evaluation, are now forming and developing in Kazakhstan, but the development of literature management is still needed to satisfy the actual requirements of today's agenda.
Keywords: literature management, material, literary process, social functions
Procedia PDF Downloads 384
18691 Students' Experience Enhancement through Simulation: A Process Flow in the Logistics and Transportation Field
Authors: Nizamuddin Zainuddin, Adam Mohd Saifudin, Ahmad Yusni Bahaudin, Mohd Hanizan Zalazilah, Roslan Jamaluddin
Abstract:
Students' enhanced experience through simulation is a crucial factor that brings reality to the classroom. The enhanced experience is all about developing, enriching, and applying a generic process flow in the field of logistics and transportation. As educational technology has improved, the effective use of simulations has greatly increased, to the point where simulations should be considered a valuable, mainstream pedagogical tool. Additionally, in this era of ongoing (some say never-ending) assessment, simulations offer a rich resource for objective measurement and comparisons. Simulation is not just another in the long line of passing fads (or short-term opportunities) in educational technology. It is rather a real key to helping our students understand the world. It is a way for students to acquire experience about how things and systems in the world behave and react, without actually touching them. In short, it is about interactive pretending. Simulation is all about representing the real world, which includes grasping complex issues and solving intricate problems. Therefore, it is crucial that a generic process flow be developed before simulating the real process of inbound and outbound logistics and transportation. The paper focuses on the validation of the process flow by examining the inputs gained from the sample. The sampling of the study covers multinational and local manufacturing companies, third-party logistics companies (3PL), and a government agency, selected in Peninsular Malaysia. A simulation flow chart was proposed in the study that will serve as the generic flow in logistics and transportation. A qualitative approach was mainly used to gather data. It was found that the systems used in the outbound and inbound processes are System Application Products (SAP) and Material Requirement Planning (MRP). Furthermore, some companies use Enterprise Resource Planning (ERP) and Electronic Data Interchange (EDI) as part of Suppliers Own Inventories (SOI) networking, as a result of globalized business between one country and another. Computerized documentation and transactions were mandatory requirements of the Royal Customs and Excise Department. The generic process flow will be the basis for developing a simulation program to be used in the classroom with the objective of further enhancing the students' learning experience. Thus, it will contribute to the body of knowledge on the enrichment of students' employability and shall also be one of the ways to train new workers in the logistics and transportation field.
Keywords: enhancement, simulation, process flow, logistics, transportation
Procedia PDF Downloads 329
18690 To Explore the Process of Entrepreneurial Opportunity in China Cultural and Creative Industries: From the Perspective of Institutional Theory
Authors: Jiaoya Huang, Jianghong Liu
Abstract:
This paper endeavors to comprehend and scrutinize the entrepreneurial development process within Chinese cultural and creative small and medium-sized enterprises (SMEs), as well as the factors that impinge on entrepreneurs' recognition and exploitation of entrepreneurial opportunities, from the vantage point of institutional theory. The study is centered around three key research questions, namely the drivers of and impediments to entrepreneurs' identification of opportunities within three prominent Chinese cultural and creative regions, and the influence of institutional facets on the exploitation and recognition of opportunities within the cultural industry. Adopting a qualitative interpretivist research paradigm, a comparative multiple case study design is utilized. Semi-structured interviews will be carried out with founders and mid-level professionals of SMEs in Beijing, Shanghai, and Guangzhou, chosen in accordance with specific criteria. The data will be analyzed through an inductive thematic approach. It is anticipated that this research will contribute to bridging the research gap at the nexus between institutional theory and entrepreneurial opportunities within the context of cultural and creative industries.
Keywords: entrepreneurial opportunities, cultural and creative industries, institutional theory, Chinese SMEs
Procedia PDF Downloads 8
18689 A Two-Stage Process for the Sustainable Production of Aliphatic Polyesters
Authors: A. Douka, S. Vouyiouka, L. M. Papaspyridi, D. Korres, C. Papaspyrides
Abstract:
A "green" process was studied for the preparation of partially renewable aliphatic polyesters based on 1,4-butanediol and 1,8-octanediol with various diacids and derivatives, namely diethyl succinate, adipic acid, sebacic acid, 1,12-dodecanedioic acid, and 1,14-tetradecanedioic acid. A first step of enzymatic prepolymerization was carried out in the presence of two different solvents, toluene and diphenyl ether, applying molecular sieves and vacuum, respectively, to remove polycondensation by-products. Poly(octylene adipate) (PE 8.6), poly(octylene dodecanate) (PE 8.12), and poly(octylene tetradecanate) (PE 8.14) were first produced enzymatically in toluene using molecular sieves, giving, however, low-molecular-weight products. Thereafter, the synthesis of PE 8.12 and PE 8.14 was examined under optimized conditions using diphenyl ether as solvent and a more vigorous by-product removal step, namely the application of vacuum. Apart from these polyesters, the optimized process was also implemented for the production of another long-chain polyester, poly(octylene sebacate) (PE 8.10), and a short-chain polyester, poly(butylene succinate) (PE 4.4). Subsequently, bulk post-polymerization in the melt or solid state was performed. SSP runs involved the absence of biocatalyst and reaction temperatures (T) in the vicinity of the prepolymer melting point (Tm - T varied between 15.5 and 4 °C). Focusing on PE 4.4 and PE 8.12, SSP took place under vacuum or flowing nitrogen, leading to an increase of the molecular weight and an improvement of the end product's physical appearance and thermal properties.
Keywords: aliphatic polyester, enzymatic polymerization, solid state polymerization, Novozym 435
Procedia PDF Downloads 324
18688 Numerical Investigation of Solid Subcooling on a Low Melting Point Metal in Latent Thermal Energy Storage Systems Based on Flat Slab Configuration
Authors: Cleyton S. Stampa
Abstract:
This paper addresses the prospects of using low melting point metals (LMPMs) as phase change materials (PCMs) in latent thermal energy storage (LTES) units, through a numerical approach. This is a new class of PCMs that has been one of the most promising alternatives considered for LTES, because these materials have a high thermal conductivity and an elevated heat of fusion per unit volume. The chosen type of LTES consists of several horizontal parallel slabs filled with PCM. The heat transfer fluid (HTF) circulates through the channel formed between each two consecutive slabs in a laminar regime under forced convection. The study deals with the LTES charging process (heat storing) using pure gallium as the PCM, and it considers heat conduction in the solid phase during melting driven by natural convection in the melt. The transient heat transfer problem is analyzed in one arbitrary slab under the influence of the HTF. The mathematical model used to simulate the isothermal phase change is based on a volume-averaged enthalpy method, which is successfully verified by comparing its predictions with experimental data from works available in the pertinent literature. Regarding the convective heat transfer problem in the HTF, it is assumed that the flow is thermally developing, whereas the velocity profile is already fully developed. The study aims to determine the effect of solid subcooling on the melting rate through comparisons with the melting process of a solid that starts to melt from its fusion temperature. In order to best understand this effect in a metallic compound, as is the case for pure gallium, the study also evaluates, under the same conditions established for the gallium, the melting process of commercial paraffin wax (an organic compound) and of calcium chloride hexahydrate (CaCl₂·6H₂O, an inorganic compound). The present work adopts the best options that have been established by several researchers in their parametric studies of this type of LTES, which lead to high values of thermal efficiency. To do so, concerning the geometric aspects, it considers the gap of the channel formed by two consecutive slabs and the thickness and length of the slab. Regarding the HTF, it considers the type of fluid, the mass flow rate, and the inlet temperature.
Keywords: flat slab, heat storing, pure metal, solid subcooling
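The sketch below illustrates the volume-averaged enthalpy method named in the abstract in its simplest 1-D, conduction-only form (it neglects the natural convection in the melt and the HTF coupling that the study includes). Gallium property values are approximate, and the slab size, subcooling, and boundary conditions are assumptions for demonstration only.

```python
import numpy as np

# Approximate property values for gallium (assumed, order of magnitude only)
rho, cp, k = 5900.0, 380.0, 32.0        # kg/m3, J/(kg K), W/(m K)
L_fus, T_m = 8.0e4, 302.9               # J/kg, K (melting point ~29.8 C)
T_init, T_wall = 292.9, 322.9           # 10 K subcooled solid, heated face

nx, dx = 50, 1.0e-3                     # 50 cells over a 5 cm slab (illustrative)
dt = 0.01                               # s, below the explicit stability limit

def T_and_fraction(H):
    """Recover temperature and liquid fraction from the specific enthalpy."""
    if H < cp * T_m:                                   # solid
        return H / cp, 0.0
    if H <= cp * T_m + L_fus:                          # isothermal phase change
        return T_m, (H - cp * T_m) / L_fus
    return T_m + (H - cp * T_m - L_fus) / cp, 1.0      # liquid

H = np.full(nx, cp * T_init)            # enthalpy of the subcooled solid
T = np.full(nx, T_init)
T[0] = T_wall                           # heated face held at constant temperature

n_steps = 20000                         # ~200 s of charging
for _ in range(n_steps):
    lap = T[2:] - 2.0 * T[1:-1] + T[:-2]
    H[1:-1] += dt * k / (rho * dx * dx) * lap          # rho dH/dt = k d2T/dx2
    for i in range(1, nx - 1):
        T[i], _ = T_and_fraction(H[i])
    T[-1] = T[-2]                       # insulated back face

melt_fraction = np.mean([T_and_fraction(h)[1] for h in H[1:-1]])
print(f"average liquid fraction after {n_steps * dt:.0f} s: {melt_fraction:.2f}")
```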
Procedia PDF Downloads 141
18687 A Review of Applying Serious Games on Learning
Authors: Carlos Oliveira, Ulrick Pimentel
Abstract:
Digital games have conquered a growing space in the lives of children, adolescents, and adults. From this perspective, the use of this resource has been shown to be an important strategy that facilitates the learning process. This research is a literature review on the use of serious games in teaching, which presents the characteristics of these games, the benefits and possible harms that this resource can produce, and the possible methods of evaluating its effectiveness in teaching. The results point out that serious games have significant potential as a tool for instruction. However, their effectiveness in terms of learning outcomes is still poorly studied, mainly due to the complexity involved in evaluating intangible measures.
Keywords: serious games, learning, application, literature review
Procedia PDF Downloads 309
18686 Biases in Macroprudential Supervision and Their Legal Implications
Authors: Anat Keller
Abstract:
Given that macro-prudential supervision is a relatively new policy area and its empirical and analytical research is still in its infancy, its theoretical foundations are also lagging behind. This paper contributes to the developing discussion on effective legal and institutional macroprudential supervision frameworks. In the first part of the paper, it is argued that effectiveness as a key benchmark poses some challenges in the context of macroprudential supervision, such as the difficulty of proving causality between supervisory actions and the achievement of the supervisor's mission. The paper suggests that effectiveness in the macroprudential context should, therefore, be assessed at the level of the supervisory decision-making process (to be differentiated from supervisory outcomes). The second part of the essay examines whether insights from behavioural economics can point to biases in the macroprudential decision-making process. These biases include, inter alia, preference bias, groupthink bias, and inaction bias. It is argued that these biases are exacerbated in the multilateral setting of the macroprudential supervision framework in the EU. The paper then examines how legal and institutional frameworks should be designed to acknowledge and perhaps contain these identified biases. The paper suggests that the effectiveness of macroprudential policy will largely depend on the existence of clear and robust transparency and accountability arrangements. Accountability arrangements can be used as a vehicle for identifying and addressing potential biases in the macro-prudential framework, in particular inaction bias. Inclusiveness of the public in the supervisory process, in the form of transparency and awareness of the logic behind policy decisions, may assist in minimising their potential unpopularity, thus promoting their effectiveness. Furthermore, a governance structure which facilitates coordination of the macroprudential supervisor with other policymakers and incorporates outside perspectives and opinions could 'break down' groupthink bias as well as inaction bias.
Keywords: behavioural economics and biases, effectiveness of macroprudential supervision, legal and institutional macroprudential frameworks, macroprudential decision-making process
Procedia PDF Downloads 280
18685 Numerical Simulation of Filtration Gas Combustion: Front Propagation Velocity
Authors: Yuri Laevsky, Tatyana Nosova
Abstract:
The phenomenon of filtration gas combustion (FGC) was discovered experimentally at the beginning of the 1980s. It has a number of important applications in such areas as chemical technologies, fire and explosion safety, energy-saving technologies, and oil production. From the physical point of view, FGC may be defined as the propagation of a region of gaseous exothermic reaction in a chemically inert porous medium, as the gaseous reactants seep into the region of chemical transformation. The movement of the combustion front has different modes, and this investigation is focused on the low-velocity regime. The main characteristic of the process is the velocity of the combustion front propagation. Computation of this characteristic encounters substantial difficulties because of the strong heterogeneity of the process. The mathematical model of FGC is formed by the energy conservation laws for the temperature of the porous medium and the temperature of the gas, and the mass conservation law for the relative concentration of the reacting component of the gas mixture. In this case, the homogenization of the model is performed using the two-temperature approach, in which at each point of the continuous medium we specify the solid and gas phases with a Newtonian heat exchange between them. The construction of a computational scheme is based on the principles of the mixed finite element method with the use of a regular mesh. The approximation in time is performed by an explicit-implicit difference scheme. Special attention was given to the determination of the combustion front propagation velocity. Straightforward computation of the velocity as a grid derivative leads to an extremely unstable algorithm. It is worth noting that the term 'front propagation velocity' makes sense for settled motion, when certain analytical formulae linking velocity and equilibrium temperature are valid. The numerical implementation of one of these formulae, leading to a stable computation of the instantaneous front velocity, has been proposed. The resulting algorithm has been applied in a subsequent numerical investigation of the FGC process. In this way, the dependence of the main characteristics of the process on various physical parameters has been studied. In particular, the influence of the combustible gas mixture consumption on the front propagation velocity has been investigated. It has also been reaffirmed numerically that there is an interval of critical values of the interfacial heat transfer coefficient at which a sort of breakdown occurs from a slow combustion front propagation to a rapid one. Approximate boundaries of such an interval have been calculated for some specific parameters. All the results obtained are in full agreement with both experimental and theoretical data, confirming the adequacy of the model and the algorithm constructed. The availability of stable techniques to calculate the instantaneous velocity of the combustion wave allows the semi-Lagrangian approach to the solution of the problem to be considered.
Keywords: filtration gas combustion, low-velocity regime, mixed finite element method, numerical simulation
Procedia PDF Downloads 301
18684 Improving the Efficiency of Repacking Process with Lean Technique: The Study of Read With Me Group Company Limited
Authors: Jirayut Phetchuen, Jongkol Srithorn
Abstract:
The study examines the unloading and repacking process of Read With Me Group Company Limited. The research aims to improve the existing work process and build a new, more efficient process with lean techniques and new machines, achieving faster delivery without increasing the number of employees. Currently, two employees work on a five-days-on, five-days-off basis. However, workplace injuries have delayed the delivery time, especially deliveries to neighboring countries. After the process improvement, the working space increased by 25%, the process lead time decreased by 40%, work efficiency increased by 175.82%, and the workplace injury rate was reduced to zero.
Keywords: lean technique, plant layout design, U-shaped disassembly line, value stream mapping
Procedia PDF Downloads 104
18683 Linking Business Process Models and System Models Based on Business Process Modelling
Authors: Faisal A. Aburub
Abstract:
Organizations today need to invest in software in order to run their businesses, and to meet the organizations' objectives the software should be in line with the business process. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram based on a role activity diagram (RAD) model. The approach includes four steps, namely: create a business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the student registration process at the University of Petra as a case study. Further research is required to validate the new approach using different domains.
Keywords: business process modelling, system models, role activity diagrams, sequence diagrams
Procedia PDF Downloads 384
18682 Critical Thinking Index of College Students
Authors: Helen Frialde-Dupale
Abstract:
The critical thinking index (CTI) of 150 third-year college students from five State Colleges and Universities (SUCs) in Region I was determined. Only students with a grade point average (GPA) of at least 2.0 from four general classifications of degree courses, namely Education, Arts and Sciences, Engineering, and Agriculture, were included. Specific problem No. 1 dealt with the profile variables, namely: age, sex, degree course, monthly family income, number of siblings, high school graduated from, grade point average, personality type, highest educational attainment of parents, and occupation of parents. Problem No. 2 determined the critical thinking index among the respondents. Problem No. 3 investigated whether or not there are significant differences in the critical thinking index among the respondents across the profile variables, while problem No. 4 determined whether or not there are significant relationships between the critical thinking index and selected profile variables, namely: age, monthly family income, number of siblings, and grade point average of the respondents. Finally, in problem No. 5, the critical thinking instruments which obtained the lowest ratings were used as the basis for outlining an intervention program for enhancing the critical thinking index (CTI) of students. The following null hypotheses were tested at the 0.05 level of significance: there are no significant differences in the critical thinking index of third-year college students across the profile variables; there are no significant relationships between the critical thinking index of the respondents and selected variables, namely: age, monthly family income, number of siblings, and grade point average.
Keywords: attitude as critical thinker, critical thinking applied, critical thinking index, self-perception as critical thinker
Procedia PDF Downloads 517
18681 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operations is quite necessary in plant-wide process management in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, multivariate statistical approaches, and so on. This work presents a nonlinear empirical monitoring methodology based on real-time analysis of massive process data. Unfortunately, such big data include measurement noises and unwanted variations unrelated to true process behavior. Thus, the elimination of such unnecessary patterns from the data is executed in the data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly irrespective of the size and the location of abnormal events.
Keywords: detection, monitoring, process data, noise
Procedia PDF Downloads 252
18680 Unified Structured Process for Health Analytics
Authors: Supunmali Ahangama, Danny Chiang Choon Poo
Abstract:
Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and the domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the rational unified process (RUP) model and agile methodology.
Keywords: agile methodology, health analytics, unified process model, UML
Procedia PDF Downloads 506
18679 Deep Learning Approach for Colorectal Cancer’s Automatic Tumor Grading on Whole Slide Images
Authors: Shenlun Chen, Leonard Wee
Abstract:
Tumor grading is an essential reference for colorectal cancer (CRC) staging and survival prognostication. The widely used World Health Organization (WHO) grading system defines the histological grade of CRC adenocarcinoma based on the density of glandular formation on whole slide images (WSI). Tumors are classified as well-, moderately-, poorly- or un-differentiated depending on the percentage of the tumor that is gland forming: >95%, 50-95%, 5-50%, and <5%, respectively. However, manually grading WSIs is a time-consuming process and can cause observer error due to subjective judgment and unnoticed regions. Furthermore, pathologists' grading is usually coarse, while a finer and continuous differentiation grade may help to stratify CRC patients better. In this study, a deep learning based automatic differentiation grading algorithm was developed and evaluated by survival analysis. Firstly, a gland segmentation model was developed for segmenting gland structures. Gland regions of WSIs were delineated and used for differentiation annotation. Tumor regions were annotated by experienced pathologists as high-, medium-, low-differentiation and normal tissue, which correspond to tumor with clear, unclear, or no gland structure and non-tumor, respectively. Then a differentiation prediction model was developed on these human annotations. Finally, all enrolled WSIs were processed by the gland segmentation model and the differentiation prediction model. The differentiation grade can be calculated from the deep learning models' prediction of tumor regions and tumor differentiation status according to the WHO definitions. If multiple WSIs belonged to one patient, the highest differentiation grade was chosen. Additionally, the differentiation grade was normalized onto a scale between 0 and 1. The Cancer Genome Atlas COAD (TCGA-COAD) project was enrolled in this study. For the gland segmentation model, the ROC reached 0.981 and the accuracy reached 0.932 in the validation set. For the differentiation prediction model, the ROC reached 0.983, 0.963, 0.963, and 0.981, and the accuracy reached 0.880, 0.923, 0.668, and 0.881 for the groups of low-, medium-, high-differentiation and normal tissue in the validation set. Four hundred and one patients were selected after removing WSIs without gland regions and patients without follow-up data. The concordance index reached 0.609. An optimized cut-off point of 51% was found by the "Maxstat" method, which is almost the same as the WHO system's cut-off point of 50%. Both the WHO system's cut-off point and the optimized cut-off point performed impressively in Kaplan-Meier curves, and both log-rank test p values were below 0.005. In this study, the gland structure of WSIs and the differentiation status of tumor regions were proven to be predictable through deep learning methods. A finer and continuous differentiation grade can also be automatically calculated through the above models. The differentiation grade was proven to stratify CRC patients well in survival analysis, and its optimized cut-off point was almost the same as that of the WHO tumor grading system. The tool for automatically calculating differentiation grade may show potential in the field of therapy decision making and personalized treatment.
Keywords: colorectal cancer, differentiation, survival analysis, tumor grading
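A small sketch of the WHO-style rule stated in the abstract is given below: it maps an estimate of the gland-forming percentage of the tumor to a categorical grade, plus one plausible 0-1 normalization of the continuous differentiation grade. The function names, the normalization choice, and the example value are illustrative assumptions, not the study's exact implementation.

```python
def who_grade(gland_forming_pct: float) -> str:
    """WHO CRC adenocarcinoma grade from the % of tumor that is gland forming."""
    if gland_forming_pct > 95:
        return "well differentiated"
    if gland_forming_pct >= 50:
        return "moderately differentiated"
    if gland_forming_pct >= 5:
        return "poorly differentiated"
    return "undifferentiated"

def normalized_grade(gland_forming_pct: float) -> float:
    """One plausible continuous differentiation grade scaled to [0, 1]."""
    return max(0.0, min(1.0, gland_forming_pct / 100.0))

pct = 48.7   # e.g. fraction of tumor area predicted as gland forming (assumed)
print(who_grade(pct), normalized_grade(pct))
```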
Procedia PDF Downloads 134
18678 Online Learning for Modern Business Models: Theoretical Considerations and Algorithms
Authors: Marian Sorin Ionescu, Olivia Negoita, Cosmin Dobrin
Abstract:
This scientific communication reports and discusses learning models adaptable to modern business problems, as well as models specific to digital concepts and paradigms. In the PAC (probably approximately correct) learning model approach, the learning process begins by receiving a batch of learning examples; this set of examples is used to acquire a hypothesis, and once the learning process is complete, the hypothesis is used to predict new operational examples. For complex business models, many candidate models should be introduced and evaluated to estimate the induced results, so that the totality of the results can be used to develop a predictive rule that anticipates the choice of new models. In contrast, in online learning-type processes there is no separation between the learning (training) phase and the prediction phase. Every time a business model is approached, it is first treated as a test example and a prediction is made as to whether the model is correct from the point of view of the business decision. Only after a part of the business model has been chosen does the label with the logical value "true" become known. Some of the business models are then used as learning (training) examples, which helps to improve the prediction mechanisms for future business models.
Keywords: machine learning, business models, convex analysis, online learning
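A minimal sketch of the online protocol contrasted above with batch (PAC-style) training is given below: each incoming case is first predicted on, only then is its true label revealed and the model updated. The linear model, learning rate, and synthetic data stream are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
d, T, eta = 5, 1000, 0.1
w = np.zeros(d)                       # online logistic-regression weights
true_w = rng.normal(size=d)           # hidden "ground truth" driving the stream

mistakes = 0
for t in range(T):
    x = rng.normal(size=d)            # features of the next incoming case
    # 1. Predict before the outcome of the decision is known.
    p = 1.0 / (1.0 + np.exp(-w @ x))
    y_hat = 1 if p >= 0.5 else 0
    # 2. The true label only becomes available after the decision is made.
    y = 1 if true_w @ x + rng.normal(scale=0.1) >= 0 else 0
    mistakes += int(y_hat != y)
    # 3. Update immediately with the revealed label (stochastic gradient step).
    w += eta * (y - p) * x

print(f"online mistake rate: {mistakes / T:.3f}")
```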
Procedia PDF Downloads 140
18677 Model-Based Process Development for the Comparison of a Radial Riveting and Roller Burnishing Process in Mechanical Joining Technology
Authors: Tobias Beyer, Christoph Friedrich
Abstract:
Modern simulation methodology using finite element models is nowadays a recognized tool for product design and optimization. Likewise, manufacturing process design is increasingly becoming a focus of simulation methodology in order to enable sustainable results based on fewer real-life tests here as well. In this article, two process simulations, radial riveting and roller burnishing, both used for the mechanical joining of components, are explained. In the first step, the required boundary conditions are developed and implemented in the respective simulation models. This is followed by process space validation. With the help of the validated models, the interdependencies of the input parameters are investigated and evaluated by means of sensitivity analyses. Limit case investigations are carried out and evaluated with the aid of the process simulations. Likewise, a comparison of the two joining methods with each other becomes possible.
Keywords: FEM, model-based process development, process simulation, radial riveting, roller burnishing, sensitivity analysis
Procedia PDF Downloads 108
18676 Prediction of Ionic Liquid Densities Using a Corresponding State Correlation
Authors: Khashayar Nasrifar
Abstract:
Ionic liquids (ILs) exhibit particular properties exemplified by extremely low vapor pressure and high thermal stability. The properties of ILs can be tailored by proper selection of cations and anions. As such, ILs are appealing as potential solvents to substitute for traditional solvents with high vapor pressure. One of the IL properties required in chemical and process design is density. In developing corresponding-states liquid density correlations, the scaling hypothesis is often used. The hypothesis expresses the temperature dependence of saturated liquid densities near the vapor-liquid critical point as a function of reduced temperature. Extending this temperature dependence, several successful correlations were developed to accurately correlate the densities of normal liquids from the triple point to the critical point. Applying mixing rules, the liquid density correlations are extended to liquid mixtures as well. ILs are not molecular liquids, and they are not classified among normal liquids either. Also, ILs are often used where the conditions are far from equilibrium. Nevertheless, in calculating the properties of ILs, the use of corresponding-states correlations would be useful if no experimental data were available. With well-known generalized saturated liquid density correlations, the accuracy in predicting the density of ILs is not that good; an average error of 4-5% should be expected. In this work, a data bank was compiled. A simplified and concise corresponding-states saturated liquid density correlation is proposed by phenomenologically modifying the reduced temperature using the temperature dependence of an interaction parameter of the Soave-Redlich-Kwong equation of state. This modification improves the temperature dependence of the developed correlation. Parametrization was next performed to optimize the three global parameters of the correlation. The correlation was then applied to the ILs in our data bank with satisfactory predictions. The correlation of IL density was applied at 0.1 MPa and tested with an average uncertainty of around 2%. No adjustable parameter was used; only the critical temperature, critical volume, and acentric factor were required. Methods to extend the predictions to higher pressures (200 MPa) were also devised. Compared to other methods, this correlation was found to be more accurate. This work also presents the chronological order of the development of such correlations dealing with ILs, and the pros and cons are also discussed.
Keywords: correlation, corresponding state principle, ionic liquid, density
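As a loose illustration of the corresponding-states idea described above, the sketch below evaluates a generic scaling-type reduced-density expansion at a reduced temperature that has been modified through the SRK alpha-function temperature dependence. The functional form, the three placeholder coefficients, and the example inputs are assumptions for demonstration; the abstract does not give the paper's fitted correlation, and only the SRK m(omega) expression is standard.

```python
import math

def srk_alpha(Tr, omega):
    """Standard SRK alpha function: alpha = [1 + m(1 - sqrt(Tr))]^2."""
    m = 0.480 + 1.574 * omega - 0.176 * omega ** 2
    return (1.0 + m * (1.0 - math.sqrt(Tr))) ** 2

def reduced_density(T, Tc, omega, c=(1.0, 1.6, -0.6)):
    """rho/rho_c from an assumed expansion in the modified (1 - Tr*)."""
    Tr_mod = (T / Tc) / srk_alpha(T / Tc, omega)   # assumed modification
    tau = max(1.0 - Tr_mod, 0.0)
    c1, c2, c3 = c                                 # placeholder "global" parameters
    return 1.0 + c1 * tau ** (1.0 / 3.0) + c2 * tau + c3 * tau ** (4.0 / 3.0)

def density(T, Tc, Vc, M, omega):
    """Mass density in kg/m3 from Tc (K), Vc (m3/mol), molar mass M (kg/mol)."""
    rho_c = M / Vc
    return rho_c * reduced_density(T, Tc, omega)

# Hypothetical IL-like inputs (not data for any specific ionic liquid)
print(f"{density(T=298.15, Tc=800.0, Vc=6.0e-4, M=0.200, omega=0.8):.0f} kg/m3")
```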
Procedia PDF Downloads 127
18675 The LMPA/Epoxy Mixture Encapsulation of OLED on Polyimide Substrate
Authors: Chuyi Ye, Minsang Kim, Cheol-Hee Moon
Abstract:
The organic light-emitting diode (OLED) is a promising organic optical functional material considered to be the next-generation display technology, with advantages such as an all-solid-state structure, ultra-thin thickness, self-emission, and flexibility. Due to the development of polymer-inorganic substrates, it has become possible to achieve flexible OLED displays. However, the organic light-emitting material is very sensitive to oxygen and water vapor, and the encapsulation requires a water vapor transmission rate (WVTR) and an oxygen transmission rate (OTR) as low as 10⁻⁶ g/(m²·d) and 10⁻⁵ cm³/(m²·d), respectively. At present, these rigorous WVTR and OTR requirements have restricted the application of OLED displays. Traditional epoxy/getter or glass frit approaches, which have been widely applied to glass-substrate-based devices, are not suitable for transparent flexible organic devices, and mechanically flexible thin-film approaches are required. To ensure the OLED's lifetime, the encapsulation material of the OLED package is very important. In this paper, a low melting point alloy (LMPA)-epoxy mixture is introduced into the encapsulation process. A phase separation occurs when the mixture is heated to the melting point of the LMPA, forming a double-line structure between the two substrates: the alloy barrier has extremely low WVTR and OTR, and the epoxy fills potential tiny cracks. In our experiment, a PI film is chosen as the flexible transparent substrate, and Mo and Cu are deposited on the PI film successively. The two metal layers are then patterned by photolithography into the sealing pattern line. The Mo acts as a transition layer between the PI film and the Cu; at the same time, the Cu has good wettability with the LMPA (Sn-58Bi). Finally, the LMPA layer is printed on the pattern and a voltage is applied; the resulting Joule heat melts the LMPA, forming the double-line structure and sealing the OLED package at the same time. In this research, the double-line encapsulating structure of LMPA and epoxy on the PI film is manufactured for flexible OLED encapsulation, and it is investigated whether the encapsulation satisfies the WVTR and OTR requirements for flexible OLEDs.
Keywords: encapsulation, flexible, low melting point alloy, OLED
Procedia PDF Downloads 598
18674 Cadmium Adsorption by Modified Magnetic Biochar
Authors: Chompoonut Chaiyaraksa, Chanida Singbubpha, Kliaothong Angkabkingkaew, Thitikorn Boonyasawin
Abstract:
Heavy metal contamination of the environment is an important problem in Thailand that needs to be addressed urgently, particularly the contamination of water, since it can spread to other environments faster. This research aims to study the adsorption of cadmium ions by unmodified biochar and by sodium dodecyl sulfate modified magnetic biochar derived from Eichhornia crassipes. The adsorbent characteristics were determined by scanning electron microscopy, Fourier transform infrared spectrometry, X-ray diffractometry, and the pH drift method. This study also included a comparison of the adsorption efficiency of both types of biochar, the adsorption isotherms, and the kinetics. The pH value at the point of zero charge of the unmodified biochar and the modified magnetic biochar was 7.40 and 3.00, respectively. The maximum adsorption was reached at pH 8. The equilibrium adsorption time was 5 hours for the unmodified biochar and 1 hour for the modified magnetic biochar. The cadmium adsorption by both adsorbents followed the Freundlich, Temkin, and Dubinin-Radushkevich isotherm models and pseudo-second-order kinetics. The adsorption process was spontaneous at high temperatures and non-spontaneous at low temperatures. It was an endothermic process, physisorption in nature, and can occur naturally.
Keywords: Eichhornia crassipes, magnetic biochar, sodium dodecyl sulfate, water treatment
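The two standard models named above, pseudo-second-order kinetics, q(t) = k2*qe^2*t / (1 + k2*qe*t), and the Freundlich isotherm, qe = KF*Ce^(1/n), can be fitted by nonlinear least squares as sketched below. The data arrays are hypothetical placeholders, not the measured Cd(II) uptake values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k2):
    return (k2 * qe ** 2 * t) / (1.0 + k2 * qe * t)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

# Hypothetical kinetic data: contact time (min) vs uptake q_t (mg/g)
t = np.array([5, 10, 20, 40, 60, 120, 180, 300], dtype=float)
q_t = np.array([3.1, 5.2, 7.8, 10.1, 11.2, 12.4, 12.7, 12.9])

# Hypothetical equilibrium data: Ce (mg/L) vs q_e (mg/g)
Ce = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 20.0])
q_e = np.array([2.1, 3.0, 4.9, 6.6, 9.0, 12.1])

(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, q_t, p0=[q_t.max(), 0.01])
(KF_fit, n_fit), _ = curve_fit(freundlich, Ce, q_e, p0=[3.0, 2.0])

print(f"pseudo-second-order: qe = {qe_fit:.2f} mg/g, k2 = {k2_fit:.4f} g/(mg.min)")
print(f"Freundlich: KF = {KF_fit:.2f}, 1/n = {1.0 / n_fit:.2f}")
```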
Procedia PDF Downloads 172
18673 A Randomised Controlled Trial and Process Evaluation of the Lifestart Parenting Programme
Authors: Sharon Millen, Sarah Miller, Laura Dunne, Clare McGeady, Laura Neeson
Abstract:
This paper presents the findings from a randomised controlled trial (RCT) and process evaluation of the Lifestart parenting programme. Lifestart is a structured child-centred programme of information and practical activity for parents of children aged from birth to five years. It is delivered to parents in their own homes by trained, paid family visitors and it is offered to parents regardless of their social, economic or other circumstances. The RCT evaluated the effectiveness of the programme and the process evaluation documented programme delivery and included a qualitative exploration of parent and child outcomes. 424 parents and children participated in the RCT: 216 in the intervention group and 208 in the control group across the island of Ireland. Parent outcomes included: parental knowledge of child development, parental efficacy, stress, social support, parenting skills and embeddedness in the community. Child outcomes included cognitive, language and motor development and social-emotional and behavioural development. Both groups were tested at baseline (when children were less than 1 year old), mid-point (aged 3) and at post-test (aged 5). Data were collected during a home visit, which took two hours. The process evaluation consisted of interviews with parents (n=16 at baseline and end-point), and focus groups with Lifestart Coordinators (n=9) and Family Visitors (n=24). Quantitative findings from the RCT indicated that, compared to the control group, parents who received the Lifestart programme reported reduced parenting-related stress, increased knowledge of their child's development, and improved confidence in their parenting role. These changes were statistically significant and consistent with the hypothesised pathway of change depicted in the logic model. There was no evidence of any change in parents' embeddedness in the community. Although four of the five child outcomes showed small positive change for children who took part in the programme, these were not statistically significant and there is no evidence that the programme improves child cognitive and non-cognitive skills by immediate post-test. The qualitative process evaluation highlighted important challenges related to conducting trials of this magnitude and design in the general population. Parents reported that a key incentive to take part in the study was receiving feedback from the developmental assessment, which formed part of the data collection. This highlights the potential importance of appropriate incentives in relation to recruitment and retention of participants. The interviews with intervention parents indicated that one of the first changes they experienced as a result of the Lifestart programme was increased knowledge and confidence in their parenting ability. The outcomes and pathways perceived by parents and described in the interviews are also consistent with the findings of the RCT and the theory of change underpinning the programme. This theory hypothesises that improvements in parental outcomes, arising as a consequence of the programme, mediate the change in child outcomes. Parents receiving the Lifestart programme reported great satisfaction with and commitment to the programme, with the role of the Family Visitor being identified as one of the key components of the programme.
Keywords: parent-child relationship, parental self-efficacy, parental stress, school readiness
Procedia PDF Downloads 444
18672 Transport and Mixing Phenomena Developed by Vortex Formation in Flow around Airfoil Using Lagrangian Coherent Structures
Authors: Riaz Ahmad, Jiazhong Zhang, Asma Farooqi
Abstract:
In this study, mass transport between separation bubbles and the flow around a two-dimensional airfoil is numerically investigated using Lagrangian Coherent Structures (LCSs). The Finite-Time Lyapunov Exponent (FTLE) technique is used in the computation to identify invariant manifolds and LCSs. Moreover, the Characteristic Based Split (CBS) scheme combined with a dual time-stepping technique is applied to simulate such transient flow at low Reynolds number. We then investigate the evolution of vortex structures during the transport process with the aid of LCSs. To explore vortex formation at the surface of the airfoil, the dynamics of the separatrix, which is formed by the combination of stable and unstable manifolds, is also taken into account. The Lagrangian analysis gives a detailed understanding of vortex dynamics and separation bubbles, which play a significant role in exploring the performance of the unsteady flow generated by the airfoil. The transport process and flow separation phenomena are studied extensively to analyze the flow pattern from a Lagrangian point of view.
Keywords: transport phenomena, CBS Method, vortex formation, Lagrangian Coherent Structures
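The FTLE computation underlying LCS extraction can be sketched as follows: advect a grid of tracers through a velocity field, take finite-difference gradients of the resulting flow map, form the Cauchy-Green tensor, and keep the log of its largest eigenvalue. The analytic double-gyre field below stands in for the simulated airfoil flow, which is not reproducible from the abstract; the parameters and integration time are illustrative.

```python
import numpy as np

A, eps, om = 0.1, 0.25, 2.0 * np.pi / 10.0   # standard double-gyre parameters

def velocity(x, y, t):
    a, b = eps * np.sin(om * t), 1.0 - 2.0 * eps * np.sin(om * t)
    f = a * x ** 2 + b * x
    dfdx = 2.0 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def flow_map(x0, y0, t0, T, n_steps=200):
    """Integrate tracers from t0 to t0+T with an explicit midpoint (RK2) scheme."""
    x, y, t = x0.copy(), y0.copy(), t0
    dt = T / n_steps
    for _ in range(n_steps):
        u1, v1 = velocity(x, y, t)
        u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
        x, y, t = x + dt * u2, y + dt * v2, t + dt
    return x, y

nx, ny, T = 201, 101, 15.0
X, Y = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
Xf, Yf = flow_map(X, Y, 0.0, T)

dx, dy = 2.0 / (nx - 1), 1.0 / (ny - 1)
dXdx, dXdy = np.gradient(Xf, dx, axis=1), np.gradient(Xf, dy, axis=0)
dYdx, dYdy = np.gradient(Yf, dx, axis=1), np.gradient(Yf, dy, axis=0)

# Largest eigenvalue of C = F^T F for the 2x2 flow-map gradient at every node
a = dXdx ** 2 + dYdx ** 2
b = dXdx * dXdy + dYdx * dYdy
d = dXdy ** 2 + dYdy ** 2
lam_max = 0.5 * (a + d) + np.sqrt(0.25 * (a - d) ** 2 + b ** 2)
ftle = np.log(np.sqrt(lam_max)) / abs(T)   # FTLE = (1/|T|) ln sqrt(lambda_max)

print("max forward-time FTLE:", float(ftle.max()))
```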
Procedia PDF Downloads 139
18671 Effect of Tube Backward Extrusion (TBE) Process on the Microstructure and Mechanical Properties of AZ31 Magnesium Alloy
Authors: H. Abdolvand, M. Riazat, H. Sohrabi, G. Faraji
Abstract:
An experimental investigation of the Tube Backward Extrusion (TBE) process on AZ31 magnesium alloy is presented. The microstructures and grain size distributions of the specimens before and after the TBE process are investigated by optical microscopy. Tensile and Vickers microhardness tests along the extrusion direction were performed at room temperature. It is found that the average grain size is refined remarkably from the initial 33 µm down to 3.5 µm after the TBE process. Also, the microhardness increased significantly from an initial value of 36 HV to 58 HV after the process.
Keywords: tube backward extrusion, AZ31, grain size distribution, grain refinement
Procedia PDF Downloads 499