Search results for: process discovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15306

15096 Optimised Path Recommendation for a Real Time Process

Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa

Abstract:

A traditional execution process follows the path of execution drawn by the process analyst without observing resource behaviour or other real-time constraints. Identifying the process model, predicting resource behaviour, and recommending the optimal path of execution for a real-time process is challenging. The proposed AlfyMiner (αyMiner) adds a new dimension to process execution with two novel techniques, the Process Model Analyser (PMAMiner) and the Resource Behaviour Analyser (RBAMiner), for recommending the probable path of execution. PMAMiner discovers the next probable activity for the currently executing activity in an online process: a variant-matching technique identifies the set of candidate next activities, from which the next probable activity is selected using a decision tree model. RBAMiner identifies the resource best suited to perform the discovered activity by observing its behaviour in terms of load and performance, modelled with polynomial regression, and waiting time, modelled with queueing theory. Based on the observed behaviour, αyMiner recommends the probable path of execution: the next probable activity and the best-suited resource to perform it. Experiments conducted on process logs of the CoSeLoG Project achieved 72% accuracy in identifying and recommending the next probable activity, and resource performance was optimised by 59% through load reduction.
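The waiting-time observation RBAMiner relies on can be illustrated with elementary queueing theory. The abstract does not specify which queueing model is used, so the sketch below assumes the simplest one, an M/M/1 queue with arrival rate lam and service rate mu; the numbers are illustrative, not from the paper.

```python
# Hedged sketch: RBAMiner observes a resource's waiting time via
# queueing theory. The paper does not name its model; an M/M/1 queue
# (Poisson arrivals at rate lam, exponential service at rate mu) is
# the simplest illustration.

def mm1_waiting_time(lam: float, mu: float) -> float:
    """Expected time a case waits in queue before service (Wq)."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be < service rate")
    rho = lam / mu                  # resource utilisation
    return rho / (mu - lam)         # Wq = rho / (mu - lam) = lam / (mu * (mu - lam))

# A resource completing 10 cases/hour with 8 cases/hour arriving:
wq = mm1_waiting_time(lam=8.0, mu=10.0)   # 0.8 / 2.0 = 0.4 hours of queueing
```

A recommender in the spirit of αyMiner would compare such waiting-time estimates across candidate resources and pick the one minimising delay.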

Keywords: cross-organization process mining, process behaviour, path of execution, polynomial regression model

Procedia PDF Downloads 311
15095 Effect of Impurities in the Chlorination Process of TiO2

Authors: Seok Hong Min, Tae Kwon Ha

Abstract:

With the increasing interest in Ti alloys, the extraction of Ti from its typical ore, TiO2, has long been, and will remain, an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed from a high-grade TiO2 feedstock. The purity of TiCl4 after chlorination depends on the quality of the titanium feedstock. Since impurities in the TiCl4 product are carried over to the final products, purification of the crude TiCl4 is required. The purification process includes fractional distillation and chemical treatment, depending on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, the first step in the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermodynamic software.

Keywords: rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage

Procedia PDF Downloads 284
15094 Controlling the Process of a Chicken Dressing Plant through Statistical Process Control

Authors: Jasper Kevin C. Dionisio, Denise Mae M. Unsay

Abstract:

In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality are achieved in the organization. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small-intestine processing of a chicken dressing plant through Statistical Process Control (SPC). Since the operation employs no standard procedure and has no established standard time, the process, assessed through the observed time of the overall small-intestine processing operation using an X-Bar R control chart, is found to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected using the Westinghouse Rating System. Instead of relying on the traditional motion and time study alone, the researchers used the X-Bar R control chart to determine the process average on which the standard time is based. The observed times of the normal operator were recorded and plotted on the X-Bar R control chart. Out-of-control points due to assignable causes were removed, and the process average, i.e. the average time in which the normal operator performed the process, now in control and free from outliers, was obtained. This process average was then used to determine the standard time of small-intestine processing. The researchers recommend implementing the established standard time together with the standard procedure adopted from the normal operator. With this recommendation, the whole operation is expected to achieve a 45.54% increase in productivity.
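The X-Bar R computation described above can be sketched as follows. The control-limit constants A2, D3, D4 are the standard tabulated values for subgroups of five; the observed-time data are illustrative stand-ins, not figures from the study.

```python
# Hedged sketch of the X-Bar R method: control limits from subgroup
# means and ranges using the standard constants for subgroup size n = 5.
# The time data below are invented for illustration.
from statistics import mean

A2, D3, D4 = 0.577, 0.0, 2.114            # standard chart constants for n = 5

def xbar_r_limits(subgroups):
    """Return ((LCL_x, Xbar, UCL_x), (LCL_r, Rbar, UCL_r))."""
    xbars = [mean(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    xbar, rbar = mean(xbars), mean(ranges)
    x_limits = (xbar - A2 * rbar, xbar, xbar + A2 * rbar)
    r_limits = (D3 * rbar, rbar, D4 * rbar)
    return x_limits, r_limits

# Observed times (minutes) for five-piece subgroups, illustrative only:
times = [[4.1, 4.3, 3.9, 4.0, 4.2],
         [4.4, 4.1, 4.0, 4.3, 4.2],
         [3.8, 4.0, 4.1, 3.9, 4.0]]
(x_lcl, x_bar, x_ucl), (r_lcl, r_bar, r_ucl) = xbar_r_limits(times)
# Points outside (x_lcl, x_ucl) would be removed as assignable-cause
# variation before taking x_bar as the process average (standard time base).
```

Repeating the limit calculation after removing out-of-control subgroups yields the in-control process average the study uses as the basis of the standard time.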

Keywords: motion and time study, process controlling, statistical process control, X-Bar R Control chart

Procedia PDF Downloads 187
15093 Applying the EzRAD Method for SNP Discovery in the Population Genetics of Freshwater and Marine Fish in the South of Vietnam

Authors: Quyen Vu Dang Ha, Oanh Truong Thi, Thuoc Tran Linh, Kent Carpenter, Thinh Doan Vu, Binh Dang Thuy

Abstract:

Enzyme restriction site associated DNA sequencing (EzRAD) has recently emerged as a promising genomic approach for exploring fish genetic diversity on a genome-wide scale. It is a simplified method for genomic genotyping in non-model organisms, applied here for SNP discovery in the population genetics of freshwater and marine fish in the South of Vietnam. Regional-scale differentiation of a commercial freshwater fish (the smallscale croaker Boesemania microlepis) and a marine fish (the emperor Lethrinus lentjan) was examined. Samples were collected along the Hau River and the coastal areas of southern and central Vietnam. EzRAD libraries were prepared from genomic DNA digested with MboI and Sau3AI, using 52 DNA samples of Boesemania microlepis from Tra Vinh and An Giang Provinces and 34 DNA samples of Lethrinus lentjan from Phu Quoc, Nha Trang, and Da Nang Provinces. A pooled sample of regional EzRAD libraries was sequenced on the Illumina HiSeq 2500 platform. For Boesemania microlepis, small-scale population differentiation from upstream to downstream of the Hau River was detected: the An Giang population exhibited less genetic diversity (14 to 926 SNPs per individual) than the Tra Vinh population (11 to 2172). For Lethrinus lentjan, the results showed a minor difference between populations of the northern and southern Mekong River. The numbers of contigs and SNPs vary from 1315 to 2455 and from 7122 to 8594, respectively (P ≤ 0.01). This preliminary study reveals a regional-scale population disconnection, probably reflecting environmental change. Additional sampling and EzRAD libraries are needed to support resource management in the Mekong Delta.

Keywords: Boesemania microlepis, EzRAD, Lethrinus lentjan, SNPs

Procedia PDF Downloads 477
15092 Reclaiming the Lost Jewish Identity of a Second Generation Holocaust Survivor Raised as a Christian: The Role of Art and Art Therapy

Authors: Bambi Ward

Abstract:

Children of Holocaust survivors have been described as inheriting their parents' trauma as a result of 'vicarious memory'. The term refers to a process whereby second generation Holocaust survivors subconsciously remember aspects of Holocaust trauma, despite not having directly experienced it. This can occur even when there has been a conspiracy of silence in which survivors chose not to discuss the Holocaust with their children. There are still people born in various parts of the world, such as Poland, Hungary, other parts of Europe, the USA, Canada, and Australia, who have only learnt of their Jewish roots as adults. This discovery may occur during a parent's deathbed confession, or when an adult child is sorting through the personal belongings of a deceased family member. Some Holocaust survivors chose to deny their Jewish heritage and raise their children as Christians. Reasons for this decision include the trauma experienced during the Holocaust for simply being Jewish, the existence of anti-Semitism, and the desire to protect oneself and one's family. Although considerable literature has been written about the transgenerational impact of trauma on children of Holocaust survivors, there has been little scholarly investigation into the effects of a hidden Jewish identity on these children. This paper presents a case study of an adult child of Hungarian Holocaust survivors who was raised as a Christian. At the age of eight she was told about her family's Jewish background, but her parents insisted that she keep this a secret, even if asked directly. She honoured their request until she turned forty. By that time she had started the challenging process of reclaiming her Jewish identity. The paper outlines the tension between family loyalty and individual freedom, and discusses the role that art and art therapy played in assisting the subject of the case study to reclaim her Jewish identity and commence writing a memoir about her spiritual journey.
The main methodology used in this case study is creative practice-led research. Particular attention is paid to the utilisation of an autoethnographic approach. The autoethnographic tools used include reflective journals of the subject of the case study. These journals reflect on the subject's collection of autobiographical data relating to her family history, and include memories, drawings, products of art therapy, diaries, letters, photographs, home movies, objects, and oral history interviews with her mother. The case study illustrates how art and art therapy benefitted a second generation Holocaust survivor who was brought up having to suppress her Jewish identity. The process allowed her to express subconscious thoughts and feelings about her identity and free herself from the burden of the long-term secret she had been carrying. The process described may also be of assistance to other traumatised people who have been trying to break the silence and who are seeking to express themselves in a positive and healing way.

Keywords: art, hidden identity, holocaust, silence

Procedia PDF Downloads 218
15091 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability is very important. Process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for predicting the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel representation of the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the parameters influencing process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
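The core idea, reading the depth and width of cut directly off the material removal step, can be sketched on a drastically simplified single-row dexel (height-field) workpiece with a flat-end tool. The grid spacing, tool geometry, and stock values below are invented for illustration; the paper's multi-dexel model is three-directional and far more general.

```python
# Hedged sketch: intersect an analytically described flat-end tool with
# a 1-D dexel (height-field) workpiece and read off the axial depth of
# cut and radial engagement width directly from the removed material,
# instead of the removed volume. All dimensions are illustrative.

DX = 0.1                                   # dexel spacing in mm (assumed)

def cut(heights, tool_x, tool_r, tool_z):
    """Remove material under a flat-end tool; return (depth, width) of cut."""
    depth, engaged = 0.0, 0
    for i, h in enumerate(heights):
        x = i * DX
        if abs(x - tool_x) <= tool_r and h > tool_z:   # dexel under the tool
            depth = max(depth, h - tool_z)             # axial depth of cut
            engaged += 1
            heights[i] = tool_z                        # material removal
    return depth, engaged * DX                         # width = engaged span

stock = [5.0] * 50                         # 5 mm tall stock, 50 dexels
ap, ae = cut(stock, tool_x=2.0, tool_r=0.5, tool_z=4.0)
# ap = 1.0 mm depth of cut; ae spans the dexels under the 1 mm tool diameter
```

Fed into a stability lobe diagram, such per-step (ap, ae) values are exactly the cutting parameters whose limits the chatter model constrains.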

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 501
15090 Research Opportunities in Business Process Management and Performance Measurement from a Constructivist View

Authors: R. T. O. Lacerda, L. Ensslin, S. R. Ensslin, L. Knoff

Abstract:

This paper aims to identify research opportunities in business process management and performance measurement from a constructivist view. The nature of this research is exploratory and descriptive, and the research method is qualitative. The process narrowed down 2142 articles gathered from scientific databases to 16 articles that were relevant to the research and highly cited. The analysis found that most of the articles use a realist approach, and that there is a need to analyze the decision-making process in a singular manner. The measurement criteria are identified through scientific literature searches, in most cases using ordinal scales without any integration process to present the results to the decision maker. Regarding management aspects, most of the articles do not have a structured process to measure the current situation and to generate improvement opportunities.

Keywords: performance measurement, BPM, decision, research opportunities

Procedia PDF Downloads 290
15089 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis

Authors: Hyun-Woo Cho

Abstract:

Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. The use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effect of different pre-treatment techniques, based on linear or nonlinear reduced spaces, is compared. The fault pattern is extracted in the reduced space, not in the original data space. The results show that the nonlinear diagnosis method produces more reliable results and outperforms the linear method.
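A minimal example of what "diagnosis in a linear reduced space" means: project two-dimensional process measurements onto their dominant principal direction (found here by power iteration on the covariance matrix) and work with that single coordinate. This is a hedged illustration under invented data; the paper's triangular representation and nonlinear variant are not reproduced.

```python
# Hedged sketch: a linear reduced space for 2-D process data, built by
# power iteration on the 2x2 sample covariance matrix. Fault patterns
# would then be examined via the reduced-space coordinate, not the raw
# measurements. The data points are illustrative.
from math import hypot
from statistics import mean

def dominant_direction(data, iters=200):
    """Return (mean point, unit first eigenvector of the 2x2 covariance)."""
    mx, my = mean(p[0] for p in data), mean(p[1] for p in data)
    centered = [(x - mx, y - my) for x, y in data]
    n = len(data)
    cxx = sum(x * x for x, _ in centered) / n
    cyy = sum(y * y for _, y in centered) / n
    cxy = sum(x * y for x, y in centered) / n
    vx, vy = 1.0, 0.0
    for _ in range(iters):                       # power iteration
        wx, wy = cxx * vx + cxy * vy, cxy * vx + cyy * vy
        norm = hypot(wx, wy)
        vx, vy = wx / norm, wy / norm
    return (mx, my), (vx, vy)

normal = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.1)]   # y ~ 2x
(mx, my), (vx, vy) = dominant_direction(normal)

def score(x, y):
    """Coordinate of a new measurement in the 1-D reduced space."""
    return (x - mx) * vx + (y - my) * vy
```

With highly correlated measurements, almost all variation lives along this single direction, which is why diagnosis in the reduced space can be both cheaper and more robust than in the raw data space.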

Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques

Procedia PDF Downloads 362
15088 The Search of the Possibility of Running the Six Sigma Process in an IT Education Center

Authors: Mohammad Amini, Aliakbar Alijarahi

Abstract:

This research, titled 'The Search of the Possibility of Running the Six Sigma Process in an IT Education Center', aims to test the feasibility of running the Six Sigma process in an IT education center. Six Sigma is a well-proven method for reducing process errors. To evaluate the feasibility of running Six Sigma in the IT education center, several variables relevant to the process were selected: the amount of support from top management for the process; the currently available expertise; the ability of the training system to compensate for shortcomings; the degree of match between the current culture and the Six Sigma culture; and the current quality level compared with the quality expected from running Six Sigma. To evaluate these variables, four questions were selected, and a 28-question questionnaire was prepared and distributed in the target population. Since the working environment is highly competitive, the organization needs to reduce errors to a minimum; otherwise it loses its customers. The questionnaire was given to 55 persons, of whom 50 filled in and returned the forms. Analysis of the forms yielded the following results: the IT education center needs to run Six Sigma to improve its process quality, and most of the factors needed to run Six Sigma already exist in the IT education center, although additional support is required.
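For readers unfamiliar with the scale behind the name: a process "sigma level" is conventionally the normal quantile of the process yield plus a 1.5-sigma long-term shift. The sketch below uses that textbook convention, not any figure from the study.

```python
# Hedged aside: how a Six Sigma level is conventionally computed from a
# defect rate (DPMO). The 1.5-sigma shift is the standard textbook
# convention; nothing here comes from the study itself.
from statistics import NormalDist

def sigma_level(defects: int, opportunities: int) -> float:
    """Short-term sigma level from observed defects per opportunities."""
    dpmo = 1_000_000 * defects / opportunities
    yield_frac = 1.0 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_frac) + 1.5   # 1.5-sigma shift convention

# 3.4 defects per million opportunities is the classic 'six sigma' mark:
level = sigma_level(defects=34, opportunities=10_000_000)
```

Tracking this number before and after an improvement cycle is how a Six Sigma initiative would quantify the error reduction the abstract aims at.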

Keywords: education, customer, self-action, quality, continuous improvement process

Procedia PDF Downloads 319
15087 Harnessing Emerging Creative Technology for Knowledge Discovery of Multiwavelength Datasets

Authors: Basiru Amuneni

Abstract:

Astronomy is a domain with a rapid rise in data volume. Traditional tools for data management have been employed in the quest for knowledge discovery; however, these traditional tools become limited in the face of big data. One means of maximizing knowledge discovery for big data is scientific visualisation. The aim of this work is to explore the possibilities offered by the emerging creative technologies of Virtual Reality (VR) systems and game engines to visualise multiwavelength datasets. Game engines are primarily used for developing video games, but their advanced graphics can be exploited for scientific visualisation, which graphically illustrates scientific data to ease human comprehension. Modern astronomy is now in the era of multiwavelength data, where a single galaxy, for example, is captured by telescopes several times at different electromagnetic wavelengths to build a more comprehensive picture of its physical characteristics. Visualising this in an immersive environment is more intuitive and natural for an observer. This work presents a standalone VR application that accesses galaxy FITS files. The application was built using the Unity Game Engine for the graphics and the OpenXR API for the VR infrastructure. The work used the methodology known as Design Science Research (DSR), which entails 'using design as a research method or technique'. The key stages of the galaxy modelling pipeline are FITS data preparation, galaxy modelling, Unity 3D visualisation, and VR display. The FITS data format cannot be read by the Unity Game Engine directly, so a DLL (CSharpFITS), which provides native support for reading and writing FITS files, was used. The galaxy modeller integrates cleaned FITS image pixels into the graphics pipeline of the Unity Game Engine.
The cleaned FITS images are then input to the galaxy modelling phase, where a pre-processing script extracts pixels, computes galaxy world positions, and colour-maps the FITS image pixels. The user can visualise galaxies in different light bands, control the blend of an image with similar images from different sources, or fuse images for a holistic view. The framework will allow users to build tools that realise complex workflows for public outreach, and possibly scientific work, with increased scalability, near-real-time interactivity, and ease of access. The application is presented in an immersive environment and can use any commercially available headset built on the OpenXR API. The user can select galaxies in the scene, teleport to a galaxy, pan, zoom in and out, and change the colour gradients of the galaxy. The findings and design lessons learnt from implementing different use cases will contribute to the development and design of game-based visualisation tools in immersive environments by enabling informed decisions to be made.
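The pre-processing step described above, turning pixel intensities into world positions and colour-map values, can be sketched as follows. The intensity grid, scale factor, and greyscale ramp are illustrative stand-ins for the project's actual Unity/C# pipeline, which is not reproduced here.

```python
# Hedged sketch of the pre-processing step: map a grid of (already
# extracted) FITS pixel intensities to point positions and colour
# values for a rendering pipeline. The toy array and greyscale ramp
# are assumptions, not the project's code.

def pixels_to_points(pixels, scale=1.0):
    """Map a 2-D intensity grid to (x, y, intensity, colour) tuples."""
    flat = [v for row in pixels for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0                       # guard a flat image
    points = []
    for j, row in enumerate(pixels):
        for i, v in enumerate(row):
            t = (v - lo) / span                   # normalised intensity 0..1
            colour = (t, t, t)                    # simple greyscale ramp
            points.append((i * scale, j * scale, v, colour))
    return points

galaxy = [[0.0, 1.0], [2.0, 4.0]]                 # toy 2x2 'image'
pts = pixels_to_points(galaxy, scale=0.5)
# the brightest pixel maps to colour (1.0, 1.0, 1.0)
```

Swapping the greyscale ramp for per-band colour ramps is what would let the user blend or fuse images from different wavelength bands, as the abstract describes.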

Keywords: astronomy, visualisation, multiwavelength datasets, virtual reality

Procedia PDF Downloads 64
15086 Object-Oriented Modeling Simulation and Control of Activated Sludge Process

Authors: J. Fernandez de Canete, P. Del Saz Orozco, I. Garcia-Moral, A. Akhrymenka

Abstract:

Object-oriented modeling is spreading in the current simulation of wastewater treatment plants, using the individual components of the process and their relations to define the underlying dynamic equations. In this paper, we describe the use of the free OpenModelica simulation environment for the object-oriented modeling of an activated sludge process under feedback control. The performance of the controlled system was analyzed both under normal conditions and in the presence of disturbances. The described object-oriented approach represents a valuable tool in teaching and provides practical insight into the field of wastewater process control.
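Under the object-oriented components lies a set of coupled ODEs. A minimal flavour of such a model, a Monod-kinetics pair for biomass X and substrate S integrated with explicit Euler, is sketched below; the kinetic constants and initial values are illustrative assumptions, not parameters from the paper's OpenModelica model.

```python
# Hedged sketch: a minimal activated-sludge-style ODE pair (biomass X,
# substrate S) with Monod growth kinetics, integrated by explicit
# Euler. All parameter values are assumed, for illustration only.

MU_MAX, KS, Y, B = 4.0, 10.0, 0.6, 0.3     # per-day kinetic constants (assumed)

def step(x, s, dt):
    """One Euler step of the Monod growth / substrate consumption pair."""
    mu = MU_MAX * s / (KS + s)             # specific growth rate
    dx = (mu - B) * x                      # biomass growth minus decay
    ds = -(mu / Y) * x                     # substrate consumed for growth
    return x + dt * dx, s + dt * ds

x, s = 100.0, 200.0                        # mg/L initial biomass, substrate
for _ in range(1000):                      # simulate 1 day in 1000 steps
    x, s = step(x, s, dt=0.001)
    s = max(s, 0.0)                        # substrate cannot go negative
# substrate is drawn down as biomass grows
```

In an object-oriented Modelica model, each such equation would live inside a reusable component (reactor, clarifier, controller) connected through typed ports rather than in one monolithic script.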

Keywords: object-oriented programming, activated sludge process, OpenModelica, feedback control

Procedia PDF Downloads 361
15085 Development of a Process to Manufacture High Quality Refined Salt from Crude Solar Salt

Authors: Rathnayaka D. D. T., Vidanage P. W., Wasalathilake K. C., Wickramasingha H. W., Wijayarathne U. P. L., Perera S. A. S.

Abstract:

This paper describes research carried out to develop a process for increasing the NaCl percentage of crude salt obtained from the conventional solar evaporation process. Refined salt was produced from crude solar salt by a chemico-physical method consisting of coagulation, precipitation, and filtration. Crude salt crystals were first crushed and dissolved in water. Optimum amounts of calcium hydroxide, sodium carbonate, and poly-aluminium chloride (PAC) were then added to the solution in turn. The refined NaCl solution was separated out by filtration. The solution was tested for total suspended solids, SO42-, Mg2+, and Ca2+. With the optimum dosage of reagents, the results showed that a purity of 99.60% NaCl could be achieved. The paper further discusses the economic viability of the proposed process: an 83% profit margin can be achieved, an increase of 112.3% compared to the traditional process.

Keywords: chemico-physical, economic, optimum, refined, solar salt

Procedia PDF Downloads 230
15084 Biochemical Characterization and Structure Elucidation of a New Cytochrome P450 Decarboxylase

Authors: Leticia Leandro Rade, Amanda Silva de Sousa, Suman Das, Wesley Generoso, Mayara Chagas Ávila, Plinio Salmazo Vieira, Antonio Bonomi, Gabriela Persinoti, Mario Tyago Murakami, Thomas Michael Makris, Leticia Maria Zanphorlin

Abstract:

Alkenes have economic appeal, especially in the biofuels field, since they are precursors for the production of drop-in biofuels, which have chemical and physical properties similar to conventional fossil fuels and contain no oxygen. Since the discovery in 2011 of the first P450 of the CYP152 family, OleTJE, with its unique property of decarboxylating fatty acids (FA) using hydrogen peroxide as a cofactor and producing 1-alkenes as the main product, scientific and technological interest in this family of enzymes has vastly increased. In this context, the present work reports a new decarboxylase (OleTRN) with low similarity to OleTJE (32%), its biochemical characterization, and its structure elucidation. OleTRN showed a high expression yield and purity, optimum reaction conditions at 35 °C and pH 6.5 to 8.0, and the highest specificity for oleic acid. In addition, structure-guided mutations were performed, and the functional characterizations showed that some mutations altered the specificity and chemoselectivity across FA substrates with chain lengths from 12 to 20 carbons. These results are interesting from a biotechnological perspective, as such characteristics could diversify applications and contribute to designing better cytochrome P450 decarboxylases. Considering that peroxygenases can both decarboxylate and hydroxylate fatty acids, and that the intriguing mechanism behind the decarboxylation preference of OleTJE remains a challenge to elucidate, the OleTRN structure and the functional characterizations of OleTRN and its mutants contribute new information about the CYP152 family. The work also delivers a new decarboxylase with a selectivity profile different from that of OleTJE, which allows a wide range of applications.

Keywords: P450, decarboxylases, alkenes, biofuels

Procedia PDF Downloads 166
15083 Promoting Creative and Critical Thinking in Mathematics

Authors: Ana Maria Reis D'Azevedo Breda, Catarina Maria Neto da Cruz

Abstract:

The Japanese art of origami provides a rich context for designing exploratory mathematical activities for children and young people. By folding a simple sheet of paper, fascinating and surprising planar and spatial configurations emerge. Equally surprising is the unfolding process, which also produces striking patterns. The procedure of folding, unfolding, and folding again allows the exploration of interesting geometric patterns. When adequately and systematically done, we may deduce some of the mathematical rules governing origami. As children and young people fold the sheet of paper repeatedly, they can physically observe how the forms they obtain are transformed and how these relate to the pattern of the corresponding unfolding, creating space for the understanding and discovery of the mathematical principles regulating the folding-unfolding process. As part of a 2023 Summer Academy organized by a Portuguese university, a session entitled 'Folding, Thinking and Generalizing' took place. Twenty-three students attended the session, all enrolled in the 2nd cycle of Portuguese Basic Education and aged between 10 and 12 years old. The main focus of this session was to foster the development of critical cognitive and socio-emotional skills among these young learners using origami. These skills included creativity, critical analysis, mathematical reasoning, collaboration, and communication. Employing a qualitative, descriptive, and interpretative analysis of data collected during the session through field notes and students' written productions, our findings reveal that structured origami-based activities not only promote student engagement with mathematical concepts in a playful and interactive way but also facilitate the development of socio-emotional skills, including collaboration and effective communication between participants.
This research highlights the value of integrating origami into educational practice, underscoring its role in supporting comprehensive cognitive and emotional learning experiences.

Keywords: skills, origami rules, active learning, hands-on activities

Procedia PDF Downloads 46
15082 Carrying Out the Steps of Decision Making Process in Concrete Organization

Authors: Eva Štěpánková

Abstract:

The decision-making process is clearly defined in theory. Generally, it includes problem identification and analysis, data gathering, setting goals and criteria, developing alternatives, choosing the optimal alternative, and implementing it. In practice, however, various modifications of the theoretical decision-making process can occur. Managers may consider some of the phases too complicated or unfeasible and thus not carry them out, while conversely overestimating other steps. The aim of the paper is to reveal and characterize how managers perceive the individual phases of the decision-making process. The research concerns managers in the military environment, namely commanders. A quantitative survey covers, cross-sectionally, the individual levels of management of the Ministry of Defence of the Czech Republic. Based on a total of 135 respondents, the analysis identifies which phases of the decision-making process are problematic or not carried out in practice, and which are perceived as the easiest. The reasons behind these findings are then examined.

Keywords: decision making, decision making process, decision problems, concrete organization

Procedia PDF Downloads 441
15081 Research on Straightening Process Model Based on Iteration and Self-Learning

Authors: Hong Lu, Xiong Xiao

Abstract:

Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, and they must be straightened to meet straightness requirements. For the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between the straightening load and the deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Using this mathematical model, an iterative method is applied to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise because it adapts to changes in material performance parameters. Considering that this straightening method is widely used in the mass production of shaft parts, a knowledge base is used to store the data of the straightening process, and a straightening stroke algorithm based on empirical data is set up. The paper thus establishes a straightening process control model that combines the iteration-based straightening stroke method with the empirical-data-based straightening stroke algorithm. Finally, an experiment is designed to verify the straightening process control model.
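The iterative stroke solution can be illustrated with a deliberately simple load-deflection model: if a fraction k of the stroke springs back elastically, the residual deflection after pressing with stroke s is r(s) = d - (1 - k)s, and fixed-point iteration refines s until the residual meets the straightness tolerance. The linear springback model is an assumption for illustration; the paper's model accounts for varying material parameters.

```python
# Hedged sketch of an iterative straightening-stroke solution. The
# linear springback model r(s) = d - (1 - k) * s is an illustrative
# simplification of the paper's load-deflection relationship.

def straightening_stroke(deflection, springback_ratio, tol=1e-6, max_iter=200):
    """Iteratively find the press stroke that cancels the deflection."""
    s = deflection                              # naive first guess: press by d
    for _ in range(max_iter):
        residual = deflection - (1.0 - springback_ratio) * s
        if abs(residual) < tol:
            return s
        s += residual                           # next-iteration correction
    raise RuntimeError("did not converge within max_iter")

# 0.8 mm bend with 25% elastic springback -> stroke of about 0.8 / 0.75 mm
stroke = straightening_stroke(deflection=0.8, springback_ratio=0.25)
```

In the paper's scheme, the springback ratio would not be a fixed constant but would be updated from the knowledge base of measured strokes and residuals, which is what lets the algorithm adapt to material variation.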

Keywords: straightness, straightening stroke, deflection, shaft parts

Procedia PDF Downloads 305
15080 Removal of Toxic Ni++ Ions from Wastewater by Nano-Bentonite

Authors: A. M. Ahmed, Mona A. Darwish

Abstract:

The removal of Ni++ ions from aqueous solution by sorption onto nano-bentonite was investigated. Experiments were carried out as a function of the amount of nano-bentonite, pH, metal concentration, contact time, agitation speed, and temperature. The adsorption process fits pseudo-second-order kinetic models. Langmuir and Freundlich adsorption isotherm models were applied to analyze the adsorption data, and both were found to be applicable. Thermodynamic parameters, e.g., ΔG°, ΔS°, and ΔH°, of the adsorption process were also calculated, and the sorption process was found to be endothermic. Nano-bentonite was thus found to be effective for the removal of Ni(II) under the experimental conditions studied.

Keywords: waste water, nickel, bentonite, adsorption

Procedia PDF Downloads 230
15079 Assessment of Factors Influencing Business Process Harmonization: A Case Study in an Industrial Company

Authors: J. J. M. Trienekens, H. L. Romero, L. Cuenca

Abstract:

While process harmonization is increasingly mentioned and unanimously associated with several benefits, more understanding is needed of how it contributes to business process redesign and improvement. This paper presents the application, in an industrial case study, of a conceptual harmonization model of the relationship between the drivers and effects of process harmonization. The drivers are called contextual factors, which influence harmonization. Assessing these contextual factors in a particular business domain clarifies the extent of harmonization that can be achieved, or that should be strived for. The case study shows how the conceptual harmonization model can be made operational and can act as a valuable assessment tool. From both qualitative and some quantitative assessment results, insights are discussed on the extent of harmonization that can be achieved, and action plans are defined for business (process) harmonization.

Keywords: case study, contextual factors, process harmonization, industrial company

Procedia PDF Downloads 372
15078 Selecting the Best Software Product Using Analytic Hierarchy Process and Fuzzy-Analytic Hierarchy Process Modules

Authors: Anas Hourani, Batool Ahmad

Abstract:

Software applications play an important role in any institute. They are employed to manage all processes and store entity-related data on computers. Therefore, choosing the right software product that meets an institute's requirements is not an easy decision, given the multiple criteria, different points of view, and many standards to consider. As a case study, Mutah University, located in Jordan, is in essential need of customized software, and several companies presented software products that are very similar in quality. In this regard, an analytic hierarchy process (AHP) model and a fuzzy analytic hierarchy process (Fuzzy-AHP) model are proposed in this research to identify the most suitable and best-fit software product that meets the institute's requirements. The results indicate that both models are able to help decision-makers reach a decision, especially in complex decision problems.
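The AHP step underlying both models can be sketched as follows: derive priority weights from a pairwise-comparison matrix by the geometric-mean method and check the consistency ratio (CR below 0.1 is the usual threshold). The example matrix comparing three software products on one criterion is invented, not taken from the study.

```python
# Hedged sketch of an AHP priority computation: geometric-mean weights
# from a pairwise-comparison matrix plus the standard consistency
# check. The example comparison values are illustrative.
from math import prod

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # random consistency indices

def ahp_weights(m):
    """Return (priority weights, consistency ratio) for pairwise matrix m."""
    n = len(m)
    gm = [prod(row) ** (1.0 / n) for row in m]       # row geometric means
    total = sum(gm)
    w = [g / total for g in gm]
    # lambda_max estimate: average of (A w)_i / w_i
    aw = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, (ci / RI[n] if RI[n] else 0.0)         # CR undefined for n < 3

# Product A moderately preferred to B (3) and strongly to C (5):
pairwise = [[1,     3,   5],
            [1 / 3, 1,   2],
            [1 / 5, 1 / 2, 1]]
weights, cr = ahp_weights(pairwise)   # weights sum to 1; CR well below 0.1
```

The Fuzzy-AHP variant replaces the crisp comparison values with triangular fuzzy numbers before deriving the weights, which is the extra machinery the abstract's second model adds.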

Keywords: analytic hierarchy process, decision modeling, fuzzy analytic hierarchy process, software product

Procedia PDF Downloads 355
15077 Tripeptide Inhibitor: The Simplest Aminogenic PEGylated Drug against Amyloid Beta Peptide Fibrillation

Authors: Sutapa Som Chaudhury, Chitrangada Das Mukhopadhyay

Abstract:

Alzheimer’s disease has been a well-known form of dementia since its discovery in 1906. Current Food and Drug Administration approved medications, e.g., cholinesterase inhibitors and memantine, offer modest symptomatic relief but do not modify the disease or promote recovery. In the last three decades, many small molecules, chaperones, synthetic peptides, and partial β-secretase enzyme blockers have been tested for the development of a drug against Alzheimer’s, but none passed phase 3 clinical trials. In this study, we designed a PEGylated, aminogenic, tripeptidic polymer with two different molecular weights, based on the aggregation-prone amino acid sequence 17-20 of amyloid beta (Aβ) 1-42. Being conjugated with poly(ethylene glycol) (PEG), which self-assembles into hydrophilic nanoparticles, these PEGylated tripeptides constitute a very good drug delivery system that crosses the blood-brain barrier while the peptide remains protected from proteolytic degradation and non-specific protein interactions. Moreover, being completely aminogenic, they would not raise any side effects. The peptide inhibitors were evaluated for their effectiveness against Aβ42 fibrillation, both at the early stage of oligomer-to-fibril formation and for preformed fibril clearance, via thioflavin T (ThT) assay, dynamic light scattering analyses, atomic force microscopy, and scanning electron microscopy. The inhibitors proved safe at a concentration as high as 20 µM in the reduction assay of 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) dye. Moreover, SH-SY5Y neuroblastoma cells showed greater survivability when treated with the inhibitors following Aβ42 fibril and oligomer treatment, as compared with control Aβ42 fibril- and/or oligomer-treated neuroblastoma cells. These results make the peptidic inhibitors promising candidates in the search for alternative medication for Alzheimer’s disease.

Keywords: Alzheimer’s disease, alternative medication, amyloid beta, PEGylated peptide

Procedia PDF Downloads 187
15076 Potential of Mineral Composition Reconstruction for Monitoring the Performance of an Iron Ore Concentration Plant

Authors: Maryam Sadeghi, Claude Bazin, Daniel Hodouin, Laura Perez Barnuevo

Abstract:

The performance of a separation process is usually evaluated using performance indices calculated from elemental assays readily available from the chemical analysis laboratory. However, separation performance is essentially related to the properties of the minerals that carry the elements, not to those of the elements themselves. Since elements or metals can be carried by both valuable and gangue minerals in the ore, and since each mineral responds differently to a given mineral processing method, using only elemental assays can lead to erroneous or uncertain conclusions about process performance. This paper discusses the advantages of using performance indices calculated from mineral content, such as mineral recovery, for process performance assessment. A method is presented that uses elemental assays to estimate the mineral content of the solids in various process streams. The method combines the stoichiometric composition of the minerals with mass-conservation constraints for the minerals through the concentration process to estimate mineral content from elemental assays. The advantage of assessing a concentration process using mineral-based performance indices is illustrated for an iron ore concentration circuit.
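The mapping from elemental assays to mineral content described above is, at its simplest, a linear inversion: a stoichiometric matrix times the unknown mineral fractions should reproduce the measured assays. The two-mineral sketch below (hematite and quartz, with their theoretical Fe and Si mass fractions) is an illustration only; a real circuit would involve more elements and minerals, plus the mass-conservation and data-reconciliation constraints the paper mentions.

```python
import numpy as np

# Stoichiometric matrix S: mass fraction of each element (rows: Fe, Si)
# in each mineral (columns: hematite Fe2O3, quartz SiO2).
# Theoretical values, used here purely for illustration.
S = np.array([
    [0.6994, 0.0],     # Fe content of hematite and of quartz
    [0.0,    0.4674],  # Si content of hematite and of quartz
])

# Elemental assays (mass fractions of Fe and Si) for one process stream,
# consistent with a 60/40 hematite/quartz mixture.
assay = np.array([0.4196, 0.1870])

# Least-squares estimate of the mineral content vector m with S @ m ≈ assay.
m, *_ = np.linalg.lstsq(S, assay, rcond=None)
print(m)  # estimated mass fractions of hematite and quartz
```

With more elements than minerals the system is overdetermined and the least-squares fit also absorbs assay noise, which is exactly where the reconciliation constraints become useful.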

Keywords: data reconciliation, iron ore concentration, mineral composition, process performance assessment

Procedia PDF Downloads 184
15075 Revealing the Genome Based Biosynthetic Potential of a Streptomyces sp. Isolate BR123 Presenting Broad Spectrum Antimicrobial Activities

Authors: Neelma Ashraf

Abstract:

Actinomycetes, particularly the genus Streptomyces, are of great importance due to their role in the discovery of new natural products, particularly antimicrobial secondary metabolites, in medicinal science and the biotechnology industry. Different Streptomyces strains were isolated from Helianthus annuus plants and tested for antibacterial and antifungal activities. The five most promising strains were chosen for further investigation, and growth conditions for antibiotic synthesis were optimised. The supernatants were extracted in different solvents, and the extracted products were analyzed using liquid chromatography-mass spectrometry (LC-MS) and biological testing. From one of the potent strains, Streptomyces globusus sp. BR123, the compound lavendamycin was identified using these analytical techniques. In addition, this strain produces a strong antifungal polyene compound with a quasimolecular ion of 2072. Because of its promising antimicrobial potential, Streptomyces sp. BR123 was genome sequenced in order to identify the gene cluster responsible for lavendamycin, and the genome analysis yielded candidate genes for the production of this compound. A genome sequence of 8.15 Mb, with a GC content of 72.63% and 8103 protein-coding genes, was obtained for Streptomyces sp. isolate BR123. Multiple biosynthetic gene clusters predicted by in silico analysis pointed to antimicrobial, antiparasitic, and anticancerous compounds, and the low resemblance of these clusters to known biosynthetic gene clusters suggests the novelty of the corresponding metabolites. The current study gives insight into the bioactive potential of Streptomyces sp. isolate BR123 with respect to the synthesis of bioactive secondary metabolites, through genomic and spectrometric analyses. Moreover, the comparative genome study revealed the connection of isolate BR123 with other Streptomyces strains, which could expand the knowledge of this genus and of the mechanisms involved in the discovery of new antimicrobial metabolites.

Keywords: streptomyces, secondary metabolites, genome, biosynthetic gene clusters, high performance liquid chromatography, mass spectrometry

Procedia PDF Downloads 47
15074 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data

Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder

Abstract:

Problem and Purpose: Intelligent systems are available and helpful for supporting human decision-making, especially when complex surgical eye interventions are necessary and must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided by explanation and logical reasoning processes. The interview-based acquisition and generation of the complex knowledge itself is crucial, because there are many correlations between the complex parameters. In this project, therefore, (semi-)automated self-learning methods are researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real patients in a hospital, advanced data mining procedures are very helpful. In particular, subgroup analysis methods are developed, extended, and used to analyze and uncover the correlations and conditional dependencies between the structured patient data. After causal dependencies are found, a ranking must be performed for the generation of rule-based representations. To this end, anonymous patient data are transformed into a special machine language format. The imported data serve as input to conditional probability algorithms that calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications could be applied to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances, and the patient-specific history through a dependency ranking process. After transformation into association rules, logically based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets take account of about 80 parameters as characteristic features per patient. For patient groups of different sizes (100, 300, 500), both single-target and multi-target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted with regard to their dependence on, or independence of, patient number. Conclusions: The aim and advantage of such a semi-automated self-learning process are the extension of the knowledge base by finding new parameter correlations. The discovered knowledge is transformed into association rules and serves as the rule-based representation of the knowledge in the knowledge base. Furthermore, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises, as well as conjunctively associated conditions, can be found from which to conclude the goal parameter of interest. Knowledge hidden in structured tables or lists can thus be extracted as a rule-based representation, which is a real assistance power for communication with the clinical experts.
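The ranking of premises for a goal parameter, as described above, boils down to computing support and confidence for candidate association rules over the patient records. The toy records below use hypothetical placeholder attribute names (not real clinical parameters from the study); they only illustrate the support/confidence arithmetic behind the dependency ranking.

```python
# Toy anonymized patient records: each record is a set of observed attributes.
# Attribute names are hypothetical placeholders, not the study's parameters.
records = [
    {"instrument_A", "history_X", "outcome_good"},
    {"instrument_A", "history_X", "outcome_good"},
    {"instrument_A", "history_Y", "outcome_poor"},
    {"instrument_B", "history_X", "outcome_good"},
    {"instrument_B", "history_Y", "outcome_poor"},
]

def support(itemset, records):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

def confidence(premise, conclusion, records):
    """Conditional probability of the conclusion given the premise."""
    return support(premise | conclusion, records) / support(premise, records)

# Rank single-attribute premises for the goal parameter "outcome_good".
goal = {"outcome_good"}
candidates = ["instrument_A", "instrument_B", "history_X", "history_Y"]
rules = sorted(
    ((confidence({item}, goal, records), item) for item in candidates),
    reverse=True,
)
print(rules)  # strongest premise for the goal parameter comes first
```

Conjunctive premises (pairs or triples of attributes) would be ranked the same way, at the cost of enumerating the larger itemsets.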

Keywords: expert system, knowledge-based support, ophthalmic decision support, self-learning methods

Procedia PDF Downloads 234
15073 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

In city governance, various data are involved, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form, and their sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused. Fusing data that come from different sources and are collected in different ways raises several issues that need to be resolved. Problems in data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and their comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. Metadata need to be consulted whenever an application accesses, manipulates, or displays the data; uniform metadata management ensures the effectiveness and consistency of data throughout data exchange, data modeling, data cleansing, data loading, data storage, data analysis, data search, and data delivery.

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 356
15072 Study the Effect of Friction on Barreling Behavior during Upsetting Process Using Anand Model

Authors: H. Mohammadi Majd, M. Jalali Azizpour, V. Tavaf, A. Jaderi

Abstract:

In upsetting processes, contact friction significantly influences metal flow, the stress-strain state, and process parameters. Furthermore, tribological conditions influence workpiece deformation and its dimensional precision. A viscoplastic constitutive law, the Anand model, was applied to represent the inelastic deformation behavior in the upsetting process within a finite element framework. This paper presents research results on the influence of the contact friction coefficient on workpiece deformation in the upsetting process. The technique was tested in simulations of three different specimens and the corresponding materials, and can be successfully employed to predict the deformation of the upsetting process.

Keywords: friction, upsetting, barreling, Anand model

Procedia PDF Downloads 310
15071 Chitosan Modified Halloysite Nanomaterials for Efficient and Effective Vaccine Delivery in Farmed Fish

Authors: Saji George, Eng Khuan Seng, Christof Luda

Abstract:

Nanotechnology has been recognized as an important tool for modern agriculture and has the potential to overcome some of the pressing challenges faced by the aquaculture industry. A strategy for optimizing a nanotechnology-based therapeutic delivery platform for immunizing farmed fish was developed. Accordingly, a compositional library of nanomaterials of natural chemistry (halloysite (clay), chitosan, hydroxyapatite, mesoporous silica, and a composite clay-chitosan material) was screened for toxicity and for efficiency in delivering model antigens, in cellular and zebrafish embryo models, using high-throughput screening platforms. Through multi-parametric optimization, chitosan-modified halloysite (clay) nanomaterial was identified as the optimal vaccine delivery platform. Further studies conducted in juvenile seabass showed the potential of clay-chitosan for delivering the outer membrane protein of Tenacibaculum maritimum (TIMA), a pathogenic bacterium, and its efficiency in eliciting immune responses in fish. In short, as exemplified by this work, the strategy of using compositional nanomaterial libraries and profiling them biologically on high-throughput screening platforms could accelerate the discovery of nanomaterials with potential applications in food and agriculture.

Keywords: nanotechnology, fish-vaccine, drug-delivery, halloysite-chitosan

Procedia PDF Downloads 253
15070 Finite Volume Method Simulations of GaN Growth Process in MOVPE Reactor

Authors: J. Skibinski, P. Caban, T. Wejrzanowski, K. J. Kurzydlowski

Abstract:

In the present study, numerical simulations of heat and mass transfer during the gallium nitride growth process in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S are addressed. Existing knowledge about the phenomena occurring in the MOVPE process makes it possible to produce high-quality nitride-based semiconductors; however, the process parameters of MOVPE reactors can vary within certain ranges. The main goal of this study is optimization of the process and improvement of the quality of the obtained crystal. In order to investigate this subject, a series of computer simulations has been performed. Numerical simulations of heat and mass transfer in the GaN epitaxial growth process have been performed to determine the growth rate for various mass flow rates and pressures of the reagents. Since it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during the process, modeling is the only way to understand the process precisely. The main heat transfer mechanisms during the MOVPE process are convection and radiation. Correlating the modeling results with experiments makes it possible to determine the optimal process parameters for obtaining crystals of the highest quality.
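To give a flavour of the finite volume method named in the title, the sketch below solves the simplest possible relative of the reactor model: steady one-dimensional heat conduction between two fixed-temperature walls, discretized into control volumes with face fluxes balanced in each cell. Geometry, conductivity, and boundary temperatures are illustrative placeholders, not AIX-200/4RF-S data, and the real simulations also include convection, radiation, and species transport.

```python
import numpy as np

# 1D steady heat conduction by the finite volume method:
# sum of diffusive fluxes through the two faces of each cell equals zero.
n = 50                            # number of control volumes
L = 0.1                           # domain length [m] (illustrative)
k = 1.5                           # thermal conductivity [W/(m K)] (illustrative)
T_left, T_right = 1300.0, 300.0   # fixed wall temperatures [K] (illustrative)

dx = L / n
A = np.zeros((n, n))
b = np.zeros(n)
for i in range(n):
    if i > 0:                                   # west face: flux from cell i-1
        A[i, i - 1] += k / dx; A[i, i] -= k / dx
    else:                                       # west boundary: half-cell distance
        A[i, i] -= 2 * k / dx; b[i] -= 2 * k / dx * T_left
    if i < n - 1:                               # east face: flux from cell i+1
        A[i, i + 1] += k / dx; A[i, i] -= k / dx
    else:                                       # east boundary: half-cell distance
        A[i, i] -= 2 * k / dx; b[i] -= 2 * k / dx * T_right

T = np.linalg.solve(A, b)   # cell-centre temperatures
print(T[0], T[-1])          # with constant k the profile is exactly linear
```

Extending this balance to 2D/3D cells with convection and radiation source terms is conceptually the same bookkeeping, which is what a commercial FVM solver automates for the reactor geometry.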

Keywords: finite volume method, semiconductors, epitaxial growth, metalorganic vapor phase epitaxy, gallium nitride

Procedia PDF Downloads 374
15069 Synthesis, Characterization, Validation of Resistant Microbial Strains and Antimicrobial Activity of Substituted Pyrazoles

Authors: Rama Devi Kyatham, D. Ashok, K. S. K. Rao Patnaik, Raju Bathula

Abstract:

We have shown the importance of pyrazoles as antimicrobial chemical entities. These compounds have generally been considered significant due to their wide range of pharmacological activities, and their discovery motivates new avenues of research. The proposed pyrazoles were synthesized and evaluated for their antimicrobial activities, and the synthesized compounds were characterized by different spectroscopic methods.

Keywords: pyrazoles, validation, resistant microbial strains, anti-microbial activities

Procedia PDF Downloads 143
15068 OLED Encapsulation Process Using Low Melting Point Alloy and Epoxy Mixture by Instantaneous Discharge

Authors: Kyung Min Park, Cheol Hee Moon

Abstract:

In this study, we develop a sealing process using a mixture of a low melting point alloy (LMPA) and an epoxy for an atmospheric OLED sealing process, as a substitute for the thin-film process. Electrode lines were formed on the substrates, which were covered with insulating layers and sacrificial layers. A mixture of an LMPA and an epoxy was screen-printed between the two electrodes. In order to generate heat to melt the mixture, Joule heating was used, produced by an instantaneous discharge process. Experimental conditions such as voltage, time, and electrode constituents were varied to optimize the heating conditions. As a result, the mixture structure of this study showed great potential for a low-cost, low-temperature, atmospheric OLED sealing process as a substitute for the thin-film process.

Keywords: organic light emitting diode, encapsulation, low melting point alloy, joule heat

Procedia PDF Downloads 519
15067 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology

Authors: Anjian Chen, Joseph C. Chen

Abstract:

This paper studies a case in which the targeted surface roughness of a fused deposition modeling (FDM) additive manufacturing process is improved. The process is designed to reduce or eliminate defects and to improve the process capability indices Cp and Cpk of the FDM process; the baseline Cp is 0.274 and the baseline Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the additive manufacturing process and the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. The Taguchi L9 orthogonal array is used to organize the study of how the parameters (four controllable and one non-controllable) affect the FDM process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]; the non-controllable parameter is the environmental temperature [°C]. After the parameters were optimized, a confirmation part was printed to prove that the results reduce the number of defects and improve the process capability indices, raising Cp from 0.274 to 1.605 and Cpk from 0.654 to 1.233. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
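The capability indices quoted above follow directly from the spec limits and the sample statistics: Cp compares the spec width to six standard deviations, and Cpk penalizes an off-centre mean. The sketch below uses invented roughness readings and spec limits purely to show the arithmetic; the paper's own data and limits are not given in the abstract.

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp and Cpk from sample mean and (sample) standard deviation."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)               # spec width vs. process spread
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # worst one-sided margin
    return cp, cpk

# Hypothetical surface-roughness readings [µm] against hypothetical spec
# limits; illustrative numbers, not the study's measurements.
readings = [8.1, 8.3, 7.9, 8.2, 8.0, 8.4, 8.1, 8.2]
cp, cpk = process_capability(readings, lsl=7.0, usl=9.0)
print(round(cp, 3), round(cpk, 3))  # cpk < cp because the mean sits off-centre
```

A jump such as the paper's Cp 0.274 → 1.605 corresponds to the optimized parameter settings shrinking the roughness spread (and re-centring it) relative to the same spec window.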

Keywords: additive manufacturing, fused deposition modeling, surface roughness, six-sigma, Taguchi method, 3D printing

Procedia PDF Downloads 353