Search results for: thermodynamic approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14305


13315 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach distinctly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. It provides a systematic, statistically grounded validation technique and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this innovative method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. It also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
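As a minimal illustration of the tolerance-based idea (not the authors' exact DoE procedure), the sketch below computes a two-sided normal tolerance interval via Howe's approximation and checks it against an assumed ±5% design margin; the recovery values are synthetic placeholders.

```python
import numpy as np
from scipy import stats

def tolerance_interval(data, coverage=0.99, confidence=0.95):
    """Two-sided normal tolerance interval via Howe's approximation."""
    x = np.asarray(data, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, n - 1)
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
    return mean - k * sd, mean + k * sd

# Illustrative HPLC-like recoveries (% of nominal) -- synthetic numbers
recoveries = [99.1, 100.4, 98.7, 100.9, 99.6, 100.2, 99.9, 100.5]
lo, hi = tolerance_interval(recoveries)
within_margin = lo >= 95.0 and hi <= 105.0  # assumed +/-5% design margin
```

If the tolerance interval sits inside the design margin, the method's variability is acceptable at the stated coverage and confidence; otherwise the validation flags it.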

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 81
13314 A Fuzzy Decision Making Approach for Supplier Selection in Healthcare Industry

Authors: Zeynep Sener, Mehtap Dursun

Abstract:

Supplier evaluation and selection is one of the most important components of an effective supply chain management system. Due to the expanding competition in healthcare, selecting the right medical device suppliers offers great potential for increasing quality while decreasing costs. This paper proposes a fuzzy decision making approach for medical supplier selection. A real-world medical device supplier selection problem is presented to illustrate the application of the proposed decision methodology.
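The fuzzy aggregation at the heart of such a method can be sketched as follows; the suppliers, criteria, weights, and triangular ratings here are hypothetical placeholders, not the paper's data.

```python
# Hypothetical ratings of three medical-device suppliers on two criteria
# (quality, cost-effectiveness) as triangular fuzzy numbers (low, mode, high)
# on a 0-10 scale.
def tfn_scale(tfn, w):
    return tuple(w * v for v in tfn)

def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(tfn):          # centroid of a triangular fuzzy number
    return sum(tfn) / 3.0

weights = {"quality": 0.6, "cost": 0.4}
ratings = {
    "Supplier A": {"quality": (7, 8, 9), "cost": (5, 6, 7)},
    "Supplier B": {"quality": (6, 7, 8), "cost": (7, 8, 9)},
    "Supplier C": {"quality": (4, 5, 6), "cost": (8, 9, 10)},
}

scores = {}
for name, crit in ratings.items():
    total = (0.0, 0.0, 0.0)
    for c, tfn in crit.items():
        total = tfn_add(total, tfn_scale(tfn, weights[c]))
    scores[name] = defuzzify(total)

best = max(scores, key=scores.get)
```

The weighted fuzzy scores are defuzzified only at the end, so the imprecision of the expert ratings is carried through the aggregation.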

Keywords: fuzzy decision making, fuzzy multiple objective programming, medical supply chain, supplier selection

Procedia PDF Downloads 454
13313 Environmental Radioactivity Analysis by a Sequential Approach

Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab

Abstract:

Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and to assess the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty, but, unfortunately, new issues are raised by gamma-ray attenuation and self-absorption. Recently, a new method has been suggested to detect and identify, without quantification and in a short time, the gamma rays of a low-count source. Unlike conventional gamma spectrometry, this method does not require the acquisition of a pulse-height spectrum. It is based on a chronological record of each detected photon through simultaneous measurement of its energy ε and its arrival time τ at the detector, the parameter pair [ε,τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for appropriate health oversight of the public and of the workers concerned, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. For illustration, we consider the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by a Ge(HP) semiconductor junction, of the 63 keV gamma rays emitted by 234Th (progeny of 238U). The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach to environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated drawbacks. The work is still in progress.
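A toy version of the EMS idea can be sketched as a log Bayes factor accumulated event by event, assuming a flat background and a Gaussian 63 keV line; all constants (window, resolution, mixture fraction, threshold) are illustrative, not the paper's values.

```python
import math, random

random.seed(1)

E_MIN, E_MAX = 40.0, 90.0         # analysed energy window (keV), illustrative
PEAK, SIGMA = 63.0, 1.0           # 234Th line and an assumed detector resolution
MIX = 0.3                         # assumed fraction of peak events if source present

def pdf_background(e):            # flat background over the window
    return 1.0 / (E_MAX - E_MIN)

def pdf_source(e):                # background + Gaussian peak mixture
    gauss = math.exp(-0.5 * ((e - PEAK) / SIGMA) ** 2) / (SIGMA * math.sqrt(2 * math.pi))
    return (1 - MIX) * pdf_background(e) + MIX * gauss

def sequential_log_bayes(events):
    """Accumulate the log Bayes factor one detected photon at a time."""
    log_bf = 0.0
    for e in events:
        log_bf += math.log(pdf_source(e) / pdf_background(e))
    return log_bf

# Simulated EMS energies: 70% background events, 30% peak events near 63 keV
events = [random.uniform(E_MIN, E_MAX) if random.random() > 0.3
          else random.gauss(PEAK, SIGMA) for _ in range(200)]
detected = sequential_log_bayes(events) > math.log(100)  # strong-evidence threshold
```

Because the evidence is updated per event, the test can stop as soon as the threshold is crossed, which is what shortens the measurement time.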

Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method

Procedia PDF Downloads 497
13312 Sustainable Manufacturing Industries and Energy-Water Nexus Approach

Authors: Shahbaz Abbas, Lin Han Chiang Hsieh

Abstract:

Significant population growth and climate change issues have contributed to the depletion of natural resources and threaten their sustainability in the future. Manufacturing industries have a substantial impact on every country's economy, but the sustainability of industrial resources is challenging, and policymakers have been developing possible solutions to manage the sustainability of industrial resources such as raw materials, energy, water, and the industrial supply chain. To address these challenges, the nexus approach is one of the optimization and modelling techniques used in recent sustainable environmental research. The interactions between the nexus components acknowledge that all components are dependent upon each other and interrelated; therefore, their sustainability is also associated with each other. In addition, the nexus concept provides not only resource sustainability: environmental sustainability can also be achieved through the nexus approach by utilizing industrial waste as a resource for industrial processes. Based on the energy-water nexus, this study has developed a resource-energy-water nexus for the sugar industry to understand the interactions between sugarcane, energy, and water towards a sustainable sugar industry. In particular, the focus of the research is the Taiwanese sugar industry; however, the same approach can be adapted worldwide to optimize the sustainability of sugar industries. It is concluded that there are significant interactions between sugarcane, energy consumption, and water consumption in the sugar industry relevant to managing the scarcity of resources in the future. The interactions between sugarcane and energy also deliver a mechanism for reusing sugar industry waste as a source of energy, thereby supporting industrial and environmental sustainability. The desired outcomes from the nexus can be achieved with modifications to the policy and regulations of the Taiwanese industrial sector.

Keywords: energy-water nexus, environmental sustainability, industrial sustainability, natural resource management

Procedia PDF Downloads 125
13311 Integrated Modeling Approach for Energy Planning and Climate Change Mitigation Assessment in the State of Florida

Authors: K. Thakkar, C. Ghenai

Abstract:

An integrated modeling approach was used in this study to (1) track energy consumption, production, and resource extraction, (2) track greenhouse gas emissions, and (3) analyze emissions of local and regional air pollutants. The model was used in this study for short- and long-term energy and GHG emissions reduction analysis for the state of Florida. The integrated modeling methodology will help to evaluate alternative energy scenarios and examine emissions-reduction strategies. The mitigation scenarios have been designed to describe future energy strategies. They consist of various demand- and supply-side scenarios. One of the GHG mitigation scenarios is crafted by taking into account the renewable resource potential available for power generation in the state of Florida, to compare and analyze the GHG reduction measure against the 'Business As Usual' and 'Florida State Policy' scenarios. Two more 'integrated' scenarios ('Electrification' and 'Efficiency and Lifestyle') are crafted through combinations of various mitigation scenarios to assess the cumulative impact of reduction measures such as technological changes and energy efficiency and conservation.
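The scenario accounting behind such integrated models can be sketched as a demand-times-emission-factor sum per fuel; every number below is an illustrative placeholder, not Florida data or the study's factors.

```python
# Illustrative scenario accounting: emissions = demand x fuel share x factor.
# All figures are placeholders, not data from the study.
EMISSION_FACTOR = {"coal": 0.95, "natural_gas": 0.45, "solar": 0.0, "wind": 0.0}  # tCO2e/MWh

def scenario_emissions(demand_mwh, fuel_mix):
    """Annual GHG emissions for one scenario's electricity fuel mix."""
    assert abs(sum(fuel_mix.values()) - 1.0) < 1e-9  # shares must sum to one
    return sum(demand_mwh * share * EMISSION_FACTOR[fuel]
               for fuel, share in fuel_mix.items())

demand = 240e6  # MWh/year, illustrative
bau = scenario_emissions(
    demand, {"coal": 0.2, "natural_gas": 0.7, "solar": 0.05, "wind": 0.05})
renewables = scenario_emissions(
    demand, {"coal": 0.05, "natural_gas": 0.45, "solar": 0.3, "wind": 0.2})
reduction_pct = 100 * (bau - renewables) / bau
```

Stacking several such scenarios against the same demand projection is how a 'Business As Usual' baseline is compared with mitigation cases.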

Keywords: energy planning, climate change mitigation assessment, integrated modeling approach, energy alternatives, and GHG emission reductions

Procedia PDF Downloads 443
13310 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images

Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj

Abstract:

Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is the cellular concentration of ribosomes, which can be probed with fluorescently labeled nucleic acids. The fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates with the cellular level of ribosomes. The goals here are to provide computationally and statistically robust approaches for automatically quantifying cellular heterogeneity in biofilms from a large library of epifluorescence microscopy FISH images. In this work, initial steps toward these goals were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions in FISH images that are counterstained with fluorescent dyes. This methodology offers advances over other computational methods, allowing the subtraction of spurious signals and non-biological fluorescent substrata. The method is a robust, user-friendly approach that enables users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescent images for quantitative analysis of biofilm heterogeneity.
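One standard building block for this kind of automatic detection is Otsu thresholding. The sketch below applies a from-scratch Otsu implementation to a synthetic image with a bright biofilm-like patch; the paper's actual pipeline is more elaborate, so this is only a hedged illustration of the thresholding step.

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Pick the threshold that maximises between-class variance (Otsu's method)."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                       # cumulative background weight
    w1 = 1.0 - w0                           # foreground weight
    cum_mean = np.cumsum(p * centers)
    mu0 = cum_mean / np.where(w0 > 0, w0, 1)
    mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 > 0, w1, 1)
    between = w0 * w1 * (mu0 - mu1) ** 2    # between-class variance per cut point
    return centers[np.argmax(between)]

# Synthetic "FISH image": dim background plus a bright biofilm-like patch
rng = np.random.default_rng(0)
img = rng.normal(20, 5, size=(64, 64))
img[20:40, 20:40] += 100                    # bright region standing in for biofilm
t = otsu_threshold(img)
mask = img > t                              # binary biofilm mask
```

The resulting mask gives the biofilm boundary from which per-region intensity statistics can then be extracted.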

Keywords: image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization

Procedia PDF Downloads 135
13309 Sensitivity Analysis during the Optimization Process Using Genetic Algorithms

Authors: M. A. Rubio, A. Urquia

Abstract:

Genetic algorithms (GA) are applied to the solution of high-dimensional optimization problems. Additionally, sensitivity analysis (SA) is usually carried out to determine the effect of changes in the parameter values of the objective function on the optimal solutions. These two analyses (i.e., optimization and sensitivity analysis) are computationally intensive when applied to high-dimensional functions. The approach presented in this paper consists in performing the SA during the GA execution, by statistically analyzing the data obtained from running the GA. The advantage is that in this case the SA does not involve additional evaluations of the objective function and, consequently, the proposed approach requires less computational effort than conducting optimization and SA in two consecutive steps.

Keywords: optimization, sensitivity, genetic algorithms, model calibration

Procedia PDF Downloads 437
13308 A Metaheuristic for the Layout and Scheduling Problem in a Job Shop Environment

Authors: Hernández Eva Selene, Reyna Mary Carmen, Rivera Héctor, Barragán Irving

Abstract:

We propose an approach that jointly addresses the layout of a facility and the scheduling of a sequence of jobs. In real production, these two problems are interrelated; however, they are treated separately in the literature. Our approach is an extension of the job shop problem with transportation delay, where the location of each machine is selected among possible sites. The model minimizes the makespan using the shortest processing time (SPT) rule with two algorithms: the first considers all permutations of the machine locations, while the second uses a heuristic to select only specific permutations, which reduces the computational time. Several instances are solved and compared with results from the literature.
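A deliberately simplified sketch of the joint layout-and-scheduling search is shown below: machine contention is ignored, so the per-job route time is only a makespan surrogate, and the instance (distances, routes, processing times) is hypothetical.

```python
from itertools import permutations

# Hypothetical instance: 3 machines to place on 3 sites; DIST is the
# transport delay (time units) between sites.
DIST = [[0, 2, 5],
        [2, 0, 3],
        [5, 3, 0]]

# Each job: ordered (machine, processing_time) operations, as in a job shop route.
JOBS = [
    [(0, 4), (1, 3), (2, 2)],
    [(1, 2), (0, 5), (2, 4)],
    [(2, 3), (1, 4), (0, 1)],
]

def route_time(job, site_of):
    """Processing plus transport along one job's route (machine contention ignored)."""
    t, prev_site = 0, None
    for machine, proc in job:
        site = site_of[machine]
        if prev_site is not None:
            t += DIST[prev_site][site]
        t += proc
        prev_site = site
    return t

best_layout, best_span = None, float("inf")
for layout in permutations(range(3)):          # layout[machine] = assigned site
    span = max(route_time(job, layout) for job in JOBS)  # simplified makespan bound
    if span < best_span:
        best_layout, best_span = layout, span
```

Exhaustively scoring every machine-to-site permutation mirrors the first algorithm in the abstract; the heuristic variant would score only a subset of these permutations.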

Keywords: layout problem, job shop scheduling problem, concurrent scheduling and layout problem, metaheuristic

Procedia PDF Downloads 610
13307 Impact of Curvatures in the Dike Line on Wave Run-up and Wave Overtopping, ConDike-Project

Authors: Malte Schilling, Mahmoud M. Rabah, Sven Liebisch

Abstract:

Wave run-up and overtopping are the relevant parameters for dimensioning the crest height of dikes. Various experimental as well as numerical studies have investigated these parameters under different boundary conditions (e.g. wave conditions, structure type). Particularly for dike design in Europe, a common approach is formulated in which wave and structure properties are parameterized. This approach, however, assumes equal run-up heights and overtopping discharges along the longitudinal axis, whereas convex dikes have a heterogeneous crest by definition. Hence, local differences in a convex dike line are expected to cause wave-structure interactions different from those at a straight dike. This study aims to assess both run-up and overtopping at convexly curved dikes. To cast light on the relevance of curved dikes for the design approach mentioned above, physical model tests were conducted in a 3D wave basin of the Ludwig-Franzius-Institute Hannover. A dike with a slope of 1:6 (height over length) was tested under both regular waves and TMA wave spectra. Significant wave heights ranged from 7 to 10 cm and peak periods from 1.06 to 1.79 s. Both run-up and overtopping were assessed behind the curved and straight sections of the dike, and both measurements were compared with those from a straight dike line. It was observed that convex curvatures in the longitudinal dike line cause a redirection of incident waves, leading to a concentration around the center point. Measurements prove that both run-up heights and overtopping rates are higher than on the straight dike. It can be concluded that deviations from a straight longitudinal dike line have an impact on design parameters and imply uncertainties within the design approach in force. It is therefore recommended to consider these influencing factors in such cases.

Keywords: convex dike, longitudinal curvature, overtopping, run-up

Procedia PDF Downloads 293
13306 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis

Authors: Alexander A. Tokmakov

Abstract:

Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the protein synthesis systems currently in use, and the factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases, making possible comprehensive bioinformatics analysis and the identification of multiple features associated with successful cell-free expression. Here, we describe an approach aimed at identifying the physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and we highlight the major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data, calculation and prediction of multiple properties of the expressed amino acid sequences, correlation of the individual properties with the expression scores, and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, and the presence of signal sequences, are mostly related to protein solubility, whereas others, such as protein length, number of disulfide bonds, and content of secondary structure, affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide a plethora of important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
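One of the simplest correlations of this kind, between a numeric sequence property and a binary expression outcome, is the point-biserial coefficient; the records below (protein length vs. expression success) are hypothetical, purely to show the calculation.

```python
import math

# Hypothetical expression log: (protein length, expressed successfully?)
records = [(120, True), (310, True), (450, False), (95, True), (620, False),
           (210, True), (530, False), (380, False), (150, True), (700, False)]

def point_biserial(pairs):
    """Correlation between a numeric feature and a binary success flag."""
    xs = [x for x, _ in pairs]
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)  # population sd
    ones = [x for x, ok in pairs if ok]
    zeros = [x for x, ok in pairs if not ok]
    p, q = len(ones) / n, len(zeros) / n
    return (sum(ones) / len(ones) - sum(zeros) / len(zeros)) / sd * math.sqrt(p * q)

r = point_biserial(records)   # negative here: longer sequences express less often
```

Repeating this for every calculated property, together with a significance test, yields the kind of feature ranking the abstract describes.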

Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins

Procedia PDF Downloads 419
13305 Investigate and Control Thermal Spectra in Nanostructures and 2D Van der Waals Materials

Authors: Joon Sang Kang, Ming Ke, Yongjie Hu

Abstract:

Controlling the heat transfer and thermal properties of materials is important to many fields, such as energy efficiency and the thermal management of integrated circuits. Significant progress has been made over the past decade to improve material performance through structuring at the nanoscale; however, a clear relationship between structure dimensions, interfaces, and thermal properties remains to be established. The main challenge comes from the unknown intrinsic spectral contributions of different phonons. Here, we describe our current progress on quantifying and controlling thermal spectra based on our recently developed technical approach using ultrafast optical spectroscopy. Our work brings further the promise of rational material design to achieve high performance through a synergistic experimental-modeling approach. This approach can be broadly applicable to a wide range of materials and energy systems. In particular, we demonstrate in-situ characterization and tunable thermal properties of 2D van der Waals materials through ionic intercalation. The significant impact of this research in improving the efficiency of thermal energy conversion and management will also be illustrated.

Keywords: energy, mean free path, nanoscale heat transfer, nanostructure, phonons, TDTR, thermoelectrics, 2D materials

Procedia PDF Downloads 288
13304 Fiscal Stability Indicators and Public Debt Trajectory in Croatia

Authors: Hrvoje Simovic

Abstract:

This paper analyses the key problems of fiscal sustainability in Croatia. To point out the key challenges, public debt sustainability is analyzed using standard indicators of fiscal stability, accompanied by the identification of regime changes in the public debt trajectory using a switching regression approach. The analysis is conducted for the period from 2001 to 2016. Results show strong vulnerability in the recession period (2009-14), so the key challenges for current fiscal policy and public debt management are recognized as maturity prolongation, interest rate trends, and credit rating expectations.

Keywords: fiscal sustainability, public debt, Croatia, budget deficit

Procedia PDF Downloads 261
13303 An Inverse Heat Transfer Algorithm for Predicting the Thermal Properties of Tumors during Cryosurgery

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study aimed at developing an inverse heat transfer approach for predicting the time-varying freezing front and the temperature distribution of tumors during cryosurgery. Using a temperature probe pressed against the tumor layer, the inverse approach is able to predict simultaneously the metabolic heat generation and the blood perfusion rate of the tumor. Once these parameters are predicted, the temperature field and the time-varying freezing fronts are determined with the direct model. The direct model rests on the one-dimensional Pennes bioheat equation, and the phase change problem is handled with the enthalpy method. The Levenberg-Marquardt Method (LMM), combined with the Broyden Method (BM), is used to solve the inverse model. The effects (a) of the thermal properties of the diseased tissues, (b) of the initial guesses for the unknown thermal properties, (c) of the data capture frequency, and (d) of the noise in the recorded temperatures are examined. It is shown that the proposed inverse approach remains accurate for all the cases investigated.
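The inverse step can be sketched as a Levenberg-Marquardt least-squares fit of two unknown parameters to probe temperatures. Note the forward model below is a simple lumped stand-in, not the authors' Pennes/enthalpy solver, and SciPy's LM implementation stands in for the LMM-BM combination; all numbers are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

# Stand-in forward model: a lumped bioheat-like response in which the two
# unknowns play the roles of metabolic heat (q) and perfusion (w).
def forward(params, t):
    q, w = params
    return q / w * (1 - np.exp(-w * t))

t_obs = np.linspace(0.1, 10, 25)
true_params = (2.0, 0.5)
rng = np.random.default_rng(3)
temps = forward(true_params, t_obs) + rng.normal(0, 0.01, t_obs.size)  # noisy probe data

def residuals(p):
    return forward(p, t_obs) - temps

fit = least_squares(residuals, x0=[1.5, 0.8], method="lm")  # Levenberg-Marquardt
q_est, w_est = fit.x
```

The same pattern, with the Pennes/enthalpy solver as the forward model, recovers the metabolic heat generation and blood perfusion rate from the recorded probe temperatures.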

Keywords: cryosurgery, inverse heat transfer, Levenberg-Marquardt method, thermal properties, Pennes model, enthalpy method

Procedia PDF Downloads 201
13302 Single Valued Neutrosophic Hesitant Fuzzy Rough Set and Its Application

Authors: K. M. Alsager, N. O. Alshehri

Abstract:

In this paper, we propose the notion of the single valued neutrosophic hesitant fuzzy rough set, obtained by combining the single valued neutrosophic hesitant fuzzy set with the rough set. This combination is a powerful tool for dealing with the uncertainty, granularity, and incompleteness of knowledge in information systems. We present both the definition and some basic properties of the proposed model. Finally, we give a general approach that is applied to a decision making problem in disease diagnosis, and we demonstrate the effectiveness of the approach with a numerical example.
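A minimal sketch of the decision step is shown below, assuming one commonly used score function for single valued neutrosophic triples and simple averaging over hesitant values; the patients, criteria, and ratings are invented for illustration, not the paper's example.

```python
# Hypothetical disease-diagnosis ratings: each criterion evaluation is a hesitant
# set of single valued neutrosophic triples (truth, indeterminacy, falsity).
patients = {
    "P1": [[(0.7, 0.2, 0.1), (0.6, 0.3, 0.2)], [(0.8, 0.1, 0.1)]],
    "P2": [[(0.4, 0.5, 0.3)], [(0.5, 0.4, 0.4), (0.3, 0.5, 0.5)]],
}

def svn_score(t, i, f):
    """One common score function: rewards truth, penalises indeterminacy/falsity."""
    return (2 + t - i - f) / 3

def patient_score(evals):
    # Average the hesitant values within each criterion, then average criteria.
    per_criterion = [
        sum(svn_score(*triple) for triple in hesitant) / len(hesitant)
        for hesitant in evals
    ]
    return sum(per_criterion) / len(per_criterion)

ranking = sorted(patients, key=lambda p: patient_score(patients[p]), reverse=True)
```

Ranking the aggregated scores orders the alternatives, which is the final step of the decision making procedure.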

Keywords: single valued neutrosophic fuzzy set, single valued neutrosophic fuzzy hesitant set, rough set, single valued neutrosophic hesitant fuzzy rough set

Procedia PDF Downloads 277
13301 Brand Identity Creation for Thai Halal Brands

Authors: Pibool Waijittragum

Abstract:

The purpose of this paper is to synthesize research results on the brand identities of Thai Halal brands as they relate to the way of life of Thai Muslims. The results are transformed into packaging and label designs for Thai Halal brands. The expected benefit is an alternative marketing strategy for the brand building process for Halal products in Thailand. Four elements of marketing strategy necessary for brand identity creation form the research framework: Attributes, Benefits, Values, and Personality. The research methodology combined qualitative and quantitative approaches: 19 marketing experts with dynamic roles in Thai consumer products were interviewed, and a field survey of 122 Thai Muslims selected from 175 Muslim communities in Bangkok was conducted. Data were analyzed according to 5 categories of Thai Halal products: 1) Meat; 2) Vegetables and Fruits; 3) Instant foods and Garnishing ingredients; 4) Beverages, Desserts and Snacks; 5) Hygienic daily products. The results suggest suitable approaches for the brand identities of Thai Halal brands: 1) the Benefit approach, presenting the characteristics of the product together with its benefit; the resulting brand identity, transformed into the packaging design, should be clear and display a fresh product; 2) the Value approach, presenting the value of products that affects consumers' perception; the resulting packaging design should look simple and use a trustworthy image; 3) the Personality approach, reflecting consumers' thought; the resulting packaging design should look sincere, enjoyable, merry, and flamboyant, and use a humorous image.

Keywords: marketing strategies, brand identity, packaging and label design, Thai Halal products

Procedia PDF Downloads 437
13300 Multidimensional Approach to Analyse the Environmental Impacts of Mobility

Authors: Andras Gyorfi, Andras Torma, Adrienn Buruzs

Abstract:

Mobility has evolved into a defining field of science. This continuously developing segment involves a variety of affected areas, including the public and economic sectors. Besides the changes in mobility, the state of the environment has also changed in the last period. Alternative mobility, as a separate category, and the idea of its widespread application constitute such a new field that it needs to be studied more deeply. Alternative mobility implies finding new types of propulsion, using innovative kinds of power and energy resources, and revolutionizing the approach to vehicular control. Including new resources and excluding others has such a complex effect that it cannot be unequivocally confirmed by today's scientific achievements. Changes in specific parameters will most likely reduce the environmental impacts; however, the production of new substances, or even their subtraction from the system, will probably cause an energy deficit as well. The aim of this research is to elaborate the environmental impact matrix of alternative mobility, to recognize the factors that are yet unknown, to analyse them, to look for alternative solutions, and to conclude all the above in a coherent system. To this end, we model the effects and the dynamics of the system with a method called the 'system of systems' (SoS) method. Part of the research process is to examine the impacts on the environment and to decide whether the newly developed versions of alternative mobility affect the environmental state. As a final result, a complex approach will be available that can supplement current scientific studies. By using the SoS approach, we create a frame of reference containing elements within which we also examine the interactions. In this way, a flexible and modular model can be established that supports the prioritization of effects and the deeper analysis of the complex system.

Keywords: environment, alternative mobility, complex model, element analysis, multidimensional map

Procedia PDF Downloads 327
13299 Mentoring Writing Skills: A Classroom Friendly Approach

Authors: Pradeep Kumar Sahoo

Abstract:

Facilitating writing skills among young techies seems a bit challenging, and various factors may contribute to this difficulty. An inappropriate syllabus, inadequate infrastructure, to some extent untrained faculty members, and above all the background of learners may be treated as the components that make the process challenging. To make the classroom friendly to writing-skill development, the items in focus will have to differ from those of the present-day traditional classroom situation. This paper focuses on multiple contemporary strategies for approaching the wide range of typical problems that writers face in a specific technical university of Odisha.

Keywords: background of learners, classroom friendly approach, inappropriate syllabus, traditional classroom situation

Procedia PDF Downloads 337
13298 Prediction of Binding Free Energies for Dyes Removal Using Computational Chemistry

Authors: R. Chanajaree, D. Luanwiset, K. Pongpratea

Abstract:

Dye removal is an environmental concern because the textile industries have been growing with world population and industrialization. Adsorption is a technique for finding adsorbents to remove dyes from wastewater; it is a low-cost and effective method for dye removal. This work aims to develop effective adsorbents using a computational approach, because such an approach can predict the suitability of adsorbents for specific dyes in terms of binding free energies. The computational approach is faster and cheaper than the experimental approach for finding the best adsorbents. All starting structures of dyes and adsorbents are optimized by quantum calculation. The complexes between dyes and adsorbents are generated by the docking method. The binding free energies obtained from docking are compared with binding free energies from the experimental data. The calculated energies rank in the same order as the experimental results. In addition, this work also shows the possible orientations of the complexes. Two experimental groups of dye-adsorbent complexes are used. In the first group, the adsorbent is chitosan and the two dyes are reactive red (RR) and direct sun yellow (DY). In the second group, the adsorbent is poly(1,2-epoxy-3-phenoxy) propane (PEPP), and the two dyes are bromocresol green (BCG) and alizarin yellow (AY).
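The rank agreement between computed and experimental binding energies can be checked with Spearman's coefficient; the energy values below are illustrative placeholders, not the paper's results, and the complex labels merely echo the dye/adsorbent pairs named in the abstract.

```python
# Hypothetical binding free energies (kcal/mol): docking vs experiment for the
# four dye/adsorbent complexes discussed (values are illustrative).
docking =    {"RR-chitosan": -9.1, "DY-chitosan": -7.8, "BCG-PEPP": -6.5, "AY-PEPP": -5.2}
experiment = {"RR-chitosan": -8.4, "DY-chitosan": -7.9, "BCG-PEPP": -6.9, "AY-PEPP": -5.8}

def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(a, b):
    """Spearman rho via the classic d^2 formula (no ties assumed)."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1 - 6 * d2 / (n * (n * n - 1))

names = sorted(docking)
rho = spearman([docking[k] for k in names], [experiment[k] for k in names])
```

A rho of 1 means the docking scores reproduce the experimental ordering exactly, which is the claim being tested when calculated energies are said to "rank the same" as experiment.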

Keywords: dyes removal, binding free energies, quantum calculation, docking

Procedia PDF Downloads 155
13297 Recovery of Selenium from Scrubber Sludge in Copper Process

Authors: Lakshmikanth Reddy, Bhavin Desai, Chandrakala Kari, Sanjay Sarkar, Pradeep Binu

Abstract:

The sulphur dioxide gases generated as a by-product of the smelting and converting operations on copper concentrate contain selenium, along with zinc, lead, copper, cadmium, bismuth, antimony, and arsenic. The gaseous stream is treated in a waste heat boiler, an electrostatic precipitator, and scrubbers to remove coarse particulate matter in order to produce commercial grade sulfuric acid. The gas cleaning section of the acid plant uses water to scrub the smelting gases. The sludge that settled at the bottom of the scrubber after scrubbing was analyzed in the present investigation and found to contain 30 to 40 wt% copper and up to 40 wt% selenium. The sludge collected during blow-down is directly recycled to the smelter for copper recovery. However, the selenium is expected to vaporize again, owing to the high oxidation potential during smelting and converting, causing an accumulation of selenium in the sludge. In the present investigation, a roasting process has been developed to recover the selenium from the sludge before copper recovery at the smelter. Selenium is associated with copper in the sludge as copper selenide, as determined by X-ray diffraction and electron microscopy. The thermodynamic and thermogravimetric study revealed that the copper selenide phase present in the sludge was amenable to oxidation at 600°C, forming oxides of copper and selenium (Cu-Se-O). However, the dissociation of selenium from the copper oxide was made possible by sulfation with sulfur dioxide between 450 and 600°C, resulting in the formation of CuSO₄ (s) and SeO₂ (g). Lab scale trials were carried out in a vertical tubular furnace to determine the optimum roasting conditions with respect to roasting time, temperature, and the molar ratio of O₂:SO₂. Using these optimum conditions, up to 90 wt% of the selenium, in the form of SeO₂ vapors, could be recovered from the sludge in a large-scale commercial roaster. The roasted sludge, free from selenium and containing oxides and sulfates of copper, can now be recycled to the smelter for copper recovery.

Keywords: copper, selenium, copper selenide, sludge, roasting, SeO₂

Procedia PDF Downloads 206
13296 Managing Core Competencies in Innovative Entrepreneurship: Theory and Practice

Authors: Olga Shvetsova

Abstract:

This research paper addresses various issues of competence management in innovation companies. The theoretical bases of human resources management and the practical issues of innovative enterprises' competitiveness are considered. The research is focused on modern problems of innovative enterprise management, in particular the effective management of the personnel of innovative enterprises on the basis of the competence approach. The concept of the core competence approach is discussed, along with the view that a company's key competences create competitive advantages, support strategy development, and protect the business from negative external factors. The methodology used is background research.

Keywords: competence model, competitiveness, innovation management, implementation

Procedia PDF Downloads 318
13295 Analyzing Defects with Failure Assessment Diagrams of Gas Pipelines

Authors: Alfred Hasanaj, Ardit Gjeta, Miranda Kullolli

Abstract:

Defects in pipelines are analyzed here using Failure Assessment Diagrams (FAD), methods of analysis that have been considerably extended in recent years. The approach is used to identify and assess defects that commonly occur in gas pipes, such as corrosion defects, gouge defects, and combined defects in which gouges and dents occur together. Several of these defects are analyzed in this paper; our main focus is the fracture of cast iron pipes and the elastic-plastic failure and plastic collapse of X52 steel pipes for gas transport. The probability of such defects must be calculated in order to predict and avoid costly failures.
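A minimal sketch of a FAD check follows, assuming the Option 1 failure assessment line used in codes such as BS 7910 / R6 (the abstract does not state which assessment level the authors apply):

```python
import math

def fad_limit(lr):
    """Option 1 failure assessment line Kr = f(Lr), as given in
    BS 7910 / R6 (an assumption; the paper's exact FAD is not stated)."""
    return (1 - 0.14 * lr ** 2) * (0.3 + 0.7 * math.exp(-0.65 * lr ** 6))

def is_acceptable(kr, lr, lr_max=1.0):
    """A defect's assessment point (Lr, Kr) is acceptable if it lies
    inside the FAD envelope and below the plastic-collapse cutoff lr_max."""
    return lr <= lr_max and kr <= fad_limit(lr)
```

For an X52 steel pipe, `lr_max` would normally be the flow-stress-to-yield-stress ratio of the material rather than the 1.0 placeholder used here.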

Keywords: defects, failure assessment diagrams, steel pipes, safety factor

Procedia PDF Downloads 446
13294 Conceptual Model for Knowledge Sharing Model in Creating Idea for Mobile Application

Authors: Hanafizan Hussain

Abstract:

This study describes several projects conducted in a workshop that used a conceptual knowledge sharing model to create ideas for mobile applications. Idea sharing took place through a collaborative activity in which a group drawn from different fields sought to define a mobile application, leading to a new media approach based on a social media platform. The collaborative activity was implemented as a one-day workshop to determine the approach to a given theme, and the activity then continued for four weeks while the participants prepared for a pitch-day workshop. This paper presents the pitched ideas, including the interfaces and prototypes of the proposed products. The collaboration between members from different fields of study, including a group of young designers defining their own knowledge sharing model, shows that social media influenced the knowledge sharing model and the resulting creations and innovations.

Keywords: mobile application, collaborative activity, conceptual knowledge sharing model, social media platform

Procedia PDF Downloads 143
13293 Artificial Intelligence-Generated Previews of Hyaluronic Acid-Based Treatments

Authors: Ciro Cursio, Giulia Cursio, Pio Luigi Cursio, Luigi Cursio

Abstract:

Communication between practitioner and patient is of the utmost importance in aesthetic medicine: today, images of previous treatments are the most common tool doctors use to describe and anticipate future results for their patients. However, using photos of other people often reduces the engagement of the prospective patient and is further limited by the number and quality of pictures available to the practitioner. Pre-existing work addresses this issue in two ways: 3D scanning of the area followed by manual editing of the 3D model by the doctor, or automatic prediction of the treatment by warping the image with hand-written parameters. The first approach requires the doctor's manual intervention, while the second often produces unrealistic results. Thus, in one case significant manual work is required of the doctor, and in the other the prediction looks artificial. We propose an AI-based algorithm that autonomously generates a realistic prediction of treatment results. For the purpose of this study, we focus on hyaluronic acid treatments in the facial area. Our approach takes into account the individual characteristics of each face; furthermore, the prediction system allows patients to decide which area of the face they want to modify. We show that the predictions generated by our system are realistic: first, the quality of the generated images is on par with real images; second, the predictions match the actual results obtained after the treatment is completed. In conclusion, the proposed approach provides a valid tool for doctors to show patients what they will look like before they decide on the treatment.

Keywords: prediction, hyaluronic acid, treatment, artificial intelligence

Procedia PDF Downloads 116
13292 Thick Data Analytics for Learning Cataract Severity: A Triplet Loss Siamese Neural Network Model

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Diagnosing cataract severity is an important factor in deciding whether to undertake surgery. It is usually done by an ophthalmologist, either directly or by examining a variety of fundus photographs. This paper investigates a Siamese neural network that can be trained with small anchor samples to score cataract severity. The model is based on a triplet loss function that encodes the ophthalmologist's expertise in rating positive and negative anchors on a specific cataract scaling system. This approach, which captures the ophthalmologist's heuristics, is generally called the thick data approach: a kind of machine learning that learns from only a few examples. Clinical Relevance: The lens of the eye is made up mostly of water and proteins. A cataract occurs when these proteins start to clump together and block light, causing impaired vision. This research aims to employ thick data machine learning techniques to rate cataract severity using a Siamese neural network.
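The triplet loss at the core of such a model can be sketched as follows; this is a generic pure-Python version operating on embedding vectors, not the authors' actual network code:

```python
def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: pull the anchor embedding toward the
    positive sample (same severity grade) and push it away from the
    negative sample, by at least `margin` in squared distance."""
    sq_dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    return max(0.0, sq_dist(anchor, positive)
                    - sq_dist(anchor, negative) + margin)
```

In training, the anchor, positive, and negative vectors would be embeddings produced by the shared Siamese branches for ophthalmologist-rated fundus images.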

Keywords: thick data analytics, siamese neural network, triplet-loss model, few shot learning

Procedia PDF Downloads 113
13291 A Polynomial Approach for a Graphical-Based Integrated Production and Transport Scheduling with Capacity Restrictions

Authors: M. Ndeley

Abstract:

The performance of global manufacturing supply chains depends on the interaction of production and transport processes. Currently, these processes are scheduled separately, without considering mutual requirements, which leads to suboptimal solutions. Integrated scheduling of both processes enables the improvement of supply chain performance. The integrated production and transport scheduling problem (PTSP) is NP-hard, so heuristic methods are necessary to solve large problem instances efficiently, as in the case of global manufacturing supply chains. This paper presents a heuristic scheduling approach that integrates flexible production processes with intermodal transport, incorporating flexible land transport. The method is based on a graph that allows the PTSP to be reformulated as a shortest path problem for each job, which can be solved in polynomial time. The proposed method is applied to a supply chain scenario with a manufacturing facility in South Africa and shipments of finished product to customers within the country. The results show that the approach is suitable for scheduling large-scale problems and can be adapted flexibly to different scenarios.
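The per-job shortest-path reduction can be illustrated with a standard Dijkstra search; the graph below (plant → port → customer) and its costs are invented for illustration and are not taken from the paper's scenario data:

```python
import heapq

def shortest_path_cost(graph, source, target):
    """Dijkstra's algorithm on a directed weighted graph given as
    {node: [(neighbor, cost), ...]}. In the paper's approach, each job's
    integrated production-plus-transport schedule reduces to such a
    shortest-path query, solvable in polynomial time."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# Illustrative routing choices for one job: two ports, different costs.
g = {
    "plant": [("port_A", 4.0), ("port_B", 2.0)],
    "port_A": [("customer", 1.0)],
    "port_B": [("customer", 5.0)],
}
```

Here the cheapest schedule routes the job via `port_A` at a total cost of 5.0; a real instance would use time-expanded nodes encoding production slots and transport departures.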

Keywords: production and transport scheduling problem, graph based scheduling, integrated scheduling

Procedia PDF Downloads 475
13290 Multi-Criteria Decision Approach to Performance Measurement Techniques Data Envelopment Analysis: Case Study of Kerman City’s Parks

Authors: Ali A. Abdollahi

Abstract:

Over the last several decades, scientists have consistently applied multiple criteria decision-making methods to decisions about multi-faceted, complicated subjects. To achieve more accurate evaluations, they regularly use a variety of criteria instead of a single optimum evaluation criterion. The method presented here uses both quantity and quality to assess the function of the multiple-criteria method. Applying data envelopment analysis (DEA), weighted aggregated sum product assessment (WASPAS), the weighted sum approach (WSA), the analytic network process (ANP), and the Charnes-Cooper-Rhodes (CCR) method, we analyzed thirteen parks in Kerman city. The analysis indicates that WASPAS and WSA are compatible with each other but deviate considerably from DEA, and that the results of the CCR technique do not match those of DEA. Our study indicates that the ANP method, with an average rate of 1.51, ranks closest to the DEA method, which has an average rate of 1.49.
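As an illustration of one of the techniques compared, a minimal WASPAS computation for benefit criteria is sketched below with invented numbers (the paper's park data are not reproduced):

```python
def waspas(matrix, weights, lam=0.5):
    """WASPAS score per alternative: Q = lam * WSM + (1 - lam) * WPM,
    i.e. a blend of the weighted sum and weighted product models.
    Assumes all criteria are benefit criteria (higher is better)."""
    cols = list(zip(*matrix))
    # normalize each criterion by its column maximum
    norm = [[x / max(col) for x, col in zip(row, cols)] for row in matrix]
    scores = []
    for row in norm:
        wsm = sum(w * x for w, x in zip(weights, row))
        wpm = 1.0
        for w, x in zip(weights, row):
            wpm *= x ** w
        scores.append(lam * wsm + (1 - lam) * wpm)
    return scores
```

Ranking the alternatives by their Q scores gives the WASPAS ordering that the paper compares against DEA, WSA, ANP, and CCR.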

Keywords: multiple criteria decision making, data envelopment analysis (DEA), Charnes-Cooper-Rhodes (CCR), weighted sum approach (WSA)

Procedia PDF Downloads 221
13289 Decision-Making Process and Its Method: Effective Usage Strategies

Authors: Kubra Korkmaz Onat

Abstract:

Decision-making significantly influences outcomes and shapes future actions, making it a crucial aspect of both personal and professional life. This study examines various decision-making approaches, focusing on their procedures and applications. The rational decision-making model is highlighted for its systematic approach and reliance on data analysis and logical reasoning. Additionally, the study explores consensus, weighted scoring, voting, and brainstorming methods. Key findings indicate that each method has unique strengths and is best suited to specific contexts. The article concludes with practical guidance on how to choose the appropriate decision-making approach based on the circumstances.
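The weighted scoring method mentioned above can be sketched in a few lines; the criteria, weights, and ratings below are invented for illustration:

```python
def weighted_score(options, weights):
    """Weighted scoring: each option is rated per criterion, ratings are
    multiplied by the criterion weights, and the highest total wins.
    Returns (best option name, totals per option)."""
    totals = {name: sum(w * r for w, r in zip(weights, ratings))
              for name, ratings in options.items()}
    return max(totals, key=totals.get), totals

# Hypothetical choice between two vendors, rated 1-5 on three criteria.
options = {
    "vendor_A": [4, 3, 5],   # cost, quality, support
    "vendor_B": [5, 2, 3],
}
weights = [0.5, 0.3, 0.2]    # criterion weights summing to 1
```

With these numbers, `vendor_A` wins with a total of 3.9 against 3.7, showing how explicit weights make the trade-offs of a group decision auditable.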

Keywords: decision-making, decision-making process, decision-making methods, group decision-making

Procedia PDF Downloads 8
13288 Sharing Experience in Authentic Learning for Mobile Security

Authors: Kai Qian, Lixin Tao

Abstract:

Mobile devices such as smartphones are becoming more and more popular in our daily lives, and security vulnerabilities and threat attacks have become an important emerging research and education topic in the computing security discipline. There is a need for an innovative, hands-on mobile security laboratory that gives students real-world-relevant experience in mobile threat analysis and protection. This paper presents an authentic teaching and learning approach to mobile security on smartphone devices that covers the most important mobile threats across most aspects of mobile security. Each lab focuses on one type of mobile threat, such as the mobile messaging threat, and conveys threat analysis and protection in multiple ways, including lectures and tutorials, multimedia or app-based demonstrations for threat analysis, and mobile app development for threat protection. This authentic learning approach is affordable and easily adoptable, and it immerses students in a real-world-relevant learning environment with real devices. The approach can also be applied to many other mobile-related courses, such as mobile Java programming, databases, and networks, and to any security-relevant course, so that students can learn concepts and principles better through hands-on authentic learning experience.

Keywords: mobile computing, Android, network, security, labware

Procedia PDF Downloads 407
13287 Methods of Improving Production Processes Based on Deming Cycle

Authors: Daniel Tochwin

Abstract:

Continuous improvement is an essential part of effective process performance management. In order to achieve continuous quality improvement, each organization must select the appropriate tools and techniques. The basic condition for success is a proper understanding of the business need faced by the company and the selection of appropriate methods to improve a given production process. The main aim of this article is to analyze the methods of conduct that are popular in practice when implementing process improvements and to determine whether the examined methods share a repeatable systematic approach, i.e., a similar sequence of the same or similar actions. Based on an extensive literature review, four methods of continuous improvement of production processes were selected: the A3 report, Gemba Kaizen, the PDCA cycle, and the Deming cycle. The research shows that all of the frequently used improvement methods are generally based on the PDCA cycle; the differences arise from "(re)interpretation" and the need to adapt the continuous improvement approach to the specific business process.

Keywords: continuous improvement, lean methods, process improvement, PDCA

Procedia PDF Downloads 80
13286 A Cohort and Empirical Based Multivariate Mortality Model

Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong

Abstract:

This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables in the equation system. The model not only provides fruitful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean squared errors in both countries than the models in the literature.
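The ten-year out-of-sample comparison reduces to a mean squared error computation of the following form; the series below are illustrative placeholders, not the US/UK data:

```python
def mse(actual, forecast):
    """Mean squared error over a holdout window, as used for the
    ten-year out-of-sample model comparison described above."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical holdout mortality rates and two candidate forecasts.
actual  = [0.010, 0.011, 0.012, 0.013, 0.014]
model_a = [0.010, 0.012, 0.012, 0.014, 0.014]   # hypothetical CAP-style fit
model_b = [0.012, 0.013, 0.014, 0.015, 0.016]   # hypothetical benchmark
```

A model is preferred when its holdout MSE is smaller; in this toy example `model_a` would win.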

Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management

Procedia PDF Downloads 56