Search results for: optimization framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7871

6101 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification

Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang

Abstract:

This paper focuses on the classification task of breast ultrasound images and conducts research on the reliability measurement of classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned and doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time augmentation. The effectiveness of this reliability evaluation framework was verified on the breast ultrasound clinical dataset YBUS, and its robustness was verified on the public dataset BUSI. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
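
The two quantities reported above, expected calibration error (ECE) and a test-time-augmentation uncertainty estimate, can be sketched as below. This is illustrative only and not the authors' code; the predict_proba and augment callables are hypothetical placeholders.

    import numpy as np

    def expected_calibration_error(confidences, correct, n_bins=10):
        """Bin predictions by confidence and average |accuracy - confidence| per bin."""
        confidences = np.asarray(confidences, dtype=float)
        correct = np.asarray(correct, dtype=float)
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (confidences > lo) & (confidences <= hi)
            if mask.any():
                gap = abs(correct[mask].mean() - confidences[mask].mean())
                ece += mask.mean() * gap
        return ece

    def tta_uncertainty(predict_proba, image, augment, n_aug=20):
        """Predictive mean and spread over augmented copies of one image (test-time augmentation)."""
        probs = np.stack([predict_proba(augment(image)) for _ in range(n_aug)])
        return probs.mean(axis=0), probs.std(axis=0)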

Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI

Procedia PDF Downloads 76
6100 CRYPTO COPYCAT: A Fashion Centric Blockchain Framework for Eliminating Fashion Infringement

Authors: Magdi Elmessiry, Adel Elmessiry

Abstract:

The fashion industry represents a significant portion of the global gross domestic product; however, it is plagued by cheap imitators whose trademark infringement destroys the industry's hard work and investment. While the copycats are eventually found and stopped, by then the damage has already been done: sales are lost, and direct and indirect jobs disappear. The infringer thrives on two main facts: the time it takes to discover them and the lack of tracking technologies that can help the consumer distinguish genuine products from fakes. Blockchain is an emerging technology that provides a distributed, encrypted, immutable, and fault-resistant ledger, and it is therefore well suited to addressing the infringement epidemic facing the fashion industry. The significance of the study is that a new approach leveraging state-of-the-art blockchain technology coupled with artificial intelligence is used to create a framework addressing the fashion infringement problem. It shifts the current focus on legal enforcement, which is difficult at best, to consumer awareness, which is far more effective. The framework, Crypto CopyCat, creates an immutable digital asset representing the actual product to empower the customer with a near real-time query system. This combination emphasizes the consumer's awareness and appreciation of the product's authenticity, while providing real-time feedback to the producer regarding fake replicas. The main findings of this study are that implementing this approach can delay the fake product's penetration of the original product's market, thus giving the original product time to take advantage of the market. The shift in fake adoption results in reduced returns, which impedes the copycat market and moves the emphasis to original product innovation.
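
The core mechanism described, an immutable digital asset per product with a near real-time consumer query, can be illustrated with a minimal hash-chained ledger. This is a toy sketch of the general idea, not the Crypto CopyCat implementation (which the abstract does not specify); all class and field names are hypothetical.

    import hashlib, json, time

    class ProductLedger:
        """Toy append-only, hash-chained ledger of product records (illustrative only)."""
        def __init__(self):
            self.chain = [{"index": 0, "record": "genesis", "prev_hash": "0" * 64}]
            self.chain[0]["hash"] = self._hash(self.chain[0])

        @staticmethod
        def _hash(block):
            payload = json.dumps({k: v for k, v in block.items() if k != "hash"}, sort_keys=True)
            return hashlib.sha256(payload.encode()).hexdigest()

        def register(self, product_id, attributes):
            """Producer side: append a record for one genuine item and return its hash tag."""
            block = {"index": len(self.chain),
                     "record": {"product_id": product_id, "attributes": attributes,
                                "timestamp": time.time()},
                     "prev_hash": self.chain[-1]["hash"]}
            block["hash"] = self._hash(block)
            self.chain.append(block)
            return block["hash"]

        def verify(self, product_hash):
            """Consumer side: does this hash exist in an intact (unmodified) chain?"""
            intact = all(self.chain[i]["prev_hash"] == self.chain[i - 1]["hash"]
                         for i in range(1, len(self.chain)))
            return intact and any(b.get("hash") == product_hash for b in self.chain[1:])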

Keywords: fashion, infringement, blockchain, artificial intelligence, textiles supply chain

Procedia PDF Downloads 247
6099 Security Practices of the European Union on Migration: An Analysis of Frontex Within the Framework of Biopolitics

Authors: Gizem Ertürk, Nursena Dinç

Abstract:

The Aegean area has always been an important transit point for migration; however, the establishment of the European Union gave further impetus to the migration phenomenon and increased the significance of the area within this context. The migration waves have become more visible in the area in recent decades, and particularly after the “2015 Migration Crisis,” the issue has been subject to further securitization in the EU. In this conjuncture, Frontex, an agency set up by the EU in 2005 for the purpose of managing and coordinating border control efforts, has become more active in the relevant area but has, at the same time, taken actions that are questionable in the context of human rights. This paper problematizes the rationality behind the existence and practices of such a structure and attempts a political and legal analysis of the security practices of the European Union against migration within a framework based on the biopolitics approaches of Michel Foucault and Giorgio Agamben. The dataset of this paper, which focuses on the agency in question by taking it as a case, is formed by making use of the existing literature on the EU’s security policies, the relevant official texts of the Union, and Frontex reports on migration practices in and around the Aegean Sea.

Keywords: migration, biopolitics, Frontex, security, European Union, securitization

Procedia PDF Downloads 124
6098 An Unsupervised Domain-Knowledge Discovery Framework for Fake News Detection

Authors: Yulan Wu

Abstract:

With the rapid development of social media, the issue of fake news has gained considerable prominence, drawing the attention of both the public and governments. The widespread dissemination of false information poses a tangible threat across multiple domains of society, including politics, the economy, and health. However, much research has concentrated on supervised models trained within specific domains, and their effectiveness diminishes when they are applied to identify fake news across multiple domains. To solve this problem, some approaches based on domain labels have been proposed. By assigning news items to their specific domain in advance, classifiers for the corresponding field may judge fake news more accurately. However, these approaches disregard the fact that news records can pertain to multiple domains, resulting in a significant loss of valuable information. In addition, the datasets used for training must all be domain-labeled, which creates unnecessary complexity. To solve these problems, an unsupervised domain knowledge discovery framework for fake news detection is proposed. First, to effectively retain the multi-domain knowledge of the text, a low-dimensional domain embedding is generated for each news text. Subsequently, a feature extraction module utilizing the unsupervisedly discovered domain embeddings extracts comprehensive features of the news. Finally, a classifier is employed to determine the authenticity of the news. To verify the proposed framework, tests are conducted on existing widely used datasets, and the experimental results demonstrate that this method is able to improve detection performance for fake news across multiple domains. Moreover, even on datasets that lack domain labels, this method can still effectively transfer domain knowledge, which can reduce the time consumed by tagging without sacrificing detection accuracy.
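
A minimal sketch of the pipeline shape described above (unsupervised domain embedding, feature extraction, classification) is given below. It uses TF-IDF with truncated SVD as a stand-in for the unspecified embedding model, and toy texts and labels; none of this reflects the paper's actual architecture or data.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy data; real work would use a labeled multi-domain fake news corpus.
    texts = ["vaccine cures everything overnight", "election results certified by officials",
             "miracle diet melts fat instantly", "central bank raises interest rates"]
    labels = [1, 0, 1, 0]  # 1 = fake, 0 = real

    # Unsupervised, low-dimensional "domain embedding" of each news text.
    domain_embedder = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2, random_state=0))
    domain_vecs = domain_embedder.fit_transform(texts)

    # Content features concatenated with the domain embedding, then a classifier.
    content_vecs = TfidfVectorizer().fit_transform(texts).toarray()
    features = np.hstack([content_vecs, domain_vecs])
    clf = LogisticRegression(max_iter=1000).fit(features, labels)
    print(clf.predict(features))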

Keywords: fake news, deep learning, natural language processing, multiple domains

Procedia PDF Downloads 72
6097 Finite Element Analysis of Connecting Rod

Authors: Mohammed Mohsin Ali H., Mohamed Haneef

Abstract:

The connecting rod transmits the piston load to the crank, causing the latter to turn, thus converting the reciprocating motion of the piston into the rotary motion of the crankshaft. Connecting rods are subjected to forces generated by mass and fuel combustion. This study investigates and compares the fatigue behavior of forged steel, powder forged, and cold-quenched ASTM A514 steel connecting rods. The objective is to suggest a new material with reduced weight and cost and increased fatigue life. This has entailed performing a detailed load analysis. Therefore, this study has dealt with two subjects: first, dynamic load and stress analysis of the connecting rod, and second, optimization for material, weight, and cost. In the first part of the study, the loads acting on the connecting rod as a function of time were obtained. Based on the observations of the dynamic FEA, static FEA, and the load analysis results, the load for the optimization study was selected. It is the conclusion of this study that the connecting rod can be designed and optimized under a load range comprising tensile and compressive loads. The tensile load corresponds to a 360° crank angle at the maximum engine speed. The compressive load corresponds to the peak gas pressure. Furthermore, the existing connecting rod can be replaced with a new connecting rod made of cold-quenched ASTM A514 steel that is 12% lighter and 28% cheaper.

Keywords: connecting rod, ASTM A514 cold-quenched material, static analysis, fatigue analysis, stress life approach

Procedia PDF Downloads 291
6096 Behavior of Masonry Infill in Structures Subjected to Horizontal Loads

Authors: Mezigheche Nawel, Gouasmia Abdelhacine, Athmani Allaeddine, Merzoud Mouloud

Abstract:

Masonry infill walls are inevitable in self-supporting structures, but their contribution to resisting earthquake loads is generally neglected in structural analyses. The principal aim of this work, through a numerical study of the behavior of masonry infill walls in structures subjected to horizontal loads, is to propose, by finite element numerical modeling, a more reliable approach that is faster and closer to reality. In this study, a 3D finite element analysis was developed to study the behavior of masonry infill walls in structures subjected to horizontal loads; the finite element software used was ABAQUS. It is observed that the more rigid the masonry infill, the more rigid the structure, so we can conclude that the infill brings an additional rigidity to the structure that should not be neglected. It is also observed that when the frame is subjected to horizontal loads, it separates from the infill along the tension diagonal.

Keywords: finite element, masonry infill walls, rigidity of the masonry, tension diagonal

Procedia PDF Downloads 476
6095 Optimization of Culture Conditions of Paecilomyces tenuipes, an Entomopathogenic Fungus Inoculated into the Silkworm Larva, Bombyx mori

Authors: Sunghee Nam

Abstract:

The entomopathogenic fungus studied here is a Cordyceps-type species that is isolated from dead silkworms and cicadas. Fungi on cicadas were described in old Chinese medicinal books, and from ancient times, vegetable wasps and plant worms were widely known to contain active substances and have been studied for pharmacological use. Among the many fungi belonging to the genus Cordyceps, Cordyceps sinensis has been demonstrated to yield natural products possessing various biological activities and many bioactive components. It is commonly used to replenish the kidney and soothe the lung, and for the treatment of fatigue. Due to their commercial and economic importance, the demand for Cordyceps has increased rapidly. However, the supply of Cordyceps specimens could not meet the increasing demand because of the sole dependence on field collection and habitat destruction: it is difficult to obtain many insect hosts in nature, and the edibility of the host insect needs to be verified from a pharmacological standpoint. Recently, this setback was overcome when P. tenuipes was cultivated on a large scale using the silkworm as host. Pharmacological effects of P. tenuipes cultured on silkworm, such as strengthening immune function, anti-fatigue and anti-tumor activity, and supporting liver function, have been demonstrated, and such products are widely commercialized. In this study, we attempted to establish a method for the stable growth of P. tenuipes on silkworm hosts and an optimal condition for synnemata formation. To determine optimum culturing conditions, temperature and light conditions were varied. The length and number of synnemata were highest at 25 °C and 100-300 lux illumination. On average, the synnemata of wild P. tenuipes measure 70 mm in length and number about 20; those of the cultured strain were relatively shorter and more numerous. The number of synnemata may have increased as a result of inoculating the host with highly concentrated conidia, while the length may have decreased due to limited nutrition per individual. It is notable that changes in light illumination cause morphological variations in the synnemata. However, regulation of light and temperature alone could not produce stromatal structures such as perithecia, asci, and ascospores.

Keywords: optimization of culture conditions, Paecilomyces tenuipes, entomopathogenic fungi, silkworm larva, Bombyx mori

Procedia PDF Downloads 243
6094 Detecting Geographically Dispersed Overlay Communities Using Community Networks

Authors: Madhushi Bandara, Dharshana Kasthurirathna, Danaja Maldeniya, Mahendra Piraveenan

Abstract:

Community detection is an extremely useful technique in understanding the structure and function of a social network. The Louvain algorithm, which is based on the Newman-Girvan modularity optimization technique, is extensively used as a computationally efficient method to extract the communities in social networks. It has been suggested that nodes that are in close geographical proximity have a higher tendency of forming communities. Variants of the Newman-Girvan modularity measure, such as dist-modularity, try to normalize the effect of geographical proximity to extract geographically dispersed communities, at the expense of losing information about the geographically proximate communities. In this work, we propose a method to extract geographically dispersed communities while preserving the information about the geographically proximate communities, by analyzing the ‘community network’, where the centroids of communities are considered as network nodes. We suggest that the inter-community link strengths, normalized over the community sizes, may be used to identify and extract the ‘overlay communities’. The overlay communities have relatively higher link strengths, despite being relatively far apart in their spatial distribution. We apply this method to the Gowalla online social network, which contains the geographical signatures of its users, and identify the overlay communities within it.
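
A minimal sketch of the procedure described above (Louvain communities, a community network of centroids, size-normalized inter-community link strengths) is shown below using networkx. The graph, coordinates, and thresholds are toy placeholders rather than the Gowalla data.

    import itertools
    import networkx as nx

    # G: social graph; pos: node -> (x, y) coordinates. Toy stand-ins for the Gowalla data.
    G = nx.karate_club_graph()
    pos = {n: (float(n % 5), float(n // 5)) for n in G}  # placeholder coordinates

    communities = nx.community.louvain_communities(G, seed=42)

    # Build the "community network": centroids as nodes, size-normalized link strengths as edges.
    centroids = {i: tuple(sum(c) / len(c) for c in zip(*(pos[n] for n in comm)))
                 for i, comm in enumerate(communities)}
    C = nx.Graph()
    C.add_nodes_from(centroids)
    for i, j in itertools.combinations(range(len(communities)), 2):
        links = sum(1 for u in communities[i] for v in communities[j] if G.has_edge(u, v))
        if links:
            strength = links / (len(communities[i]) * len(communities[j]))
            C.add_edge(i, j, weight=strength)

    # Candidate "overlay" pairs: strong normalized links despite distant centroids
    # (both thresholds below are arbitrary illustration values).
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    overlay = [(i, j) for i, j, d in C.edges(data=True)
               if d["weight"] > 0.1 and dist(centroids[i], centroids[j]) > 2.0]
    print(overlay)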

Keywords: social networks, community detection, modularity optimization, geographically dispersed communities

Procedia PDF Downloads 221
6093 Development of an Integrated Framework for Life-Cycle Economic, Environmental and Human Health Impact Assessment for Reclaimed Water Use in Water Systems of Various Scales

Authors: Yu-Yao Wang, Xiao-Meng Hu, Joanne Yeung, Xiao-Yan Li

Abstract:

The high private cost and unquantified external cost limit the development of reclaimed water. In this study, an integrated framework comprising life cycle assessment (LCA), quantitative microbial risk assessment (QMRA), and life cycle costing (LCC) was developed to evaluate both the private and external costs of reclaimed water supply in water systems of various scales. LCA assesses the environmental impacts, and QMRA estimates the associated pathogenic impacts. These impacts are monetized as external costs and analyzed together with the private cost by LCC to compute the total life cycle cost. The framework evaluated the Hong Kong urban water system in a baseline scenario (BS) and five wastewater reuse scenarios (RS): RSI, substituting reclaimed water for freshwater in toilet flushing only; RSII, substituting it for both freshwater and seawater in toilet flushing; RSIII, using reclaimed water for all non-potable uses; RSIV, using reclaimed water for all non-potable uses and indirect potable uses; and RSV, non-potable use and indirect potable use by conveying 100% of the reclaimed water to recharge the reservoirs. The results show that substituting reclaimed water for freshwater and seawater in toilet flushing has the lowest total life cycle cost, indicating that it is the most cost-effective option for Hong Kong. Meanwhile, the evaluation results show that the external cost of each scenario is comparable to the corresponding private cost, indicating the importance of including a comprehensive external cost evaluation in the private cost assessment of water systems with reclaimed water supply.
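
The cost aggregation at the heart of the framework, adding monetized LCA and QMRA impacts to the private LCC cost, can be sketched as below. All values and monetization factors are invented placeholders, not results from the study.

    # Illustrative only: units and values are placeholders, not the study's data.
    private_cost = 0.85                       # USD per m3 of reclaimed water (LCC result)
    lca_impacts = {"CO2_kg": 0.40, "eutrophication_kg_PO4eq": 0.0005}   # per m3 (LCA result)
    qmra_daly = 1.0e-7                        # disability-adjusted life years per m3 (QMRA result)

    monetization = {"CO2_kg": 0.05, "eutrophication_kg_PO4eq": 4.0}     # USD per unit impact
    value_per_daly = 100000.0                                           # USD per DALY

    external_cost = sum(lca_impacts[k] * monetization[k] for k in lca_impacts) \
                    + qmra_daly * value_per_daly
    total_life_cycle_cost = private_cost + external_cost
    print(f"external = {external_cost:.4f} USD/m3, total = {total_life_cycle_cost:.4f} USD/m3")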

Keywords: life cycle assessment, life cycle costing, quantitative microbial risk assessment, water reclamation, reclaimed water, alternative water resources

Procedia PDF Downloads 110
6092 Numerical Simulation of Ultraviolet Disinfection in a Water Reactor

Authors: H. Shokouhmand, H. Sobhani, B. Sajadi, M. Degheh

Abstract:

In recent years, experimental and numerical investigation of water UV reactors has increased significantly. The main drawback of experimental methods is the limited and expensive survey of UV reactor features. In this study, a CFD model utilizing the Eulerian-Lagrangian framework is applied to analyze the disinfection performance of a closed-conduit reactor which contains four UV lamps perpendicular to the flow. A discrete ordinates (DO) model was employed to evaluate the UV irradiance field. To investigate the importance of each of the lamps for the inactivation performance, in addition to the reference model (with four bright lamps), several models with one or two bright lamps in various arrangements were considered. All results were reported for three inactivation kinetics. The results showed that the log inactivation of the model with the two central lamps bright was between 88 and 99 percent, close to the reference model results. Also, the closer the lamps are to the main flow region, the greater their effect on microbial inactivation. The effects of some operational parameters, such as water flow rate, inlet water temperature, and lamp power, were also studied.
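
For the inactivation step, a hedged sketch of how reactor-average log inactivation can be computed from a particle dose distribution under first-order (Chick-Watson) kinetics is given below. The dose sample and rate constant are placeholders, not values from the paper, and the DO irradiance field and Lagrangian particle tracking are not reproduced here.

    import numpy as np

    def log_inactivation(doses_mj_cm2, k):
        """Reactor-average log inactivation for first-order kinetics N/N0 = 10**(-k*D).

        doses_mj_cm2: UV dose received by each tracked particle (from Lagrangian tracking).
        k: inactivation rate constant in cm^2/mJ (organism-specific assumption).
        """
        doses = np.asarray(doses_mj_cm2, dtype=float)
        survival = np.mean(10.0 ** (-k * doses))   # average survival over all particles
        return -np.log10(survival)

    # Placeholder dose sample and rate constant (not the paper's values).
    doses = np.random.default_rng(0).gamma(shape=4.0, scale=10.0, size=10000)
    print(round(log_inactivation(doses, k=0.1), 2))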

Keywords: Eulerian-Lagrangian framework, inactivation kinetics, log inactivation, water UV reactor

Procedia PDF Downloads 237
6091 The Political Economy of the Global Climate Change Adaptation Initiatives: A Case Study on the Global Environmental Facility

Authors: Anar Koli

Abstract:

After the Paris Agreement in 2015, a comprehensive initiative towards adaptation to climate change is emerging from both developed and developing countries. The Global Environmental Facility (GEF), which is financing a global portfolio of adaptation projects and programs in over 124 countries, is playing a significant role in a new financing framework that includes the concept of “climate-resilient development”. However, both the adaptation and sustainable development paradigms remain contested, especially the role of the multilateral institutions with their technical and financial assistance to the developing world. Focusing on the adaptation initiatives of the GEF, this study aims to understand to what extent the global multilateral institutions, particularly the GEF, are contributing to climate-resilient development. From the political ecology perspective, the argument of this study is that the global financial framework is highly politicized, and that understanding the contribution of global institutions to global climate change needs to consider both the response and the causal perspectives. A holistic perspective, which includes the contribution of the GEF both as a response to climate change and as a cause of global climate change, is needed to understand the broader relation between the environment and the political economy. The study intends to make a critical analysis of the way in which the political economy structure and the environment are related, along with the social and ecological implications. It does not provide a narrow description of institutional responses to climate change; rather, it looks at how global institutions are influencing the relationship between global ecologies and economies. This study thus developed a framework combining global governance and political economy perspectives. This framework includes the environment-society relation, the environment-political economy linkage, global institutions as the orchestrator, and the division between the North and the South. Through the analysis of the GEF as the orchestrator of global governance, this study helps to understand how the GEF coordinates the interactions between the North and the South and responds to global climate-resilient development. Through the other components of the framework, the study explains how the role of the global institutions is related to the causes of human-induced global climate change. The study employs a case study based on both quantitative and qualitative data. Along with GEF reports and datasets, this study draws from an eclectic range of literature across disciplines to explain the broader relation between the environment and the political economy. Based on the case study of the GEF, the study found that the GEF has made positive contributions to developing countries’ capacity in terms of the sustainable development goals and local institutional development. However, through a critical holistic analysis, this study found that this contribution to resilient development helps the developing countries conform to the fossil fuel-based capitalist political economy. The global governance institution thus contributes both to the pro-market environment-society relation and to the consequences of this relation.

Keywords: climate change adaptation, global environmental facility (GEF), political economy, the North-South relation

Procedia PDF Downloads 211
6090 Cybersecurity Engineering BS Degree Curricula Design Framework and Assessment

Authors: Atma Sahu

Abstract:

After 9/11, there will only be cyberwars. As cyberwars increase in intensity, so do the country's cybersecurity workforce hiring and retention issues. Currently, many organizations have unfilled cybersecurity positions, and to a lesser degree, their cybersecurity teams are understaffed. Therefore, there is a critical need to develop a new program to help meet the market demand for cybersecurity engineers (CYSE) and personnel. Coppin State University in the United States was responsible for developing a cybersecurity engineering BS degree program. The CYSE curriculum design methodology consisted of three parts. First, the pervasive framework of the ACM Cross-Cutting Concepts standard helped curriculum designers and students explore connections among the core courses' knowledge areas and reinforce the security mindset conveyed in them. Second, the core course context was created to assist students in resolving security issues in authentic cyber situations involving cybersecurity systems in various aspects of industrial work, while adhering to the NIST standards framework. The last part of the CYSE curriculum design was the institutional student learning outcomes (SLOs), integrated and aligned in content courses, representing more detailed outcomes and emphasizing what learners can do over merely what they know. The CYSE program's core courses express competencies and learning outcomes using action verbs from Bloom's Revised Taxonomy. This aspect of the CYSE BS degree program's design is based on three pillars: the ACM, NIST, and SLO standards, which all CYSE curriculum designers should know. This CYSE curriculum design methodology also addresses how students and the CYSE program will be assessed and evaluated. It is also critical that educators, program managers, and students understand the importance of staying current in this fast-paced CYSE field.

Keywords: cyber security, cybersecurity engineering, systems engineering, NIST standards, physical systems

Procedia PDF Downloads 68
6089 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival dataset is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modeling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method is found to be the best. In this paper, the TPGE model is also used to analyze the lifetime data in the Bayesian paradigm. Results are evaluated on the above-mentioned real survival dataset. The analytic approximation and simulation methods are implemented using software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
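
The paper's implementation is in R and JAGS; as a language-neutral illustration of the independent Metropolis step mentioned above, a minimal Python sketch on a toy target is given below. The target and proposal are placeholders, not the TPGE survival model.

    import numpy as np
    from scipy import stats

    def independent_metropolis(log_post, proposal, n_samples=5000, seed=0):
        """Independence sampler: proposals come from a fixed distribution and are
        accepted with probability min(1, p(x')q(x) / (p(x)q(x')))."""
        rng = np.random.default_rng(seed)
        x = proposal.rvs(random_state=rng)
        samples = []
        for _ in range(n_samples):
            x_new = proposal.rvs(random_state=rng)
            log_alpha = (log_post(x_new) + proposal.logpdf(x)) - (log_post(x) + proposal.logpdf(x_new))
            if np.log(rng.uniform()) < log_alpha:
                x = x_new
            samples.append(x)
        return np.array(samples)

    # Toy target: density proportional to Gamma(3, scale=0.5); proposal is a normal
    # truncated to x > 0 (for truncnorm, a = (0 - loc) / scale) so it covers the support.
    log_post = lambda x: stats.gamma.logpdf(x, a=3.0, scale=0.5) if x > 0 else -np.inf
    proposal = stats.truncnorm(a=-1.5, b=np.inf, loc=1.5, scale=1.0)
    draws = independent_metropolis(log_post, proposal)
    print(draws.mean())  # should be near the Gamma mean of 1.5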

Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation

Procedia PDF Downloads 513
6088 The Revenue Management Implementation and Its Complexity in the Airline Industry: An Empirical Study on the Egyptian Airline Industry

Authors: Amr Sultan, Sara Elgazzar, Breksal Elmiligy

Abstract:

The airline industry is a growing industry facing severe competition. In this context, it is important to utilize the revenue management (RM) concept and practice in order to develop the pricing strategy. There is a pressing need for RM to assist the airlines and their associates in reducing costs and recovering revenue, which in turn will boost the airline industry's performance. The complexity of RM imposes enormous challenges on the airline industry. Several studies have addressed RM adoption in the airline industry, while there is limited research on implementing RM and its complexity in developing countries such as Egypt. This research presents a research schema for the implementation of RM in the Egyptian airline industry. The research aims at investigating and demonstrating the complexities faced in implementing RM in the airline industry, upon which the research provides a comprehensive understanding of how to overcome these complexities while adopting RM in the Egyptian airline industry. An empirical study was conducted on the Egyptian airline sector based on a sample of four airlines (Egyptair, British Airways, KLM, and Lufthansa). The empirical study used a mix of qualitative and quantitative approaches. First, in-depth interviews were carried out to analyze the status of the Egyptian airline sector and the main challenges faced by the airlines. Then, a structured survey of the three different parties in the airline industry (airlines, airfreight forwarders, and passengers) was conducted in order to investigate the main complexity factors from the different parties' points of view. Finally, a focus group was conducted to develop a best practice framework to overcome the complexities facing RM adoption in the Egyptian airline industry. The research provides an original contribution to knowledge by creating a framework to overcome the complexities and challenges of adopting RM in the airline industry generally and the Egyptian airline industry particularly. The framework can be used as an RM tool to increase the effectiveness and efficiency of the Egyptian airline industry's performance.

Keywords: revenue management, airline industry, revenue management complexity, Egyptian airline industry

Procedia PDF Downloads 377
6087 Condition Optimization for Trypsin and Chymotrypsin Activities in Economic Animals

Authors: Mallika Supa-Aksorn, Buaream Maneewan, Jiraporn Rojtinnakorn

Abstract:

For animals, trypsin and chymotrypsin are the two proteases that play an important role in protein digestion and are involved in growth rate. In many animals, these two enzymes are used as growth indicators in relation to feed. Although enzyme assay at the optimal condition is essential for accurate activity determination, there are few reports on trypsin and chymotrypsin. Therefore, in this study, optimization of pH and temperature for trypsin (T) and chymotrypsin (C) in economic species, i.e., Nile tilapia (Oreochromis niloticus), sand goby (Oxyeleotoris marmoratus), giant freshwater prawn (Macrobachium rosenberchii), and native chicken (Gallus gallus), was investigated. Each enzyme of each species was assayed for its specific activity with variation of pH in the range of 2-12 and temperature in the range of 30-80 °C. It was revealed that, for Nile tilapia, T had its optimal condition at pH 9 and a temperature of 50-80 °C, whereas C had its optimal condition at pH 8 and a temperature of 60 °C. For sand goby, T had its optimal condition at pH 7 and a temperature of 50 °C, while C had its optimal condition at pH 11 and a temperature of 70-75 °C. For juvenile freshwater prawn, T had its optimal condition at pH 10-11 and a temperature of 60-65 °C, and C had its optimal condition at pH 8 and a temperature of 70 °C. For starter native chicken, T had its optimal condition at pH 7 and a temperature of 70 °C, whereas C had its optimal condition at pH 8 and a temperature of 60 °C. This information on optimal conditions will be highly valuable for the actual measurement of T and C activities, benefiting growth and feed analysis.

Keywords: trypsin, chymotrypsin, Oreochromis niloticus, Oxyeleotoris marmoratus, Macrobachium rosenberchii, Gallus gallus

Procedia PDF Downloads 244
6086 Impact of Marketing Orientation on Environment and Firm’s Performance

Authors: Sabita Mahapatra

Abstract:

‘Going green’ has been an emerging issue worldwide, driving companies to continuously enhance their green capabilities and implement innovative green practices to protect the environment and improve business performance. Green has become a contemporary business environmental issue. The resource-advantage theory is adopted in the present study to observe the impact of marketing orientation and green innovation practices on environmental and firm performance. Small and medium firms, compared to large firms, have a different approach towards market orientation as a strategic tool. The present study proposes a conceptual framework regarding the impact of market orientation on environmental and firm performance through green innovation practices in the context of small and medium enterprises (SMEs). The propositions developed in the present paper provide scope for future research to validate the conceptual framework in an emerging economy like India.

Keywords: market orientation, green innovation practices, environment performance, corporate performance, emerging market

Procedia PDF Downloads 306
6085 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms

Authors: Rikson Gultom

Abstract:

Hate speech and abusive language on social media are difficult to detect; usually they are detected only after becoming viral in cyberspace, which is of course too late for prevention. An early detection system with fairly good accuracy is needed so that it can reduce conflicts in society caused by social media postings that attack individuals, groups, and governments in Indonesia. The purpose of this study is to find an early detection model for the Twitter social media platform using the machine learning method with the highest accuracy among the several methods studied. In this study, the support vector machine (SVM), Naïve Bayes (NB), and Random Forest Decision Tree (RFDT) methods were compared with the support vector machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and Random Forest Decision Tree with genetic algorithm (RFDT-GA). The study produced a comparison table for the accuracy of the hate speech and abusive language detection models, presented as a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and concluded which model achieved the highest accuracy.
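
A minimal sketch of the genetic-algorithm optimization idea applied to an SVM (searching C and gamma by cross-validated accuracy) is shown below. The dataset is synthetic and the GA operators and parameter ranges are illustrative choices, not the settings used in the study.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=20, random_state=0)  # stand-in for tweet features

    def fitness(ind):
        """Cross-validated accuracy of an SVM with C = 10**ind[0], gamma = 10**ind[1]."""
        clf = SVC(C=10.0 ** ind[0], gamma=10.0 ** ind[1])
        return cross_val_score(clf, X, y, cv=3).mean()

    # Minimal genetic algorithm: tournament selection, averaging crossover, Gaussian mutation.
    pop = rng.uniform(low=[-2, -4], high=[3, 1], size=(20, 2))   # genes: log10(C), log10(gamma)
    for generation in range(15):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[[max(rng.choice(len(pop), 3), key=lambda i: scores[i]) for _ in range(len(pop))]]
        children = (parents + parents[rng.permutation(len(parents))]) / 2.0   # crossover
        children += rng.normal(scale=0.3, size=children.shape)                # mutation
        pop = np.clip(children, [-2, -4], [3, 1])

    best = max(pop, key=fitness)
    print("best log10(C), log10(gamma):", best, "accuracy:", round(fitness(best), 3))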

Keywords: abusive language, hate speech, machine learning, optimization, social media

Procedia PDF Downloads 113
6084 Solar Building Design Using GaAs PV Cells for Optimum Energy Consumption

Authors: Hadis Pouyafar, D. Matin Alaghmandan

Abstract:

Gallium arsenide (GaAs) solar cells are widely used in applications like spacecraft and satellites because they have a high absorption coefficient and efficiency and can withstand high-energy particles such as electrons and protons. With the energy crisis, there is a growing need for efficient and cost-effective solar cells. GaAs cells, with their 46% efficiency compared to silicon cells' 23%, can be utilized in buildings to achieve nearly zero emissions. This way, we can use the available irradiation and convert more solar energy into electricity. The III-V semiconductors used in these cells offer better performance than other available technologies. However, despite these advantages, Si cells dominate the market due to their lower prices. In our study, we used software from the start to gather all the necessary information, aiming to design an optimal building that harnesses the full potential of solar energy. Our modeling results point to a promising future for GaAs cells. We utilized the Grasshopper plugin for modeling and optimization purposes, and relied on the Ladybug and Honeybee plugins to assess radiation, weather data, solar energy levels, and other factors. We have shown that silicon solar cells may not always be the best choice for meeting electricity demand, particularly when higher power output is required. Therefore, when it comes to power consumption and the available surface area for photovoltaic (PV) installation, it may be necessary to consider more efficient solar cell options, such as GaAs solar cells. By considering the building requirements and utilizing GaAs technology, we were able to optimize the PV surface area.
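
A hedged back-of-envelope sketch of how cell efficiency translates into required PV area is given below, using the efficiencies quoted in the abstract; the irradiation, performance ratio, and demand figures are invented placeholders, not values from the study.

    # Illustrative sizing only; irradiation, performance ratio, and demand are assumptions,
    # while the two efficiencies are the ones quoted in the abstract.
    annual_irradiation = 1700.0      # kWh per m2 per year on the PV surface (assumed)
    performance_ratio = 0.8          # assumed system losses (inverter, temperature, soiling)
    annual_demand = 50000.0          # kWh per year for the building (assumed)

    for name, efficiency in [("GaAs", 0.46), ("silicon", 0.23)]:
        yield_per_m2 = annual_irradiation * efficiency * performance_ratio
        area_needed = annual_demand / yield_per_m2
        print(f"{name}: {yield_per_m2:.0f} kWh/m2/yr -> about {area_needed:.0f} m2 of PV needed")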

Keywords: gallium arsenide (GaAs), optimization, sustainable building, GaAs solar cells

Procedia PDF Downloads 69
6083 Methodology: A Review of Modelling and Predictability of Embankments in Soft Ground

Authors: Bhim Kumar Dahal

Abstract:

Transportation network development in developing countries is proceeding at a rapid pace. The majority of these networks are railways and expressways, which pass through diverse topography, landforms, and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses on the various advanced ground improvement techniques used to improve soft soil, the modelling approaches, and their predictability for embankment construction. The ground improvement techniques can be broadly classified into three groups, i.e., the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with some case studies. Various methods were used in modelling the embankments, from simple one-dimensional to complex three-dimensional models, using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and sometimes the predictions deviate by more than 60% from the monitored values despite the same level of sophistication. This deviation is found to be mainly due to the selection of the constitutive model, assumptions made during different stages, deviations in the selection of model parameters, and simplification during physical modelling of the ground condition. The deviation can be reduced by using optimization processes, optimization tools, and sensitivity analysis of the model parameters, which will guide the selection of appropriate model parameters.

Keywords: cement, improvement, physical properties, strength

Procedia PDF Downloads 161
6082 A Generative Adversarial Framework for Bounding Confounded Causal Effects

Authors: Yaowei Hu, Yongkai Wu, Lu Zhang, Xintao Wu

Abstract:

Causal inference from observational data is finding wide application in many fields. However, unidentifiable situations, where causal effects cannot be uniquely computed from observational data, pose critical barriers to applying causal inference to complicated real applications. In this paper, we develop a bounding method for estimating the average causal effect (ACE) under unidentifiable situations due to hidden confounders. We propose to parameterize the unknown exogenous random variables and structural equations of a causal model using neural networks and implicit generative models. Then, with an adversarial learning framework, we search the parameter space to explicitly traverse causal models that agree with the given observational distribution and find those that minimize or maximize the ACE to obtain its lower and upper bounds. The proposed method does not make any assumption about the data-generating process or the types of the variables. Experiments using both synthetic and real-world datasets show the effectiveness of the method.
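
The core idea, searching over causal models with a hidden confounder that reproduce the observed distribution and taking the extremes of the ACE, can be illustrated with a much simpler discrete toy than the paper's neural adversarial method. The sketch below uses a binary treatment, outcome, and confounder with a soft penalty to match an invented observed distribution; it is not the authors' algorithm.

    import numpy as np
    from scipy.optimize import minimize

    # Invented observed joint distribution P(X, Y) for binary treatment X and outcome Y.
    p_obs = np.array([[0.30, 0.20],   # rows: X = 0, 1; columns: Y = 0, 1
                      [0.15, 0.35]])

    def unpack(theta):
        """theta -> P(U=1), P(X=1|U=u), P(Y=1|X=x,U=u) via a sigmoid squashing."""
        s = 1.0 / (1.0 + np.exp(-theta))
        return s[0], s[1:3], s[3:7].reshape(2, 2)   # pu1, px_u[u], py_xu[x, u]

    def model_quantities(theta):
        pu1, px_u, py_xu = unpack(theta)
        pu = np.array([1 - pu1, pu1])
        p_model = np.zeros((2, 2))
        for u in range(2):
            for x in range(2):
                px = px_u[u] if x == 1 else 1 - px_u[u]
                for y in range(2):
                    py = py_xu[x, u] if y == 1 else 1 - py_xu[x, u]
                    p_model[x, y] += pu[u] * px * py
        ace = sum(pu[u] * (py_xu[1, u] - py_xu[0, u]) for u in range(2))
        return p_model, ace

    def objective(theta, sign):
        p_model, ace = model_quantities(theta)
        fit_penalty = 1e4 * np.sum((p_model - p_obs) ** 2)   # softly enforce agreement with data
        return sign * ace + fit_penalty

    bounds = []
    for sign in (+1.0, -1.0):   # +1 minimizes the ACE, -1 maximizes it
        best = min((minimize(objective, x0=np.random.default_rng(s).normal(size=7), args=(sign,))
                    for s in range(10)), key=lambda r: r.fun)
        bounds.append(model_quantities(best.x)[1])
    print("approximate ACE lower/upper bound (toy):", round(min(bounds), 3), round(max(bounds), 3))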

Keywords: average causal effect, hidden confounding, bound estimation, generative adversarial learning

Procedia PDF Downloads 173
6081 The Role of Learning in Stimulation Policies to Increase Participation in Lifelong Development: A Government Policy Analysis

Authors: Björn de Kruijf, Arjen Edzes, Sietske Waslander

Abstract:

In a rapidly changing society, lifelong development is seen by politicians and policymakers as a solution to labor market problems. In this paper, we investigate how policy instruments are used to increase participation in lifelong development and on which behavioral principles policy is based. Digitization, automation, and an aging population change society and the labor market accordingly. Skills that were once most sought after in the workforce can become abundantly present. For people to remain relevant in the working population, they need to keep acquiring new skills useful in the current labor market. Many reports have been written that focus on the role of lifelong development in this changing society and how lifelong development can help people adapt and stay relevant. Inspired by these reports, governments have implemented a broad range of policies to support participation in lifelong development. The question we ask ourselves is how government policies promote participation in lifelong development. This stems from a complex interplay of policy instruments and learning. Regulation, economic instruments, and soft instruments can be combined to promote lifelong development, and different types of education further complicate policies on lifelong development. The literature suggests that different stages in people’s lives might warrant different methods of learning, and governments could anticipate this in their policies. In order to influence people’s behavior, the government can tap into a broad range of sociological, psychological, and (behavioral) economic principles. The traditional economic assumption that behavior is rational is known to be only partially true, and the government can use many biases in human behavior to stimulate participation in lifelong development. In this paper, we also try to find out which biases the government taps into to promote participation, if it taps into any of these biases at all. The goal of this paper is to analyze government policies intended to promote participation in lifelong development. To do this, we develop a framework to analyze the policies on lifelong development. We specifically incorporate the role of learning and the behavioral principles underlying policy instruments in the framework. We apply this framework to the case of the Netherlands, where we examine a set of policy documents. We single out the policies the government has put in place and how they are vertically and horizontally related. Afterward, we apply the framework and classify the individual policies by policy instrument and by type of learning. We find that the Dutch government focuses on formal and non-formal learning in its policy instruments. However, the literature suggests that learning at a later age is mainly done in an informal manner, through experiences.

Keywords: learning, lifelong development, policy analysis, policy instruments

Procedia PDF Downloads 70
6080 Identifying Strategies and Techniques for the Egyptian Medium and Large Size Contractors to Respond to Economic Hardship

Authors: Michael Salib, Samer Ezeldin, Ahmed Waly

Abstract:

There are numerous challenges and problems facing the construction industry in several countries in the Middle East as a result of various economic and political effects. In Egypt, for example, several construction companies have shut down and left the market since 2016. These companies closed because they did not respond with suitable techniques and strategies that would enable them to survive this period of economic turmoil. This research was conducted in order to identify adequate strategies that could be implemented by Egyptian contractors to allow them to survive and keep competing during such economic hardship periods. Two different techniques were used in order to identify these strategies. First, in-depth research was conducted on companies located in countries that suffered similar economic hardship, to identify the strategies they used in order to survive. Second, interviews were conducted with experts in the construction field in order to list the effective strategies they used that allowed them to survive. Moreover, at the end of each interview, the experts were asked to rate the applicability of the previously identified strategies used in the foreign countries, and then the efficiency of each strategy if used in Egypt. A framework model is developed in order to assist construction companies in choosing the techniques suitable to their company size, by identifying the top-ranked strategies and techniques that should be adopted by the company based on the parameters given to the model. In order to verify this framework, the financial statements of two leading companies in the Egyptian construction market were studied. The first contractor has applied nearly all the top-ranked strategies identified in this paper, while the other contractor has applied only a few of them. Finally, further expert interviews were conducted in order to validate the framework. These experts were asked to test the model and to rate its applicability and effectiveness through a questionnaire.

Keywords: construction management, economic hardship, recession, survive

Procedia PDF Downloads 114
6079 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning

Authors: Federico Pittino, Thomas Arnold

Abstract:

The shredding of waste materials is a key step in the recycling process towards the circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is particularly important also for increasing the machine’s efficiency, thereby reducing the operational costs. In this work, a monitoring system has been developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine has been monitored for one year, and methods for predictive maintenance have been developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, thereby not requiring very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components is investigated. This work proves the benefits of a real-time monitoring system for the efficient operation of industrial shredders.
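
A minimal sketch of one statistical-machine-learning approach consistent with the description above, modeling expected power draw from operating conditions and flagging residual drift as growing wear, is given below. The sensor signals, model choice, and threshold are invented placeholders, not the deployed system.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)

    # Placeholder sensor log: throughput and material hardness explain most of the power draw;
    # a slowly growing wear term (unknown to the model) raises consumption over time.
    n = 2000
    throughput = rng.uniform(5, 20, n)
    hardness = rng.uniform(0.5, 2.0, n)
    wear = np.linspace(0.0, 3.0, n)
    power = 2.0 * throughput * hardness + wear + rng.normal(0, 1.0, n)

    X = np.column_stack([throughput, hardness])
    train = slice(0, 500)                      # assume the knives were fresh at the start
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X[train], power[train])

    residual = power - model.predict(X)        # excess power not explained by operating conditions
    window = 100
    drift = np.convolve(residual, np.ones(window) / window, mode="valid")
    alarm_at = int(np.argmax(drift > 2.0)) if (drift > 2.0).any() else None
    print("maintenance alert at sample:", alarm_at)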

Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning

Procedia PDF Downloads 107
6078 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change causes changes in all aspects of society. While the expansion of renewable energies proceeds, industry has not been convinced by general studies about the potential of demand-side management to reinforce smart grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support for industrial energy management, based on a holistic data analytics approach, is presented. The model is executed on the example of the strategic decision problem of integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of data modeling, analysis, simulation, and optimization steps. This procedure helps to quantify the potential of sustainable production concepts based on the data from a factory. The model is validated with data from a printer, in analogy to a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.
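
For the optimization step at the end of the procedure model, a minimal sketch of scheduling a flexible production load into the cheapest spot-price hours is shown below. The price series, load size, and baseline schedule are invented placeholders, not data from the case study.

    import numpy as np

    # Placeholder day-ahead spot prices (EUR/MWh) for 24 hours and a job needing 6 machine-hours.
    prices = np.array([42, 38, 35, 33, 31, 30, 34, 45, 58, 62, 60, 55,
                       50, 48, 47, 49, 54, 63, 70, 66, 57, 50, 46, 44], dtype=float)
    hours_needed = 6
    load_mw = 0.8                      # MW drawn while the machine runs

    cheapest_hours = np.argsort(prices)[:hours_needed]
    optimized_cost = prices[cheapest_hours].sum() * load_mw
    baseline_cost = prices[8:8 + hours_needed].sum() * load_mw   # naive day-shift schedule

    print(f"baseline: {baseline_cost:.1f} EUR, optimized: {optimized_cost:.1f} EUR, "
          f"run in hours {sorted(cheapest_hours.tolist())}")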

Keywords: data analytics, green production, industrial energy management, optimization, renewable energies, simulation

Procedia PDF Downloads 422
6077 Multi-source Question Answering Framework Using Transformers for Attribute Extraction

Authors: Prashanth Pillai, Purnaprajna Mangsuli

Abstract:

Oil exploration and production companies invest considerable time and effort to extract essential well attributes (like well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source, utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and the related performance were studied on several real-life well technical reports.
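
A minimal sketch of the text side of the retrieval step, ranking pages by the similarity of their embeddings to a query embedding, is given below. It uses the sentence-transformers library and the all-MiniLM-L6-v2 model purely as illustrative stand-ins (the paper does not name its embedding models) and omits the page-image and layout embeddings.

    from sentence_transformers import SentenceTransformer, util

    # Stand-in page texts from a well report; the paper also uses page-image embeddings (omitted here).
    pages = [
        "Well status: suspended. Surface coordinates 58.12 N, 2.34 E.",
        "Daily drilling summary and mud weight log for the 12-1/4 in section.",
        "Wellbore reached target depth of 3450 m MD on 14 March.",
    ]
    query = "What is the total depth of the wellbore?"

    model = SentenceTransformer("all-MiniLM-L6-v2")
    page_emb = model.encode(pages, convert_to_tensor=True)
    query_emb = model.encode(query, convert_to_tensor=True)

    scores = util.cos_sim(query_emb, page_emb)[0]      # cosine similarity of query to each page
    ranking = scores.argsort(descending=True)
    print("most relevant page:", pages[int(ranking[0])])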

Keywords: natural language processing, deep learning, transformers, information retrieval

Procedia PDF Downloads 182
6076 Integrated Performance Management System: A Conceptual Design for PT. XYZ

Authors: Henrie Yunianto, Dermawan Wibisono

Abstract:

PT. XYZ is a family business (private company) in Indonesia that provides educational programs and consultation services. Since its establishment in 2011, the company has run without any strategic management system implemented, though it has survived until now. The management of PT. XYZ sees a huge business opportunity for such a product: even though the targeted market is very specific (a niche), the volume is large (due to the large population of Indonesia) and the number of competitors is currently low. It can be said that the product life cycle is between the introduction and growth stages. It is observed that the number of new entrants (competitors) is increasing; thus, PT. XYZ is considering how to face the intense business rivalry by conducting the business in an appropriate manner. A performance management system is important to implement for business sustainability and growth. The framework chosen is the Integrated Performance Management System (IPMS). The IPMS framework has the advantages of simplicity and of linking business variables and indicators, so that the company can see the connections between all measured factors. The IPMS framework consists of three perspectives: (1) business results, (2) internal processes, and (3) resource availability. Variables and indicators were examined through a deep analysis of the business's external and internal environments, a Strength-Weakness-Opportunity-Threat (SWOT) analysis, and Porter's five forces analysis. Analytical Hierarchy Process (AHP) analysis was then used to quantify the weight of each variable/indicator. AHP is needed since, in this study of PT. XYZ, data on existing performance indicators were not available. Later, once the IPMS is implemented, the real measured data can be examined to determine the weight factor of each indicator using correlation analysis (or other methods). In this study of the IPMS design for PT. XYZ, the analysis shows that, given the current company goals and using the AHP methodology, the critical indicators for each perspective are: (1) business results: customer satisfaction and employee satisfaction; (2) internal processes: marketing performance, supplier quality, production quality, and continuous improvement; (3) resource availability: leadership and company culture and values, personal competences, and productivity. Companies and organizations require a performance management system to help them achieve their vision and mission. Company strategy will be effectively defined and addressed by using a performance management system. The Integrated Performance Management System (IPMS) framework and AHP analysis help us quantify the factors which influence the expected business output.
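
A minimal sketch of the AHP weighting step mentioned above, deriving indicator weights from a pairwise comparison matrix via the principal eigenvector and checking consistency, is given below. The comparison values are illustrative, not PT. XYZ's actual judgments.

    import numpy as np

    # Illustrative pairwise comparison matrix (Saaty scale) for three indicators;
    # entry [i, j] is how much more important indicator i is than indicator j.
    A = np.array([[1.0,   3.0, 5.0],
                  [1/3.0, 1.0, 2.0],
                  [1/5.0, 1/2.0, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio CR = CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty's random index).
    lambda_max = eigvals.real[k]
    ci = (lambda_max - len(A)) / (len(A) - 1)
    cr = ci / 0.58

    print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))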

Keywords: analytical hierarchy process, business strategy, differentiation strategy, integrated performance management system

Procedia PDF Downloads 291
6075 Urban Open Source: Synthesis of a Citizen-Centric Framework to Design Densifying Cities

Authors: Shaurya Chauhan, Sagar Gupta

Abstract:

Prominent urbanizing centres across the globe, like Delhi, Dhaka, or Manila, have exhibited that development often faces a challenge in bridging the gap between the top-down collective requirements of the city and the bottom-up individual aspirations of an ever-diversifying population. When this exclusion is intertwined with rapid urbanization and diversifying urban demography, unplanned sprawl, poor planning, and low-density development emerge as automated responses. In parallel, new ideas and methods of densification and public participation are being widely adopted as sustainable alternatives for the future of urban development. This research advocates a collaborative design method for future development: one that allows rapid application with its prototypical nature and an inclusive approach with mediation between the 'user' and the 'urban', purely with the use of empirical tools. Building upon the concepts and principles of 'open-sourcing' in design, the research establishes a design framework that serves current user requirements while allowing for future citizen-driven modifications. This is synthesized as a three-tiered model: user needs, design ideology, and adaptive details. The research culminates in a context-responsive 'open source project development framework' (hereinafter referred to as OSPDF) that can be used for on-ground field applications. To bring forward specifics, the research looks at a 300-acre redevelopment in the core of a rapidly urbanizing city as a case encompassing extreme physical, demographic, and economic diversity. The suggested measures also integrate the region's cultural identity and social character with the diverse citizen aspirations, using architecture and urban design tools and references from recognized literature. This framework, based on a vision-feedback-execution loop, is used for hypothetical development at the five prevalent scales in design: master planning, urban design, architecture, tectonics, and modularity, in a chronological manner. At each of these scales, the possible approaches and avenues for open-sourcing are identified and validated through trial and error, and subsequently recorded. The research attempts to recalibrate the architectural design process and make it more responsive and people-centric. Analytical tools such as Space, Event, and Movement by Bernard Tschumi and the Five-Point Mental Map by Kevin Lynch, among others, are deeply rooted in the research process. Over the five-part OSPDF, a two-part subsidiary process is also suggested after each cycle of application, for continued appraisal and refinement of the framework and urban fabric over time. The research is an exploration of the possibilities for an architect to adopt the new role of a 'mediator' in the development of contemporary urbanity.

Keywords: open source, public participation, urbanization, urban development

Procedia PDF Downloads 134
6074 Interval Bilevel Linear Fractional Programming

Authors: F. Hamidi, N. Amiri, H. Mishmast Nehi

Abstract:

The bilevel programming (BP) model has been presented for a decision-making process that consists of two decision makers in a hierarchical structure. In fact, BP is a model for a static two-person game (the leader in the upper level and the follower in the lower level) wherein each player tries to optimize his or her personal objective function under dependent constraints; this game is sequential and non-cooperative. The decision-making variables are divided between the two players, and one's choice affects the other's benefit and choices. In other words, BP consists of two nested optimization problems with two objective functions (upper and lower), where the constraint region of the upper-level problem is implicitly determined by the lower-level problem. In real cases, the coefficients of an optimization problem may not be precise, i.e., they may be intervals. In this paper, we develop an algorithm for solving interval bilevel linear fractional programming problems, that is, bilevel problems in which both objective functions are linear fractional, the coefficients are intervals, and the common constraint region is a polyhedron. From the original problem, the best and the worst bilevel linear fractional problems have been derived, and then, using the extended Charnes and Cooper transformation, each fractional problem can be reduced to a linear problem. Then we can find the best and the worst optimal values of the leader's objective function by two algorithms.
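
For reference, a sketch of the classical Charnes-Cooper transformation for a single-level linear fractional program is given below; the extended bilevel version used in the paper is not reproduced, and the notation here is generic rather than the authors'.

    \max_{x}\ \frac{c^{\top}x + c_{0}}{d^{\top}x + d_{0}}
    \quad\text{s.t.}\quad Ax \le b,\ x \ge 0,\qquad d^{\top}x + d_{0} > 0 \ \text{on the feasible set.}

    \text{With } t = \frac{1}{d^{\top}x + d_{0}} \text{ and } y = t\,x, \text{ this becomes the linear program}

    \max_{y,\,t}\ c^{\top}y + c_{0}\,t
    \quad\text{s.t.}\quad Ay - b\,t \le 0,\ d^{\top}y + d_{0}\,t = 1,\ y \ge 0,\ t \ge 0.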

Keywords: best and worst optimal solutions, bilevel programming, fractional, interval coefficients

Procedia PDF Downloads 431
6073 A Grounded Theory on Marist Spirituality/Charism from the Perspective of the Lay Marists in the Philippines

Authors: Nino M. Pizarro

Abstract:

To the author’s knowledge, despite the written documents about Marist spirituality/charism, nothing has been done concerning a clear theoretical framework that highlights Marist spirituality/charism from the perspective or lived experience of the lay Marists of St. Marcellin Champagnat. The participants of the study are lay Marist educators from Marist schools in the Philippines. Since the study would like to find out the respondents’ own concepts of and meanings about Marist spirituality/charism, qualitative methodology is considered the appropriate approach for the study. In particular, the study will use the qualitative methods of Barney Glaser. The theory will be generated systematically from data collection, coding, and analysis through memoing, theoretical sampling, sorting, and writing, using the constant comparative method. The data collection method that will be employed in this grounded theory research is the in-depth interview, which is semi-structured and participant-driven. Data collection will be done through snowball sampling that is purposive. The study intends to come up with a theoretical framework that will help the lay Marists deepen their understanding of the Marist spirituality/charism and their vocation as lay partners of the Marist Brothers of the Schools.

Keywords: grounded theory, Lay Marists, lived experience, Marist spirituality/charism

Procedia PDF Downloads 293
6072 Significant Reduction in Specific CO₂ Emission through Process Optimization at G Blast Furnace, Tata Steel Jamshedpur

Authors: Shoumodip Roy, Ankit Singhania, M. K. G. Choudhury, Santanu Mallick, M. K. Agarwal, R. V. Ramna, Uttam Singh

Abstract:

One of the key corporate goals of Tata Steel is to demonstrate environmental leadership. Decreasing specific CO₂ emission is one of the key steps towards achieving this corporate goal. At any blast furnace, specific CO₂ emission is directly proportional to fuel intake. To reduce the fuel intake at G Blast Furnace, an initial benchmarking exercise was carried out against international and domestic blast furnaces to determine the potential for improvement. The gap identified during the exercise revealed that the benchmark blast furnaces operated with superior raw material quality compared to G Blast Furnace. However, since the raw materials for G Blast Furnace are sourced from captive mines, improvement in raw material quality was out of scope. Therefore, trials were conducted with different operating regimes to identify the key process parameters which, on optimization, could significantly reduce the fuel intake in G Blast Furnace. The key process parameters identified from the trials were the stoichiometric oxygen ratio, the melting capacity ratio, and the burden distribution inside the furnace. These process parameters were optimized to bridge the gap in fuel intake at G Blast Furnace, thereby reducing specific CO₂ emission to benchmark levels. This paradigm shift lowered the fuel intake by 70 kg per ton of liquid iron produced, thereby reducing the specific CO₂ emission by 15 percent.
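
A rough worked check of how the quoted fuel saving maps to a CO₂ saving is sketched below; the fuel carbon content and the baseline emission intensity are assumptions introduced here, not figures from the abstract.

    \Delta m_{\mathrm{CO_2}} \approx 70\ \tfrac{\mathrm{kg\ fuel}}{\mathrm{tHM}} \times 0.85\ \tfrac{\mathrm{kg\ C}}{\mathrm{kg\ fuel}} \times \tfrac{44}{12}\ \tfrac{\mathrm{kg\ CO_2}}{\mathrm{kg\ C}} \approx 218\ \tfrac{\mathrm{kg\ CO_2}}{\mathrm{tHM}}

which is consistent with a reduction of roughly 15 percent only if the baseline specific emission is on the order of 1.4-1.5 t CO₂ per ton of hot metal (an assumed, typical blast furnace figure).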

Keywords: benchmark, blast furnace, CO₂ emission, fuel rate

Procedia PDF Downloads 259