Search results for: global innovation network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10792

6232 The Effect of Catastrophic Losses on Insurance Cycle: Case of Croatia

Authors: Drago Jakovčević, Maja Mihelja Žaja

Abstract:

This paper analyzes the insurance cycle in the Republic of Croatia and whether it is affected by catastrophic losses at the global level. Insurance cycles are generally considered to be particularly pronounced in periods of financial crisis, but they are also affected by the growing number of catastrophic losses. Such losses trigger a change in the insurance cycle: premiums grow and coverage conditions narrow, and because these variables move in the same direction, together they signal a new cycle. The main goal of this paper is to establish whether an insurance cycle exists in the Republic of Croatia and to investigate whether catastrophic losses influence it.

Keywords: catastrophic loss, insurance cycle, premium, Republic of Croatia

Procedia PDF Downloads 339
6231 Traditional Practices of Conserving Biodiversity: A Case Study around Jim Corbett National Park, Uttarakhand, India

Authors: Rana Parween, Rob Marchant

Abstract:

With the continued loss of global biodiversity despite the application of modern conservation techniques, it has become crucial to investigate non-conventional methods. The accelerated destruction of ecosystems due to altered land use, climate change, and cultural and social change necessitates the exploration of society-biodiversity attitudes and links. While the loss and extinction of species is a well-known and well-documented process that attracts much-needed attention from researchers, academics, government and non-governmental organizations, the loss of traditional ecological knowledge and practices is more insidious and goes unnoticed. The growing availability of 'indirect experiences' such as the internet and media is leading to a disaffection towards nature and the 'Extinction of Experience'. Exacerbated by the lack of documentation, there is the possibility of the 'extinction' of traditional practices and skills before they are fully recognized and captured. India, as a mega-biodiverse country, is also known for its historical conservation strategies entwined in traditional beliefs. Indigenous communities hold skillsets, knowledge, and traditions that have accumulated over multiple generations and may play an important role in conserving biodiversity today. This study explores the differences in knowledge of, and attitudes towards, conserving biodiversity among three stakeholder groups living around Jim Corbett National Park, based on their age, traditions, and association with the protected area. A triangulation-designed, multi-strategy investigation collected qualitative and quantitative data through a questionnaire survey of village elders, the general public, and forest officers. Qualitative data were analyzed inductively using thematic content analysis. All coding and analysis were completed using NVivo 11.
Although the village elders and some of the general public had vast amounts of traditional knowledge, most of it related to animal husbandry and the medicinal value of plants. Village elders were unfamiliar with the term 'biodiversity', even though their way of life and attitudes ensured that they cared for the ecosystem without the scientific basis underpinning biodiversity conservation. Village elders were inherently keen to conserve nature; the superimposition of governmental policies without any tangible benefit or consultation was seen as detrimental. Alienating villagers, and consequently the village elders who are the reservoirs of traditional knowledge, would not only damage the social network of the area but would also discard years of tried-and-tested techniques held by the elders. Forest officers advocated biodiversity and conservation education for women and children. Women across all groups, when questioned about nature conservation, showed more interest in learning and participation. Biodiversity not only has ethical and cultural value but also plays a role in ecosystem function and thus provides ecosystem services and supports livelihoods. Therefore, underpinning and using traditional knowledge, and incorporating it into programs of biodiversity conservation, should be explored with a sense of urgency.

Keywords: biological diversity, mega-biodiverse countries, traditional ecological knowledge, society-biodiversity links

Procedia PDF Downloads 90
6230 Exploration of Critical Success Factors in Business and Management in Artificial Intelligence Era

Authors: Najah Kalifah Almazmomi

Abstract:

In the age of artificial intelligence (AI), the determinants of success in business management are taking on a new dimension and need to be understood. This research scrutinizes the Critical Success Factors (CSFs) that drive success, in order to uncover the subtle and profound dynamics that may be operative in organizations. Through a systematic literature review and a number of empirical methods, the paper determines and assesses the key CSFs, emphasizing their role and meaning in the context of AI technology adoption. Central factors such as leadership styles, innovation models, strategic thinking methodologies, organizational culture transformations, and human resource management approaches are examined in light of the AI-driven revolution. Additionally, this research explores the interactive effects of these factors and their joint impact on the success, survival, and flexibility of a business in an environment that is changing with AI development. Using different qualitative and quantitative methodologies, the research concludes that the findings are significant for understanding the relative roles of individual CSFs and the interactions between them in an AI-enabled business environment.

Keywords: critical success factors, business and management, artificial intelligence, leadership strategies

Procedia PDF Downloads 22
6229 Evaluation of the Biological Activity of New Antimicrobial and Biodegradable Textile Materials for Protective Equipment

Authors: Safa Ladhari, Alireza Saidi, Phuong Nguyen-Tri

Abstract:

During health crises such as COVID-19, the use of disposable protective equipment (PEs) (masks, gowns, etc.) causes long-term problems by increasing the volume of hazardous waste that must be handled safely and at great expense. Therefore, producing antimicrobial and reusable textile materials is highly desirable to decrease the use of disposable PEs that must be treated as hazardous waste. In addition, if these items are used regularly in the workplace or in daily activities by the public, they will most likely end up in household waste, and if contaminated they may pose a high risk of contagion to waste collection workers. To protect the whole population in times of sanitary crisis, it is therefore necessary to develop PE materials that are resilient to the demands of daily activities without compromising public health or the environment, and without depending on external technologies and producers. Moreover, the materials frequently used for PEs are plastics of petrochemical origin. The present work aims to replace these petroplastics with a bioplastic, since it offers better biodegradability. The chosen polymer is polyhydroxybutyrate (PHB), a member of the polyhydroxyalkanoate family synthesized by different bacteria. It has properties similar to conventional plastics but is renewable, biocompatible, and has attractive barrier properties compared to other polyesters. These characteristics make it ideal for PE applications. The current research focuses on the preparation and rapid evaluation of the biological activity of nanotechnology-based antimicrobial agents for treating the textile surfaces used in PEs. This work is carried out to provide antibacterial solutions that can be transferred to workplace applications in the fight against short-term biological risks.
Three main objectives are pursued in this research: 1) the development of suitable methods for depositing antibacterial agents on the surface of textiles; 2) the development of a method for measuring the antibacterial activity of the prepared textiles; and 3) the study of the biodegradability of the prepared textiles. The studied textile is a non-woven fabric based on a biodegradable polymer manufactured by electrospinning. Nanofibers are increasingly studied due to their unique characteristics, such as a high surface-to-volume ratio, improved thermal, mechanical, and electrical properties, and confinement effects. The electrospun film is surface-modified by plasma treatment and then loaded with hybrid antibacterial silver and titanium dioxide nanoparticles by dip-coating. This work uses simple methods together with emerging technologies to fabricate nanofibers with a size and morphology suitable for use as components of protective equipment. The antibacterial agents generally used are based on silver, zinc, copper, etc.; however, to our knowledge, few researchers have combined hybrid nanoparticles with biodegradable polymers to ensure antibacterial activity. We will also exploit visible light to improve the antibacterial effectiveness of the fabric, an approach that differs from the traditional contact mode of killing bacteria and represents an innovation in active protective equipment. Finally, this work will enable the development of new antibacterial textile materials through a simple and ecological method.

Keywords: protective equipment, antibacterial textile materials, biodegradable polymer, electrospinning, hybrid antibacterial nanoparticles

Procedia PDF Downloads 60
6228 Application of Natural Language Processing in Education

Authors: Khaled M. Alhawiti

Abstract:

Reading capability is a major component of language competency. However, discovering topical texts at an appropriate level for foreign and second language learners is a challenge for educators. We address this issue using natural language processing technology to assess reading level and simplify text. In the context of foreign and second-language learning, existing measures of reading level are not well suited to this task. Related work has demonstrated the benefit of using statistical language processing techniques; we extend these ideas and incorporate other potential features to measure readability. In the first part of this study, we combine features from statistical language models, traditional reading level measures, and other language processing tools to produce a better method of detecting reading level. We examine the performance of human annotators and evaluate the results of our detectors against human ratings. A key contribution is that our detectors are trainable: with training and test data from the same domain, our detectors outperform more general reading level tools (Flesch-Kincaid and Lexile). Trainability will allow performance to be tuned to address the needs of particular groups or students.
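The Flesch-Kincaid measure used here as a baseline is a simple formula over sentence length and syllable counts. As an illustrative sketch (the syllable counter below is a crude vowel-group heuristic, not the official dictionary-based count), it can be computed as:

```python
import re

def count_syllables(word):
    # crude heuristic: count runs of vowels, at least one syllable per word
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    # FK grade = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)
```

A trainable detector, as the abstract argues, would replace these fixed coefficients with weights fitted to in-domain annotated data.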

Keywords: natural language processing, trainability, syntactic simplification tools, education

Procedia PDF Downloads 474
6227 Sensitivity Analysis of the Heat Exchanger Design in Net Power Oxy-Combustion Cycle for Carbon Capture

Authors: Hirbod Varasteh, Hamidreza Gohari Darabkhani

Abstract:

Global warming and its impact on climate change is one of the main challenges of the current century. Global warming is mainly due to the emission of greenhouse gases (GHG), and carbon dioxide (CO2) is known to be the major contributor to the GHG emission profile. Whilst the energy sector is the primary source of CO2 emissions, Carbon Capture and Storage (CCS) is believed to be the solution for controlling these emissions. Oxyfuel combustion (oxy-combustion) is one of the major technologies for capturing CO2 from power plants. For gas turbines, several oxy-combustion power cycles (oxyturbine cycles) have been investigated by means of thermodynamic analysis. The NetPower cycle is one of the leading oxyturbine power cycles, with almost full carbon capture capability from a natural-gas-fired power plant. In this manuscript, a sensitivity analysis of the heat exchanger design in the NetPower cycle is carried out by means of process modelling. Heat capacity variation and supercritical CO2 with gaseous admixtures are considered in a multi-zone analysis with Aspen Plus software. It is found that the heat exchanger design plays a major role in increasing the efficiency of the NetPower cycle. A pinch-point analysis is performed to extract the composite and grand composite curves for the heat exchanger. The relationship between cycle efficiency and the minimum approach temperature (∆Tmin) of the heat exchanger is also evaluated. An increase in ∆Tmin causes a decrease in the temperature of the recycled flue gases (RFG) and an overall decrease in the power required for the recycled gas compressor. The main challenge in the design of heat exchangers in power plants is the tradeoff between capital and operational costs: achieving a lower ∆Tmin requires a larger heat exchanger, which means a higher capital cost but better heat recovery and a lower operational cost.
To balance these costs, ∆Tmin is selected at the minimum point of the combined capital and operational cost curves. This study provides insight into the performance and operating conditions of the NetPower oxy-combustion cycle, based on its heat exchanger design.
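The role of ∆Tmin can be made concrete with a minimal sketch of locating the minimum approach temperature in a counter-current exchanger. The stream data and constant heat capacities below are hypothetical placeholders, not the actual NetPower stream properties; the enthalpy-interval scan mirrors how composite curves are compared in a pinch analysis:

```python
def min_approach(CP_h, Th_in, Th_out, CP_c, Tc_in, n=101):
    """Minimum approach temperature of a counter-current heat exchanger.

    CP_h, CP_c: heat capacity flow rates (kW/K) of hot and cold streams,
    assumed constant over the temperature range (an idealization).
    """
    Q_total = CP_h * (Th_in - Th_out)      # total duty (kW)
    approaches = []
    for i in range(n):
        Q = Q_total * i / (n - 1)          # cumulative duty from the cold end
        Th = Th_out + Q / CP_h             # hot-side temperature at this duty
        Tc = Tc_in + Q / CP_c              # cold-side temperature (counter-current)
        approaches.append(Th - Tc)
    return min(approaches)
```

With constant heat capacities the profiles are linear and the pinch sits at one end; with the real heat-capacity variation of supercritical CO2 mixtures, the same scan would locate an interior pinch.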

Keywords: carbon capture and storage, oxy-combustion, netpower cycle, oxy turbine cycles, zero emission, heat exchanger design, supercritical carbon dioxide, oxy-fuel power plant, pinch point analysis

Procedia PDF Downloads 192
6226 Conflicts of Interest in the Private Sector and the Significance of the Public Interest Test

Authors: Opemiposi Adegbulu

Abstract:

Conflicts of interest are an elusive, diverse and engaging subject and a cross-cutting problem of governance at all levels, from local to global and from the public sector to the corporate and financial sectors. In all these areas, their mismanagement can distort decision-making processes, corrode trust and weaken administration. According to Professor Peters, an expert in the area, conflict of interest, a problem at the root of many scandals, has "become a pervasive ethical concern in our professional, organisational, and political life". Conflicts of interest corrode trust, and as in the public sector, trust is essential for the market, consumers/clients, shareholders and other stakeholders in the private sector. However, conflicts of interest in the private sector are distinct and must be treated as such when regulatory efforts are made to address them. This research looks at identifying conflicts of interest in the private sector and differentiating them from those in the public sector. The public interest is submitted as a criterion which allows for such differentiation. This is significant because it would allow for the use of tailor-made, sector-specific approaches to addressing this complex issue. The study is conducted through an extensive review of the literature and theories on the definition of conflicts of interest, employing theoretical, doctrinal and comparative methods. The nature of conflicts of interest in the private sector will be explored through an analysis of the public sector, where the notion of conflicts of interest is more clearly identified; reasons why they are of business ethics concern will be advanced; and then, looking at public sector solutions and other solutions, the study will identify ways of mitigating and managing conflicts in the private sector.
An exploration of public sector conflicts of interest and their solutions will be carried out because the typologies of conflicts of interest in both sectors appear very similar at the core, and thus lessons can be learnt for the management of these issues in the private sector. This research will then focus on some specific challenges to understanding and identifying conflicts of interest in the private sector: their origin, diverging theories, the psychological barrier to definition, and similarities with public sector conflicts of interest rooted in the corrosion of trust and in 'being in a particular kind of situation'. The notion of public interest will be submitted as the key element at the heart of the distinction between public sector and private sector conflicts of interest. It will then be proposed that the appreciation of the notion of conflicts of interest differs by sector and from country to country, based on the public interest test, using the United Kingdom (UK), the United States of America (US), France and the Philippines as illustrations.

Keywords: conflicts of interest, corporate governance, global governance, public interest

Procedia PDF Downloads 377
6225 Cleaning of Polycyclic Aromatic Hydrocarbons (PAH) Obtained from Ferroalloys Plant

Authors: Stefan Andersson, Balram Panjwani, Bernd Wittgens, Jan Erik Olsen

Abstract:

Polycyclic aromatic hydrocarbons (PAH) are organic compounds consisting of aromatic rings of only carbon and hydrogen. PAH are neutral, non-polar molecules produced by the incomplete combustion of organic matter. These compounds are carcinogenic and interact with biological nucleophiles to inhibit the normal metabolic functions of cells. In Norway, the most important sources of PAH pollution are considered to be aluminum plants, the metallurgical industry, offshore oil activity, transport, and wood burning. Stricter governmental regulations regarding emissions to the external and internal environment, combined with increased awareness of the potential health effects, have motivated Norwegian metal industries to increase their efforts to reduce emissions considerably. One of the objectives of the ongoing "SCORE" project, supported by industry and the Norwegian Research Council, is to reduce potential PAH emissions from the off-gas stream of a ferroalloy furnace through controlled combustion in a dedicated combustion chamber. The sizing and configuration of the combustion chamber depend on the combined properties of the bulk gas stream and the properties of the PAH itself. In order to achieve efficient and complete combustion, the residence time and minimum temperature need to be optimized. For this design approach, reliable kinetic data on the individual PAH species and/or groups thereof are necessary. However, kinetic data on the combustion of PAH are difficult to obtain, and there is only a limited number of studies. This paper presents an evaluation of kinetic data for some of the PAH obtained from the literature. In the present study, the oxidation is modelled both for pure PAH and for PAH mixed with process gas. Using a perfectly stirred reactor modelling approach, the oxidation is modelled with advanced reaction kinetics to study the influence of residence time and temperature on the conversion of PAH to CO2 and water.
A Chemical Reactor Network (CRN) approach is developed to understand the oxidation of PAH inside the combustion chamber. Chemical reactor network modeling has been found to be a valuable tool in the evaluation of oxidation behavior of PAH under various conditions.
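The dependence of conversion on residence time and temperature in a perfectly stirred reactor can be illustrated with a first-order sketch. The Arrhenius parameters below are placeholders for illustration, not fitted PAH kinetics, and real PAH oxidation involves detailed multi-step chemistry:

```python
import math

def psr_conversion(A, Ea, T, tau):
    """Steady-state conversion in a perfectly stirred reactor (PSR)
    for a single first-order reaction: X = k*tau / (1 + k*tau).

    A: pre-exponential factor (1/s), Ea: activation energy (J/mol),
    T: temperature (K), tau: residence time (s).
    """
    R = 8.314                      # gas constant, J/(mol K)
    k = A * math.exp(-Ea / (R * T))  # Arrhenius rate constant
    return k * tau / (1 + k * tau)
```

Even this idealized model reproduces the qualitative design conclusion: conversion rises monotonically with both temperature and residence time, which is the tradeoff a CRN model resolves quantitatively.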

Keywords: PAH, PSR, energy recovery, ferro alloy furnace

Procedia PDF Downloads 258
6224 A Social Network Analysis for Formulating Construction Defect Generation Mechanisms

Authors: Hamad Aljassmi, Sangwon Han

Abstract:

Various solutions for preventing construction defects have been suggested. However, a construction company may have difficulty adopting all of them due to financial and practical constraints. Based on this recognition, this paper aims to identify the most significant defect causes and to formulate their defect generation mechanism in order to help a construction company set priorities among its defect prevention strategies. To this end, we conducted a questionnaire survey of 106 industry professionals and identified the five most significant causes: (1) organizational culture, (2) time pressure and constraints, (3) workplace quality system, (4) financial constraints upon operational expenses and (5) inadequate employee training or learning opportunities.

Keywords: defect, quality, failure, risk

Procedia PDF Downloads 607
6223 A Study on Cleaning Mirror Technology with Reduced Water Consumption in a Solar Thermal Power Plant

Authors: Bayarjargal Enkhtaivan, Gao Wei, Zhang Yanping, He Guo Qiang

Abstract:

In our study, traditional mirror cleaning technology with reduced water consumption in solar thermal power plants is investigated. Over the last decade, developed countries have seen significant growth and innovation in the solar thermal power sector. These power plants require high water consumption, yet they are often built and operated in severely drought-afflicted areas such as deserts, where water is scarce but solar energy is abundant. The most important contribution of this study is the design of new experimental equipment, which can take several types of measurements at the same time. In this study, glasses were placed for 10 and 20 days at fixed positions so that dust would deposit on their surfaces, following a common method. The dust deposited on each glass was then washed off with the experimental equipment and the deposition measured. Finally, the experimental results were analyzed and conclusions drawn.

Keywords: concentrated solar power (CSP) plant, high-pressure water, test equipment of clean mirror, cleaning technology of glass and mirror

Procedia PDF Downloads 161
6222 Satellite Multispectral Remote Sensing of Ozone Pollution

Authors: Juan Cuesta

Abstract:

Satellite observation is a fundamental component of air pollution monitoring systems, such as the large-scale Copernicus Programme. Next-generation satellite sensors, in orbit or planned, offer great potential to observe major air pollutants, such as tropospheric ozone, with unprecedented spatial and temporal coverage. However, the satellite approaches developed so far for remote sensing of tropospheric ozone are based solely on measurements from a single instrument in a specific spectral range, either thermal infrared or ultraviolet. These methods are only sensitive to tropospheric ozone down to 3 or 4 km above the surface, which limits their applications for ozone pollution analysis. Indeed, no current observation of a single spectral domain provides enough information to accurately measure ozone in the atmospheric boundary layer. To overcome this limitation, we have developed a multispectral synergism approach, called "IASI+GOME2", at the Laboratoire Interuniversitaire des Systèmes Atmosphériques (LISA). This method is based on the synergy of thermal infrared and ultraviolet observations from, respectively, the Infrared Atmospheric Sounding Interferometer (IASI) and the Global Ozone Monitoring Experiment-2 (GOME-2) sensors flown on the MetOp satellites, in orbit since 2007. IASI+GOME2 enabled the first satellite observation of ozone plumes located between the surface and 3 km altitude (what we call the lowermost troposphere), as it offers significant sensitivity in this layer. This represents a major advance for the observation of ozone in the lowermost troposphere and its application to air quality analysis. The ozone abundance derived by IASI+GOME2 shows good agreement with independent ozone-sonde observations around the world during all seasons (a low mean bias, a linear correlation larger than 0.8 and a mean precision of about 16%).
Using IASI+GOME2, lowermost tropospheric ozone pollution plumes are quantified both in terms of concentrations and in terms of the amount of ozone photochemically produced during transport, enabling the characterization of ozone pollution events such as those that occurred during the COVID-19 lockdowns. This paper will present the IASI+GOME2 multispectral approach for observing the lowermost tropospheric ozone from space, along with an overview of several applications on different continents and at the global scale.

Keywords: ozone pollution, multispectral synergism, satellite, air quality

Procedia PDF Downloads 65
6221 Survey: Topology Hiding in Multipath Routing Protocol in MANET

Authors: Akshay Suhas Phalke, Manohar S. Chaudhari

Abstract:

In this paper, we discuss multipath routing and its variants. Our purpose is to review the different types of multipath routing mechanisms, and we also present a taxonomy of multipath routing. Multipath routing is used for alternate-path routing, reliable data transmission, and better utilization of network resources. We also discuss multipath routing for topology hiding, such as TOHIP. In multipath routing, parameters such as energy efficiency, packet delivery ratio, shortest-path routing, and fault tolerance play an important role. Finally, we discuss a number of multipath routing protocols with respect to these parameters.

Keywords: multi-path routing, WSN, topology, fault detection, trust

Procedia PDF Downloads 334
6220 A Local Tensor Clustering Algorithm to Annotate Uncharacterized Genes with Many Biological Networks

Authors: Paul Shize Li, Frank Alber

Abstract:

A fundamental task of clinical genomics is to unravel the functions of genes and their associations with disorders. Although experimental biology has made efforts to discover and elucidate the molecular mechanisms of individual genes over the past decades, still about 40% of human genes have unknown functions, not to mention the diseases they may be related to. For biologists who are interested in a particular gene with unknown functions, a powerful computational method tailored for inferring the functions and disease relevance of uncharacterized genes is strongly needed. Studies have shown that genes strongly linked to each other in multiple biological networks are more likely to have similar functions. This indicates that the densely connected subgraphs in multiple biological networks are useful for the functional and phenotypic annotation of uncharacterized genes. Therefore, in this work, we have developed an integrative network approach to identify frequent local clusters, which are defined as densely connected subgraphs that frequently occur in multiple biological networks and contain the query gene that has few or no disease or function annotations. This is a local clustering algorithm that models multiple biological networks sharing the same gene set as a three-dimensional matrix, the so-called tensor, and employs a tensor-based optimization method to efficiently find the frequent local clusters. Specifically, massive public gene expression data sets that comprehensively cover dynamic, physiological, and environmental conditions are used to generate hundreds of gene co-expression networks. By integrating these gene co-expression networks, for a given uncharacterized gene that is of interest to a biologist, the proposed method can be applied to identify the frequent local clusters containing this uncharacterized gene. Finally, those frequent local clusters are used for the function and disease annotation of this uncharacterized gene.
This local tensor clustering algorithm outperformed the competing tensor-based algorithm in both module discovery and running time. We also demonstrated the use of the proposed method on real data comprising hundreds of gene co-expression networks and showed that it can comprehensively characterize the query gene. Therefore, this study provides a new tool for annotating uncharacterized genes and has great potential to assist clinical genomic diagnostics.
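The tensor view of stacked networks can be sketched as follows. This is an illustrative density-frequency score for a candidate cluster around a query gene, not the authors' optimization method; the density threshold and the unweighted adjacency representation are assumptions for the sketch:

```python
import numpy as np

def cluster_frequency(tensor, members, density_thresh=0.5):
    """Fraction of networks in which a candidate gene cluster is dense.

    tensor: array of shape (n_networks, n_genes, n_genes), the stacked
    adjacency matrices of networks sharing one gene set.
    members: indices of the candidate cluster (including the query gene).
    """
    sub = tensor[:, members][:, :, members]   # the subgraph in every network
    k = len(members)
    possible = k * (k - 1)                    # ordered pairs, no self-loops
    # edge density per network, ignoring the diagonal
    densities = (sub.sum(axis=(1, 2))
                 - np.trace(sub, axis1=1, axis2=2)) / possible
    return (densities >= density_thresh).mean()
```

A "frequent local cluster" in the paper's sense would be one whose frequency under some such score is high across the hundreds of co-expression networks.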

Keywords: local tensor clustering, query gene, gene co-expression network, gene annotation

Procedia PDF Downloads 140
6219 Transformation of Industrial Policy towards Industry 4.0 and Its Impact on Firms' Competition

Authors: Arūnas Burinskas

Abstract:

Europe is on the threshold of a new industrial revolution called Industry 4.0. Many believe that it will increase the flexibility of production, the mass adaptation of products to consumers and the speed of service, and that it will also improve product quality and dramatically increase productivity. However, the benefits of Industry 4.0 come with inevitable changes and challenges. One of them is the transformation of current competition and business models. This article examines the possible outcomes of the conversion from the classic Bertrand and Cournot models of competition to a qualitatively new competition based on innovation. The ability to deliver a new product quickly, and the possibility of producing individual designs (through flexible and quickly configurable factories) while reducing equipment failures and increasing process automation and control, become highly important. This study shows that the ongoing transformation of the competition model is changing the game. This, together with the creation of complex value networks, demands huge investments, which makes the transition particularly difficult for small and medium-sized enterprises. In addition, the ongoing digitalization of data raises new concerns regarding legal obligations, intellectual property, and security.
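For reference, the classic Cournot benchmark that the abstract contrasts with innovation-based competition has a closed-form solution. A sketch for a symmetric duopoly with linear demand P = a - b(q1 + q2) and common marginal cost c (the textbook setup, not a model from this article):

```python
def cournot_duopoly(a, b, c):
    """Symmetric Nash equilibrium of a Cournot duopoly.

    Inverse demand P = a - b*(q1 + q2); both firms have marginal cost c.
    Each firm's best response is q_i = (a - c - b*q_j) / (2b); solving
    the symmetric fixed point gives q* = (a - c) / (3b).
    """
    q = (a - c) / (3 * b)        # each firm's equilibrium quantity
    p = a - b * 2 * q            # resulting market price
    profit = (p - c) * q         # per-firm equilibrium profit
    return q, p, profit
```

Under innovation-based competition, by contrast, firms compete on product characteristics and time-to-market rather than on quantity or price alone, so no such static closed form applies.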

Keywords: Bertrand and Cournot Competition, competition model, industry 4.0, industrial organisation, monopolistic competition

Procedia PDF Downloads 126
6218 Collaborative and Experimental Cultures in Virtual Reality Journalism: From the Perspective of Content Creators

Authors: Radwa Mabrook

Abstract:

Virtual Reality (VR) content creation is a complex and expensive process, which requires multi-disciplinary teams of content creators. Grant schemes from technology companies help media organisations to explore the potential of VR in journalism and factual storytelling. Media organisations try to do as much as they can in-house, but they may outsource due to time constraints and skill availability. Journalists, game developers, sound designers and creative artists work together and bring in new cultures of work. This study explores the collaborative, experimental nature of VR content creation by tracing every actor involved in the process and examining their perceptions of VR work. The study builds on Actor Network Theory (ANT), which decomposes phenomena into their basic elements and traces the interrelations among them. The researcher conducted 22 semi-structured interviews with VR content creators between November 2017 and April 2018. Purposive and snowball sampling techniques allowed the researcher to recruit fact-based VR content creators from production studios and media organisations, as well as freelancers. Interviews lasted up to three hours and were a mix of Skype calls and in-person interviews. Participants consented to their interviews being recorded and to their names being revealed in the study. The researcher coded the interview transcripts in Nvivo software, looking for key themes corresponding to the research questions. The study revealed that VR content creators must be adaptive to change, open to learning and comfortable with mistakes. The VR content creation process is highly iterative because VR has no established workflow or visual grammar. Multi-disciplinary VR team members often speak different languages, making communication hard. However, adaptive content creators perceive VR work as a fun experience and an opportunity to learn.
The traditional sense of competition and the drive for information exclusivity are now replaced by a strong drive for knowledge sharing. VR content creators are open to sharing their methods of work and their experiences. They aim to build a collaborative network that harnesses VR technology for journalism and factual storytelling. Indeed, VR is instilling collaborative and experimental cultures in journalism.

Keywords: collaborative culture, content creation, experimental culture, virtual reality

Procedia PDF Downloads 112
6217 Extracting Attributes for Twitter Hashtag Communities

Authors: Ashwaq Alsulami, Jianhua Shao

Abstract:

Various organisations often need to understand discussions on social media, such as which topics are trending and the characteristics of the people engaged in the discussion. A number of approaches have been proposed to extract attributes that characterise a discussion group. However, these approaches are largely based on supervised learning, and as such require a large amount of labelled data. In this paper, we propose an approach that does not require labelled data but instead relies on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.
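The lexicon-based idea can be sketched in a few lines; the attribute lexicon, tweets, and `min_share` threshold below are invented for illustration and are not the paper's actual method or data:

```python
# Sketch: unsupervised attribute detection for a hashtag community using a
# hand-built lexicon. All lexicon entries and tweets are illustrative.
from collections import Counter

LEXICON = {
    "sport":    {"match", "team", "goal", "league", "coach"},
    "politics": {"vote", "election", "policy", "minister", "party"},
    "music":    {"album", "concert", "band", "song", "tour"},
}

def detect_attributes(tweets, min_share=0.2):
    """Return attributes whose lexicon words appear in at least
    `min_share` of the group's tweets."""
    hits = Counter()
    for text in tweets:
        tokens = set(text.lower().split())
        for attr, words in LEXICON.items():
            if tokens & words:
                hits[attr] += 1
    return sorted(a for a, c in hits.items() if c / len(tweets) >= min_share)

tweets = [
    "great goal by the team tonight #derby",
    "the coach changed the whole match",
    "who will win the league this year",
    "new album dropping before the tour",
]
print(detect_attributes(tweets))  # -> ['music', 'sport']
```

A real system would replace the toy lexicon with curated lexical sources (e.g. WordNet-style resources) and proper tokenisation.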

Keywords: attributed community, attribute detection, community, social network

Procedia PDF Downloads 144
6216 A Mathematical Model for Hepatitis B Virus Infection and the Impact of Vaccination on Its Dynamics

Authors: T. G. Kassem, A. K. Adunchezor, J. P. Chollom

Abstract:

This paper describes a mathematical model developed to predict the dynamics of Hepatitis B virus (HBV) infection and to evaluate the potential impact of vaccination and treatment on those dynamics. We use a compartmental model, expressed as a set of differential equations, based on the characteristics of HBV transmission. From the model, we derive the threshold quantity R0 and establish the local asymptotic stability of the disease-free and endemic equilibria. Furthermore, we establish the global stability of the disease-free and endemic equilibria.
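The abstract does not give the model's compartments, so the sketch below uses a generic SVIR-type formulation with vaccination; all parameter values are illustrative, not the paper's. It computes R0 at the disease-free equilibrium and integrates the system with an explicit Euler scheme:

```python
# Sketch: SVIR-type HBV model with vaccination (illustrative parameters,
# not the paper's model). S: susceptible, V: vaccinated, I: infected,
# R: recovered; all rates per year, population normalized to 1.
mu, beta, gamma, phi = 0.012, 0.9, 0.2, 0.3  # birth/death, transmission, recovery, vaccination

def r0(beta, gamma, mu, phi):
    # At the disease-free equilibrium only the unvaccinated susceptible
    # fraction S* = mu/(mu + phi) can be infected.
    s_star = mu / (mu + phi)
    return beta * s_star / (gamma + mu)

def simulate(years=50, dt=0.01):
    S, V, I, R = 0.9, 0.0, 0.1, 0.0
    for _ in range(int(years / dt)):   # explicit Euler integration
        dS = mu - beta * S * I - (mu + phi) * S
        dV = phi * S - mu * V
        dI = beta * S * I - (gamma + mu) * I
        dR = gamma * I - mu * R
        S, V, I, R = S + dS * dt, V + dV * dt, I + dI * dt, R + dR * dt
    return S, V, I, R

print("R0 =", round(r0(beta, gamma, mu, phi), 3))
S, V, I, R = simulate()
print("infected fraction after 50 years:", round(I, 4))
```

With this vaccination rate R0 falls below one, so the infected fraction decays toward the disease-free equilibrium, matching the stability analysis the abstract describes.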

Keywords: hepatitis B virus, epidemiology, vaccination, mathematical model

Procedia PDF Downloads 306
6215 Transformative Pedagogy and Online Adult Education

Authors: Glenn A. Palmer, Lorenzo Bowman, Juanita Johnson-Bailey

Abstract:

The ubiquitous economic upheaval that has gripped the global environment in the past few years has displaced many workers into unemployment or underemployment. Globally, this disruption has driven many adult workers to seek additional education or skills in order to remain competitive and to gain the ability, and the options, to find gainful employment. While many learners have availed themselves of retraining and retooling opportunities at locations within their communities, others have explored those options through the online learning environment. This paper examines the empirical research on strategies used in the adult online learning community that could also foster transformative learning.

Keywords: online learning, transformational learning, adult education, economic crisis, unemployment

Procedia PDF Downloads 451
6214 Minimizing Fresh and Wastewater Using Water Pinch Technique in Petrochemical Industries

Authors: Wasif Mughees, Malik Al-Ahmad, Muhammad Naeem

Abstract:

This research involves the design and analysis of pinch-based water/wastewater networks to minimize water use in the petrochemical and petroleum industries. A case study of the Tehran Oil Refinery was carried out to analyze the feasibility of regeneration, reuse, and recycling within the water network, with COD considered as the single key contaminant. Freshwater consumption was reduced by about 149 m3/h (43.8%) with respect to COD. A re-design (retrofit) of water allocation in the networks was undertaken. The results were analyzed through the graphical method and a mathematical programming technique, which clearly demonstrated that the amount of required water is determined by the mass transfer of COD.
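For a single contaminant such as COD, the minimum-freshwater target can be read off the limiting composite curve: the pinch is the concentration level where cumulative mass load divided by concentration is largest. The operation data below are invented for illustration and are not the Tehran refinery figures:

```python
# Sketch: minimum-freshwater targeting (water pinch, single contaminant).
# Each operation: (COD mass load picked up in kg/h, max inlet ppm, max outlet ppm).
# All numbers are illustrative.
operations = [
    (2.0,   0, 100),
    (5.0,  50, 100),
    (30.0, 50, 800),
    (4.0, 400, 800),
]

def min_freshwater(ops):
    """Freshwater target (m^3/h): maximize cumulative load / concentration
    over the candidate pinch concentrations."""
    levels = sorted({c for _, cin, cout in ops for c in (cin, cout)})
    target = 0.0
    for level in levels:
        if level == 0:
            continue
        load = 0.0  # mass load that must be picked up below this level
        for m, cin, cout in ops:
            lo, hi = max(cin, 0), min(cout, level)
            if hi > lo:
                load += m * (hi - lo) / (cout - cin)
        # kg/h * 1000 g/kg divided by ppm (= g/m^3) gives m^3/h
        target = max(target, load * 1000.0 / level)
    return target

print("minimum freshwater:", round(min_freshwater(operations), 1), "m3/h")
```

For this toy network the pinch sits at 100 ppm and the target is 90 m3/h; a retrofit study like the paper's would then re-allocate reuse and regeneration streams to approach that target.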

Keywords: minimization, water pinch, water management, pollution prevention

Procedia PDF Downloads 431
6213 Production Optimization under Geological Uncertainty Using Distance-Based Clustering

Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe

Abstract:

It is important to characterize reservoir properties for better production management. Because available information is limited, geological uncertainty is high in very heterogeneous or channelized reservoirs. One solution is to generate multiple equi-probable realizations using geostatistical methods. However, some models have unrealistic properties and need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme, based on distance-based clustering, for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data. We calculate the Hausdorff distance to classify the models by similarity; the Hausdorff distance is well suited to shape matching of reservoir models. We use multi-dimensional scaling (MDS) to place the models in a two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and identify the best model, whose production rates are closest to the true values. From this process, we can select good reservoir models near the best model with high confidence. We generate 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas preferentially flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating Hausdorff distances and projecting the models by MDS, we can see that the models group according to their channel patterns. These channel distributions affect the operation controls of each production well, so the model selection scheme improves the management optimization process. We use particle swarm optimization (PSO), a widely used global search algorithm, for production optimization. PSO is good at finding the global optimum of an objective function, but it is time-consuming because it uses many particles and iterations. 
Moreover, if multiple reservoir models are used, the simulation time for PSO soars. By using the proposed method, we can select good and reliable models that already match production data. Considering the geological uncertainty of the reservoir, we can obtain well-optimized production controls for maximum net present value. The proposed method offers a novel way to select good cases among the many equi-probable realizations. The model selection scheme can be applied not only to production optimization but also to history matching and other ensemble-based methods for efficient simulation.
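A minimal sketch of the selection pipeline (symmetric Hausdorff distances, classical MDS, then k-means) using synthetic point-set "models" as stand-ins for SNESIM channel maps; everything below is illustrative, not the study's data:

```python
# Sketch: Hausdorff distance -> classical MDS -> k-means model selection.
# The 2-D point-set "models" are synthetic stand-ins for channel facies maps.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)
# Two families of models: channels near y = 0 versus channels near y = 5.
models = [np.column_stack([np.linspace(0, 10, 30),
                           rng.normal(0.0 if k < 5 else 5.0, 0.2, 30)])
          for k in range(10)]

n = len(models)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d = max(directed_hausdorff(models[i], models[j])[0],
                directed_hausdorff(models[j], models[i])[0])
        D[i, j] = D[j, i] = d

# Classical MDS: double-center the squared distances, then eigendecompose.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
w, V = np.linalg.eigh(B)
X = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))   # 2-D embedding

def kmeans(X, k=2, iters=20):
    # Deterministic init: first centre = point 0, second = farthest from it.
    c = X[[0, int(np.argmax(np.linalg.norm(X - X[0], axis=1)))]]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - c[None], axis=2), axis=1)
        c = np.array([X[labels == i].mean(axis=0) for i in range(k)])
    return labels

labels = kmeans(X)
print(labels)  # models 0-4 and 5-9 should fall into separate clusters
```

One representative per cluster would then be simulated, and PSO run only on the cluster whose representative best matches observed production rates.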

Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization

Procedia PDF Downloads 130
6212 Mathematical Modelling and Numerical Simulation of Maisotsenko Cycle

Authors: Rasikh Tariq, Fatima Z. Benarab

Abstract:

Evaporative coolers can at best approach the wet-bulb temperature of the intake air, which is not enough to handle a large cooling load; therefore, they are not a feasible option for meeting a building's cooling requirement. The invention of the Maisotsenko (M) cycle has enabled evaporative cooling technology to reach sub-wet-bulb temperatures of the intake air, bringing an innovation to evaporative cooling techniques. In this work, we developed a mathematical model of a Maisotsenko-based air cooler by applying energy and mass balance laws to the different air channels. The governing ordinary differential equations are discretized and simulated in MATLAB. The temperature and humidity plots are shown in the simulation results. A parametric study is conducted by varying the working air inlet conditions (temperature and humidity), inlet air velocity, geometric parameters and water temperature, and the influence of these parameters on the cooling effectiveness of the HMX is reported. Results show that the effectiveness of the M-cycle increases with increasing ambient temperature and decreasing absolute humidity. An air velocity of 0.5 m/s and a channel height of 6-8 mm are recommended.
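The full coupled energy and mass balances are not reproduced here; the sketch below integrates only a highly simplified dry-channel energy balance, in which the product air is cooled along the channel toward an assumed wet-surface temperature below the inlet wet bulb. Geometry and coefficients are illustrative assumptions:

```python
# Sketch: simplified dry-channel energy balance of an M-cycle cooler,
#   dT/dx = -(h*P)/(m_dot*cp) * (T - T_ws)
# marched with explicit Euler and checked against the analytic solution.
# All values are illustrative, not the paper's.
import math

h, P = 35.0, 0.4          # convective coefficient (W/m^2K), channel perimeter (m)
m_dot, cp = 0.01, 1006.0  # air mass flow (kg/s), specific heat (J/kgK)
T_in, T_ws = 40.0, 18.0   # inlet air and assumed wet-surface temperature (C)
L, N = 1.0, 1000          # channel length (m), integration steps

def outlet_temperature():
    dx = L / N
    T = T_in
    for _ in range(N):    # Euler march along the channel
        T += -(h * P / (m_dot * cp)) * (T - T_ws) * dx
    return T

T_out = outlet_temperature()
# Analytic solution of the same linear ODE, for comparison:
T_exact = T_ws + (T_in - T_ws) * math.exp(-h * P * L / (m_dot * cp))
print(round(T_out, 2), round(T_exact, 2))
```

The real M-cycle model also tracks humidity and the wet-channel water film, so the "wet-surface temperature" is itself a solved variable rather than a constant as assumed here.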

Keywords: HMX, maisotsenko cycle, mathematical modeling, numerical simulation, parametric study

Procedia PDF Downloads 135
6211 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)

Authors: Longqing Li

Abstract:

The paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal distribution or a Student's t distribution. Specifically, the paper focuses on the one-day-ahead VaR of the daily returns of major stock markets in the US, UK, China and Hong Kong over the most recent ten years, at the 95% confidence level. To improve predictive power and search for the best-performing model, the paper proposes two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The main contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on the differences between them in estimating one-day-ahead VaR. Second, to account for the non-normality in the distribution of financial returns, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic back-testing procedure is employed to assess the performance of each model, the GARCH family and the conditional EVT. The conclusion is that Exponential GARCH yields the best out-of-sample one-day-ahead VaR forecasts. Moreover, the performance difference between the GARCH models and the conditional EVT is negligible.
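The GARCH fitting and dynamic back-testing steps are omitted here; the sketch below illustrates only the EVT building block: fitting a generalized Pareto distribution (GPD) to loss exceedances over a threshold and inverting the peaks-over-threshold tail formula for a 95% VaR. The returns are simulated heavy-tailed noise, not market data:

```python
# Sketch: peaks-over-threshold EVT estimate of 95% VaR on simulated
# heavy-tailed returns (illustrative data, not the paper's markets).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
returns = rng.standard_t(df=4, size=5000) * 0.01   # heavy-tailed daily returns
losses = -returns

def evt_var(losses, q=0.95, u_quantile=0.90):
    """POT estimator: fit a GPD to exceedances over threshold u, then
    VaR_q = u + (sigma/xi) * (((n/n_u) * (1-q))**(-xi) - 1)."""
    u = np.quantile(losses, u_quantile)
    excess = losses[losses > u] - u
    xi, _, sigma = genpareto.fit(excess, floc=0)   # location fixed at 0
    n, n_u = len(losses), len(excess)
    return u + sigma / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)

var95 = evt_var(losses)
emp95 = np.quantile(losses, 0.95)
print(f"EVT VaR(95%): {var95:.4f}, empirical 95% loss quantile: {emp95:.4f}")
```

The conditional EVT of the paper first filters returns through a GARCH model and applies this tail fit to the standardized residuals, which adapts the VaR to current volatility.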

Keywords: value-at-risk, extreme value theory, conditional EVT, backtesting

Procedia PDF Downloads 306
6210 Application of Fuzzy Clustering on Classification Agile Supply Chain Firms

Authors: Hamidreza Fallah Lajimi, Elham Karami, Alireza Arab, Fatemeh Alinasab

Abstract:

Being responsive is an increasingly important capability for firms in today’s global economy; thus firms must be agile. Naturally, an organization’s agility depends on its supply chain being agile. However, achieving supply chain agility is a function of other capabilities within the organization. This paper analyses results from a survey of 71 Iranian manufacturing companies in order to identify some of the factors that make organizations agile in managing their supply chains. We then classify these companies into four clusters using the fuzzy c-means technique, with four validation functions used to determine the optimal number of clusters automatically.
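A minimal sketch of fuzzy c-means (fuzzifier m = 2) on synthetic two-feature "firm" data; the data, features, and the choice of four clusters are illustrative stand-ins for the survey results:

```python
# Sketch: fuzzy c-means on synthetic two-feature "firm" data clustered
# around four agility profiles. Data and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
centers_true = np.array([[1, 1], [1, 9], [9, 1], [9, 9]], dtype=float)
X = np.vstack([c + rng.normal(0, 0.5, (10, 2)) for c in centers_true])

def fuzzy_cmeans(X, c=4, m=2.0, iters=100):
    U = rng.dirichlet(np.ones(c), size=len(X))   # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted centres
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))                  # standard FCM update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

centers, U = fuzzy_cmeans(X)
print(np.round(centers, 1))   # one fitted centre near each true profile
```

In the paper, cluster validity functionals (rather than a fixed c = 4) decide the number of clusters; here the count is assumed for illustration.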

Keywords: agile supply chain, clustering, fuzzy clustering, business engineering

Procedia PDF Downloads 689
6209 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method

Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek

Abstract:

Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., the Kolmogorov scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases on the order of billions of solution points. Running such large simulations requires a considerable amount of RAM; therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially on a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It extracts from the mesh file the minimum amount of information that the DSEM code requires to start in parallel and stores it in files (pre-files). It packs integer-type information in a stream binary format so that the pre-files are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, such that each MPI rank acquires its information from the file in parallel. 
In the case of GPFS, on each computational node a single MPI rank reads data from the file generated specifically for that node and sends them to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and messages do not cross the cluster's switches. The read subroutine has been tested on Argonne National Laboratory's Mira (GPFS), the National Center for Supercomputing Applications' Blue Waters (Lustre), the San Diego Supercomputer Center's Comet (Lustre), and UIC's Extreme (Lustre). The tests showed that one file per node is suited to GPFS, while parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for calculating the solution at every time step. For this, the code can make use of its own matrix math library, BLAS, Intel MKL, or ATLAS. This, together with the discontinuous nature of the method, makes the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed scalable and efficient parallel performance of the code.
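The pre-file idea can be illustrated with a minimal writer and per-rank reader; the record layout below (little-endian counts followed by flat integer arrays) is invented for illustration and is not the DSEM format:

```python
# Sketch: packing per-rank startup metadata into a portable binary
# "pre-file" of little-endian integers, then reading one rank's record.
# The layout (header count, then per-rank count + IDs) is illustrative only.
import struct, io

def write_prefile(buf, partition):
    """partition: list of per-rank element-ID lists."""
    buf.write(struct.pack("<i", len(partition)))          # number of ranks
    for elems in partition:
        buf.write(struct.pack("<i", len(elems)))          # count, then IDs
        buf.write(struct.pack(f"<{len(elems)}i", *elems))

def read_rank(buf, rank):
    """Scan forward to `rank`'s record and return its element IDs."""
    buf.seek(0)
    (n_ranks,) = struct.unpack("<i", buf.read(4))
    assert rank < n_ranks
    for r in range(rank + 1):
        (count,) = struct.unpack("<i", buf.read(4))
        data = buf.read(4 * count)
        if r == rank:
            return list(struct.unpack(f"<{count}i", data))

buf = io.BytesIO()
write_prefile(buf, [[0, 1, 2], [3, 4], [5, 6, 7, 8]])
print(read_rank(buf, 2))  # -> [5, 6, 7, 8]
```

In the actual code each rank would seek directly to a precomputed offset (or use MPI I/O) instead of scanning, but the fixed-endianness packing is what makes the files portable between machines.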

Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow

Procedia PDF Downloads 122
6208 A Preliminary Study of the Subcontractor Evaluation System for the International Construction Market

Authors: Hochan Seok, Woosik Jang, Seung-Heon Han

Abstract:

The stagnant global construction market has intensified competition since 2008 among firms that aim to win overseas contracts. Against this backdrop, subcontractor selection is identified as one of the most critical success factors in overseas construction projects. However, it is difficult to select qualified subcontractors due to the lack of evaluation standards and reliability. This study aims to identify the problems associated with existing subcontractor evaluations using a correlation analysis and a multiple regression analysis with pre-qualification and performance evaluation data from 121 firms in six countries.
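The correlation and multiple regression analyses can be sketched as follows; the firm-level variables and their relationships below are synthetic stand-ins for the 121-firm dataset, chosen only to show the mechanics:

```python
# Sketch: correlation screen plus OLS linking pre-qualification scores to
# later performance. All firm data are synthetic, not the study's sample.
import numpy as np

rng = np.random.default_rng(7)
n = 121
finance = rng.normal(70, 10, n)        # pre-qualification criteria
experience = rng.normal(60, 15, n)
noise_crit = rng.normal(50, 10, n)     # a criterion unrelated to outcome
performance = 0.5 * finance + 0.3 * experience + rng.normal(0, 5, n)

X = np.column_stack([finance, experience, noise_crit])
corr = np.corrcoef(np.column_stack([X, performance]), rowvar=False)[-1, :-1]
print("correlations with performance:", np.round(corr, 2))

# OLS with an intercept via least squares: beta = argmin ||A beta - y||^2.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, performance, rcond=None)
print("coefficients (intercept, finance, experience, noise):",
      np.round(beta, 2))
```

A weak correlation or near-zero coefficient, as for the third criterion here, is the kind of evidence the study uses to flag evaluation criteria that do not actually predict subcontractor performance.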

Keywords: subcontractor evaluation system, pre-qualification, performance evaluation, correlation analysis, multiple regression analysis

Procedia PDF Downloads 352
6207 An Artificial Neural Network Model Based Study of Seismic Wave

Authors: Hemant Kumar, Nilendu Das

Abstract:

An ANN-based study provides information for predicting the magnitude of future seismic events from the realization of past events. ANN models, IMD (Indian Meteorological Department) data and remote sensing were used to derive a number of parameters for estimating the magnitude of events that may occur in the future. A threshold was selected above the high-frequency content recorded in the area during the selected seismic activity. For human populations and local biodiversity, the challenge remains to obtain the right parameters relative to the frequency of impact. The study's working assumption is that predicting seismic activity is a difficult process, though the parameters involved can be analyzed and refined through further research activity.

Keywords: ANN, Bayesian class, earthquakes, IMD

Procedia PDF Downloads 111
6206 Eco-Friendly Electricity Production from the Waste Heat of Air Conditioners

Authors: Anvesh Rajak

Abstract:

This paper presents an innovation that uses the waste heat of air conditioners to produce electricity with a Stirling engine, since this waste heat otherwise creates thermal pollution in the environment. The waste heat from air conditioners has caused a temperature rise of 1°–2°C or more on weekdays in Tokyo office areas, promoting the heat-island phenomenon in Tokyo on weekdays. Air conditioners thus create thermal pollution and raise the temperature of the environment, generally emitting waste air at about 50°C that heats the surroundings. Today the demand for energy is increasing tremendously, but supply is lacking; hence there is no option but proper and efficient utilization and conservation of energy. In this paper, the main emphasis is on energy conservation through the technique of utilizing waste heat from the air-conditioning system. The focus is on the use of the waste heat rather than on improving the COP of the air conditioners; even if the COP is gradually improved, the units will still emit some waste heat, and that waste heat should be used. Just as an air conditioner's waste heat can be used to produce electricity, various other appliances emit waste heat into their surroundings, where Stirling engines and the geothermal heat pump concept could likewise be used to produce electricity and reduce thermal pollution in the environment.
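A quick feasibility check: even an ideal Stirling engine running between ~50°C waste air and ambient is Carnot-limited, so the recoverable power per unit is modest. The ambient temperature and heat-rejection figure below are illustrative assumptions:

```python
# Sketch: Carnot ceiling for a Stirling engine driven by ~50 C
# air-conditioner exhaust against a 30 C ambient sink (illustrative values).
T_hot = 50 + 273.15   # waste-air temperature (K)
T_cold = 30 + 273.15  # assumed ambient sink temperature (K)

eta_carnot = 1 - T_cold / T_hot
waste_heat_kw = 3.5   # assumed heat rejected by a small split unit (kW)
power_w = eta_carnot * waste_heat_kw * 1000

print(f"Carnot efficiency: {eta_carnot:.1%}")
print(f"Ideal recoverable power: {power_w:.0f} W")
```

Real Stirling engines achieve only a fraction of the Carnot efficiency, so the practical output would be tens of watts per unit; the scheme's value therefore lies mainly in aggregating many units and in cutting thermal pollution.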

Keywords: stirling engine, geothermal heat pumps, waste heat, air conditioners

Procedia PDF Downloads 343
6205 Community Engagement: Experience from the SIREN Study in Sub-Saharan Africa

Authors: Arti Singh, Carolyn Jenkins, Oyedunni S. Arulogun, Mayowa O. Owolabi, Fred S. Sarfo, Bruce Ovbiagele, Enzinne Sylvia

Abstract:

Background: Stroke, the leading cause of adult-onset disability and the second leading cause of death, is a major public health concern, particularly pertinent in Sub-Saharan Africa (SSA), where nearly 80% of all global stroke mortalities occur. The Stroke Investigative Research and Education Network (SIREN) seeks to comprehensively characterize the genomic, sociocultural, economic, and behavioral risk factors for stroke and to build effective research teams to address and decrease the burden of stroke and other non-communicable diseases in SSA. One of the first steps toward this goal was to effectively engage the communities that suffer the high burden of disease in SSA. This study describes how the SIREN project engaged six sites in Ghana and Nigeria over the past three years, describing the community engagement activities that have arisen since inception. Aim: The aim of community engagement (CE) within SIREN is to elicit information about knowledge, attitudes, beliefs, and practices (KABP) regarding stroke and its risk factors from individuals of African ancestry in SSA, and to educate the community about stroke and ways to decrease disability and death from stroke using socioculturally appropriate messaging and messengers. Methods: Community Advisory Boards (CABs), Focus Group Discussions (FGDs) and community outreach programs. Results: 27 FGDs with 168 participants, including community heads, religious leaders, health professionals and individuals with stroke, among others, were conducted, and over 60 CE outreaches have been held within the SIREN performance sites. Over 5,900 individuals have received education on cardiovascular risk factors, and about 5,000 have been screened for cardiovascular risk factors during the outreaches. FGDs and outreach programs indicate that knowledge of stroke, as well as of its risk factors and follow-up evidence-based care, is limited and often comes late. 
Other findings include: 1) Most recognize hypertension as a major risk factor for stroke. 2) About 50% report that stroke is hereditary, and about 20% do not know which organs are affected by stroke. 3) More than 95% are willing to participate in genetic testing research, and about 85% are willing to pay for testing and would recommend the test to others. 4) Almost all indicated that genetic testing could help health providers better treat stroke and help scientists better understand its causes. The CABs provided stakeholder input into SIREN activities and facilitated collaborations among investigators, community members and stakeholders. Conclusion: The CE core within SIREN is a first-of-its-kind public outreach engagement initiative to evaluate and address perceptions about stroke and genomics among patients, caregivers, and local leaders in SSA, and it has implications as a model for assessment in other high-stroke-risk populations. SIREN's CE program uses best practices to build capacity for community-engaged research, accelerate the integration of research findings into practice, and strengthen dynamic community-academic partnerships within our communities. CE has had several major successes over the past three years, including our multi-site collaboration examining the KABP about stroke (symptoms, risk factors, burden) and genetic testing across SSA.

Keywords: community advisory board, community engagement, focus groups, outreach, SSA, stroke

Procedia PDF Downloads 412
6204 The Study of Thai Millennial Attitude toward End-of-Life Planning, Opportunity of Service Design Development

Authors: Mawong R., Bussracumpakorn C.

Abstract:

Millions of young people around the world have been affected psychologically and socially by COVID-19. Millennials' stresses have been shaped by several global issues, including climate change, political instability, and financial crisis; the spread of COVID-19 in particular has left psychological and socioeconomic scars on them. As end-of-life planning becomes more widely discussed, the stigma and taboos around the issue are greatly lessened. End-of-life planning is defined here as planning for one's future life and death, such as financial, legacy, funeral, and memorial planning; such a plan can help millennials discover the value and meaning of life. This study explores the attitudes of Thai Millennials toward end-of-life planning as a new-normal awareness of life, in order to initiate an innovative service concept that fits their values and sense of meaning. The study conducts in-depth interviews with 12 participants who have awareness of, or have acted on, such a plan. The customer journey map framework is used to analyze the responses and examine trigger points, barriers, beliefs, and expectations. The findings point to a new end-of-life planning service concept suited to Thai Millennials in four groups: 1. the Socially Conscious, who donate time and riches to make the world and society a better place; their end-of-life planning value is inspired by the social impact of giving something, during life or after it, and they want a variety of choices for giving to society based on their preferences; 2. 
Life Fulfillment, who set life goals for themselves and attempt to achieve them before the time comes; their value is to affirm life's worth through a customized plan with guidance and suggestions; 3. Prevention of After-Death Effects, who plan to soften the impact of their death as a patriarch, head of a family, or someone's anchor; they want a plan that brings confidence and relief while they are still alive, and a reliable service to which they can entrust a will or assets; and 4. No-Guilt Planning, who plan for when they wish to be worry-free as self-responsible individuals; they want a plan that is easy to understand and easy to access. The overall finding is a new service concept for end-of-life planning that promotes knowledge of life's significant worth rather than death planning, encouraging people to reassess their lives in a positive way and leading to higher self-esteem and intrinsic motivation for this generation in a time of global crisis.

Keywords: design management, end-of-life planning, millennial generation, service design solution

Procedia PDF Downloads 174
6203 Results of Three-Year Operation of 220kV Pilot Superconducting Fault Current Limiter in Moscow Power Grid

Authors: M. Moyzykh, I. Klichuk, L. Sabirov, D. Kolomentseva, E. Magommedov

Abstract:

Modern city electrical grids are forced to increase their density due to the growing number of customers and requirements for reliability and resiliency. However, progress in this direction is often limited by the capabilities of existing network equipment. New energy sources or grid connections increase the level of short-circuit currents in the adjacent network, which can exceed the maximum ratings of equipment: the breaking capacity of circuit breakers and the thermal and dynamic current withstand capabilities of disconnectors, cables, and transformers. The superconducting fault current limiter (SFCL) is a modern solution designed to deal with increasing fault current levels in power grids. The key feature of this device is its near-instant (less than 2 ms) limitation of the current level, due to the nature of the superconductor. In 2019, Moscow utilities installed a SuperOx SFCL in the city power grid to test the capabilities of this novel technology. It became the first SFCL in the Russian energy system and is currently the most powerful SFCL in the world. Modern SFCLs use second-generation high-temperature superconductor (2G HTS). Despite its name, HTS still requires the low temperature of liquid nitrogen for operation. As a result, the Moscow SFCL is built with a cryogenic system to cool the superconductor. The cryogenic system consists of three cryostats that contain the superconductor part and are filled with liquid nitrogen (three phases), three cryocoolers, one water chiller, three cryopumps, and pressure builders. All these components are controlled by an automatic control system. The SFCL has been operating continuously on the city grid for over three years. During that period of operation, numerous faults occurred, including cryocooler failure, chiller failure, pump failure, and others (such as a cryogenic system power outage). 
All these faults were resolved without shutting the SFCL down, thanks to the cryogenic system's purpose-designed backups and the quick response of the grid operator utilities and the SuperOx crew. The paper describes in detail the results of SFCL operation and cryogenic system maintenance, and the measures taken to resolve these faults and prevent similar ones in the future.

Keywords: superconductivity, current limiter, SFCL, HTS, utilities, cryogenics

Procedia PDF Downloads 69