Search results for: product circularity index
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7181

131 Web-Based Decision Support Systems and Intelligent Decision-Making: A Systematic Analysis

Authors: Serhat Tüzün, Tufan Demirel

Abstract:

Decision Support Systems (DSS) have been investigated by researchers and technologists for more than 35 years. This paper analyses the developments in the architecture and software of these systems, provides a systematic analysis of different Web-based DSS approaches and Intelligent Decision-making Technologies (IDT), and offers suggestions for future studies. The Decision Support Systems literature begins with the building of model-oriented DSS in the late 1960s, theory developments in the 1970s, and the implementation of financial planning systems and Group DSS in the early and mid-1980s. It then documents the origins of Executive Information Systems, online analytical processing (OLAP) and Business Intelligence. The implementation of Web-based DSS occurred in the mid-1990s. Since the beginning of the new millennium, intelligence has been the main focus of DSS studies. Web-based technologies are having a major impact on the design, development and implementation processes for all types of DSS. Web technologies are being utilized for the development of DSS tools by leading developers of decision support technologies. Major companies are encouraging their customers to port their DSS applications, such as data mining, customer relationship management (CRM) and OLAP systems, to web-based environments. Similarly, real-time data fed from manufacturing plants now help floor managers make decisions regarding production adjustments to ensure that high-quality products are produced and delivered. Web-based DSS are being employed by organizations as decision aids for employees as well as customers. A common usage of Web-based DSS has been to assist customers in configuring products and services according to their needs. These systems allow individual customers to design their own products by choosing from a menu of attributes, components, prices and delivery options. The Intelligent Decision-making Technologies (IDT) domain is a fast-growing area of research that integrates various aspects of computer science and information systems. This includes intelligent systems, intelligent technology, intelligent agents, artificial intelligence, fuzzy logic, neural networks, machine learning, knowledge discovery, computational intelligence, data science, big data analytics, inference engines, recommender systems or engines, and a variety of related disciplines. Innovative applications that emerge using IDT often have a significant impact on decision-making processes in government, industry, business, and academia in general. This is particularly pronounced in finance, accounting, healthcare, computer networks, real-time safety monitoring and crisis response systems. Similarly, IDT is commonly used in military decision-making systems, security, marketing, stock market prediction, and robotics. Even though many research studies have been conducted on Decision Support Systems, a systematic analysis of the subject is still missing. To address this gap, this paper surveys recent articles on DSS. The literature has been reviewed in depth and, by classifying previous studies according to their preferences, a taxonomy for DSS has been prepared. With the aid of the taxonomic review and recent developments in the field, this study aims to analyze future trends in decision support systems.

Keywords: decision support systems, intelligent decision-making, systematic analysis, taxonomic review

Procedia PDF Downloads 279
130 Entrepreneurial Venture Creation through Anchor Event Activities: Pop-Up Stores as On-Site Arenas

Authors: Birgit A. A. Solem, Kristin Bentsen

Abstract:

Scholarly attention in entrepreneurship is currently directed towards understanding entrepreneurial venture creation as a process, i.e., the journey of new economic activities from nonexistence to existence, often studied through flow or network models. To complement existing research on entrepreneurial venture creation with more interactivity-based research of organized activities, this study examines two pop-up stores as anchor events involving the on-site activities of fifteen participating entrepreneurs launching their new ventures. The pop-up stores were arranged in two middle-sized Norwegian cities and contained different brand stores that brought together actors of sub-networks and communities executing venture creation activities. The pop-up stores became on-site arenas for the entrepreneurs to create, maintain, and rejuvenate their networks, while also becoming venues for the temporal coordination of activities involving existing and potential customers in their venture creation. In this work, we apply a conceptual framework based on frequently addressed dilemmas within entrepreneurship theory (discovery/creation, causation/effectuation) to shed further light on the broad range of on-site anchor event activities and their venture creation outcomes. The dilemma-based concepts are applied as an analytic toolkit to pursue answers regarding the nature of anchor event activities typically found within entrepreneurial venture creation and how these activities affect venture creation outcomes. Our study combines researcher participation with 200 hours of observation and twenty in-depth interviews. Data analysis followed established guidelines for hermeneutic analysis and was intimately intertwined with ongoing data collection. Data were coded and categorized in NVivo 12 software, and the coding was iterated several times as patterns steadily developed. Our findings suggest that the core anchor event activities typically found within entrepreneurial venture creation are: concept and product experimentation with visitors, arrangements to socialize (evening specials, auctions, and exhibitions), store-in-store concepts, arranged meeting places for peers, and close connections with the municipality and property owners. Further, this work points to four main entrepreneurial venture creation outcomes derived from the core anchor event activities: (1) venture attention, (2) venture idea-realization, (3) venture collaboration, and (4) venture extension. Our findings show that the outcomes vary depending on which anchor event activities are applied. Theoretically, this study offers two main implications. First, anchor event activities are both discovered and created, following the logic of causation, while also being experimental, based on the "learning by doing" principles of effectuation during execution. Second, our research enriches prior studies on venture creation as a process. In this work, entrepreneurial venture creation activities and outcomes are understood through pop-up stores as on-site anchor event arenas, particularly suitable for the interactivity-based research requested by the entrepreneurship field. This study also reveals important managerial implications, such as that entrepreneurs should allow themselves to find creative physical venture creation arenas (e.g., pop-up stores, showrooms), as well as collaborate with partners when discovering and creating concepts and activities based on new ideas. In this way, they allow themselves both to plan strategically for their venture and to experiment with it continually.

Keywords: anchor event, interactivity-based research, pop-up store, entrepreneurial venture creation

Procedia PDF Downloads 91
129 Audience Members' Perspective-Taking Predicts Accurate Identification of Musically Expressed Emotion in a Live Improvised Jazz Performance

Authors: Omer Leshem, Michael F. Schober

Abstract:

This paper introduces a new method for assessing how audience members and performers feel and think during live concerts, and how audience members' recognized and felt emotions are related. Two hypotheses were tested in a live concert setting: (1) that audience members' cognitive perspective-taking ability predicts their accuracy in identifying an emotion that a jazz improviser intended to express during a performance, and (2) that audience members' affective empathy predicts their likelihood of feeling the same emotions as the performer. The aim was to stage a concert with audience members who regularly attend live jazz performances, and to measure their cognitive and affective reactions during the performance as non-intrusively as possible. Pianist and Grammy nominee Andy Milne agreed, without knowing the details of the method or hypotheses, to perform a full-length solo improvised concert that would include an 'unusual' piece. Jazz fans were recruited through typical advertising for New York City jazz performances. The event was held at the New School's Glass Box Theater, the home of leading NYC jazz venue 'The Stone.' Audience members were charged typical NYC jazz club admission prices; advertisements informed them that anyone who chose to participate in the study would be reimbursed their ticket price after the concert. The concert, held in April 2018, had 30 attendees, 23 of whom participated in the study. Twenty-two minutes into the concert, the performer was handed a paper note with the instruction: 'Perform a 3-5-minute improvised piece with the intention of conveying sadness.' (Sadness was chosen based on previous music cognition lab studies, where solo listeners were less likely to accurately select sadness as the musically-expressed emotion from a list of basic emotions, and more likely to misinterpret sadness as tenderness.) Then, audience members and the performer were invited to respond to a questionnaire from a first envelope under their seat. Participants used their own words to describe the emotion the performer had intended to express, and then selected the intended emotion from a list. They also reported the emotions they had felt while listening, using Izard's differential emotions scale. The concert then continued as usual. At the end, participants answered demographic questions and Davis' interpersonal reactivity index (IRI), a 28-item scale designed to assess both cognitive and affective empathy. Hypothesis 1 was supported: audience members with greater cognitive empathy were more likely to accurately identify sadness as the expressed emotion. Moreover, audience members who accurately selected 'sadness' reported feeling marginally sadder than people who did not select sadness. Hypothesis 2 was not supported; audience members with greater affective empathy were not more likely to feel the same emotions as the performer. If anything, members with lower cognitive perspective-taking ability had marginally greater emotional overlap with the performer, which makes sense given that these participants were less likely to identify the music as sad, which corresponded with the performer's actual feelings. Results replicate findings from solo lab studies in a concert setting and demonstrate the viability of exploring empathy and collective cognition in improvised live performance.

Keywords: audience, cognition, collective cognition, emotion, empathy, expressed emotion, felt emotion, improvisation, live performance, recognized emotion

Procedia PDF Downloads 132
128 Influence of the Local External Pressure on Measured Parameters of Cutaneous Microcirculation

Authors: Irina Mizeva, Elena Potapova, Viktor Dremin, Mikhail Mezentsev, Valeri Shupletsov

Abstract:

Local tissue perfusion is regulated by the microvascular tone, which is under the control of a number of physiological mechanisms. Laser Doppler flowmetry (LDF) together with wavelet analysis is the most commonly used technique to study the regulatory mechanisms of cutaneous microcirculation. External factors such as temperature, local pressure of the probe on the skin, etc., influence the blood flow characteristics and are used as physiological tests to evaluate microvascular regulatory mechanisms. Local probe pressure influences the microcirculation parameters measured by optical methods: diffuse reflectance spectroscopy, fluorescence spectroscopy, and LDF. Therefore, further study of probe pressure effects can be useful to improve the reliability of optical measurement. During pressure tests, the variation of the mean perfusion measured by means of LDF is usually estimated. Additional information concerning the physiological mechanisms of the vascular tone regulation system in response to local pressure can be obtained using spectral analysis of LDF samples. The aim of the present work was to develop a protocol and data processing algorithm appropriate for studying the physiological response to the local pressure test. Involving 6 subjects (20±2 years) and performing 5 measurements for every subject, we estimated the intersubject and intergroup variability of the response of both the averaged and oscillating parts of the LDF signal to external surface pressure. The final purpose of the work was to find specific features which can further be used in wider clinical studies. The cutaneous perfusion measurements were carried out with a LAKK-02 (SPE LAZMA Ltd., Russia); the skin loading was provided by a purpose-built device which allows one to distribute the pressure around the LDF probe. The probe was installed on the dorsal part of the distal phalanx of the index finger. We collected measurements continuously for one hour and varied the loading from 0 to 180 mmHg stepwise, with a step duration of 10 minutes. Further, we post-processed the samples using the wavelet transform and traced the energy of oscillations in five frequency bands over time. Weak loading leads to pressure-induced vasodilation, so one should take into account that the perfusion measured under pressure conditions will be overestimated. On the other hand, we revealed a decrease in endothelial-associated fluctuations. Further loading (88 mmHg) induces amplification of pulsations in all frequency bands. We assume that such loading leads to a higher number of closed capillaries, a higher input of arterioles into the LDF signal and, as a consequence, more vivid oscillations, which are mainly formed in arterioles. External pressure higher than 144 mmHg leads to a decrease of the oscillating components; after removing the loading, a very rapid restoration of tissue perfusion takes place. In this work, we have demonstrated that local skin loading influences the microcirculation parameters measured by optical techniques; this should be taken into account while developing portable electronic devices. The proposed protocol of local loading allows one to evaluate pressure-induced vasodilation as well as to trace the dynamics of blood flow oscillations. This study was supported by the Russian Science Foundation under project N 18-15-00201.
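
The paper's analysis code is not part of this listing; the following minimal Python sketch illustrates the kind of band-energy tracing the abstract describes. The sampling rate, the synthetic signal, and the exact band limits are assumptions (the limits are conventional values from the LDF literature), not parameters reported by the authors.

```python
# Illustrative sketch only: fs, the signal, and the band limits are
# assumed values, not taken from this abstract.
import numpy as np
import pywt

fs = 20.0                                # assumed LDF sampling rate, Hz
t = np.arange(0, 600.0, 1.0 / fs)        # 10 minutes of signal
signal = np.random.randn(t.size)         # stand-in for a real perfusion record

# Frequency bands (Hz) commonly attributed to microvascular regulation
bands = {
    "endothelial": (0.0095, 0.02),
    "neurogenic":  (0.02, 0.06),
    "myogenic":    (0.06, 0.15),
    "respiratory": (0.15, 0.4),
    "cardiac":     (0.4, 1.6),
}

# Morlet CWT on a log-spaced frequency axis spanning all five bands
fc = pywt.central_frequency("morl")      # normalized center frequency
freqs = np.geomspace(0.0095, 1.6, 200)   # Hz
scales = fc * fs / freqs                 # scale = fc / (f * dt)
coefs, out_freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)

# Mean wavelet energy per band, averaged over time and in-band frequencies
for name, (lo, hi) in bands.items():
    mask = (out_freqs >= lo) & (out_freqs < hi)
    energy = np.mean(np.abs(coefs[mask, :]) ** 2)
    print(f"{name:12s} mean energy: {energy:.4f}")
```

In a pressure test, the same computation would be repeated per loading step (or in sliding windows) to trace how each band's energy evolves with the applied pressure.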

Keywords: blood microcirculation, laser Doppler flowmetry, pressure-induced vasodilation, wavelet analyses

Procedia PDF Downloads 150
127 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes and atmospheric effects that affect people's lives and properties. Generally, the severity of a fire is calculated using the Normalized Burn Ratio (NBR) index. This is performed manually by comparing the pre-fire and post-fire images; the dNBR is then calculated as the bitemporal difference of the preprocessed satellite images. The burnt area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using the classification levels proposed by the USGS, comprising seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing, with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to support regular burn severity mapping with a medium spatial resolution sensor (10-20 m). The tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool. The tool includes a Graphical User Interface (GUI) to make it user-friendly. Its advantage is the ability to assess burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of this tool. The Blue Mountains National Park forest, affected by the Australian fire season between 2019 and 2020, is used to describe the workflow of the WWSAT. At this site, more than 7,809 km2 of burnt area was detected using Sentinel-2 data, with an error below 6.5% when compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km2 had burnt, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These spots can also be detected through visual inspection, made possible by the cloud-free images generated by WWSAT. This tool is cost-effective for calculating the burnt area, since the satellite images are free and the cost of field surveys is avoided.
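
WWSAT itself is not reproduced in this listing; the sketch below illustrates the underlying NBR/dNBR computation in the GEE Python API. The dates, area of interest, cloud filter, and severity break points are placeholder assumptions (the break points follow commonly cited USGS-style dNBR thresholds), not values taken from the tool.

```python
# Illustrative GEE Python sketch of the dNBR workflow described above.
# Dates, region, cloud threshold, and class break points are assumptions.
import ee

ee.Initialize()

aoi = ee.Geometry.Rectangle([150.0, -34.0, 151.0, -33.0])  # hypothetical extent

def cloud_free_composite(start, end):
    """Median composite of low-cloud Sentinel-2 surface reflectance scenes."""
    return (ee.ImageCollection("COPERNICUS/S2_SR")
            .filterBounds(aoi)
            .filterDate(start, end)
            .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
            .median())

# NBR = (NIR - SWIR) / (NIR + SWIR); B8 and B12 for Sentinel-2
pre = cloud_free_composite("2019-10-01", "2019-10-31").normalizedDifference(["B8", "B12"])
post = cloud_free_composite("2020-02-01", "2020-02-29").normalizedDifference(["B8", "B12"])

dnbr = pre.subtract(post).rename("dNBR")

# Burnt mask at the 0.1 threshold used in the abstract
burnt = dnbr.gte(0.1)

# Severity classes from assumed USGS-style dNBR break points
severity = (dnbr.where(dnbr.lt(0.1), 0)     # unburnt
                .where(dnbr.gte(0.1), 1)    # low severity
                .where(dnbr.gte(0.27), 2)   # moderate-low severity
                .where(dnbr.gte(0.44), 3)   # moderate-high severity
                .where(dnbr.gte(0.66), 4))  # high severity
```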

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 235
126 Numerical Simulation of the Production of Ceramic Pigments Using Microwave Radiation: An Energy Efficiency Study Towards the Decarbonization of the Pigment Sector

Authors: Pedro A. V. Ramos, Duarte M. S. Albuquerque, José C. F. Pereira

Abstract:

Global warming mitigation is one of the main challenges of this century, with the net balance of greenhouse gas (GHG) emissions required to be null or negative by 2050. Industry electrification is one of the main paths to achieving carbon neutrality within the goals of the Paris Agreement. Microwave heating is becoming a popular industrial heating mechanism due to the absence of direct GHG emissions, as well as its rapid, volumetric, and efficient heating. In the present study, a mathematical model is used to simulate the production, using microwave heating, of two ceramic pigments at high temperatures (above 1200 °C). The two pigments studied were the yellow (Pr,Zr)SiO₄ and the brown (Ti,Sb,Cr)O₂. The chemical conversion of reactants into products was included in the model by using the kinetic triplet obtained with the model-fitting method and experimental data available in the literature. The coupling between the electromagnetic, thermal, and chemical interfaces was also included. The simulations were computed in COMSOL Multiphysics. The geometry includes a moving plunger to allow for cavity impedance matching and thus maximize the electromagnetic efficiency. To accomplish this goal, a MATLAB controller was developed to automatically search for the position of the moving plunger that guarantees maximum efficiency. The power is automatically and permanently adjusted during the transient simulation to impose a stationary regime and total conversion, the two requisites of every converged solution. Both 2D and 3D geometries were used, and a parametric study regarding the axial bed velocity and the heat transfer coefficient at the boundaries was performed. Moreover, a verification and validation study was carried out by comparing the conversion profiles obtained numerically with the experimental data available in the literature; the numerical uncertainty was also estimated to attest to the results' reliability. The results show that the model-fitting method employed in this work is a suitable tool to predict the chemical conversion of reactants into the pigment, showing excellent agreement between the numerical results and the experimental data. Moreover, it was demonstrated that higher velocities lead to higher thermal efficiencies and thus lower energy consumption during the process. This work concludes that the electromagnetic heating of materials having a high loss tangent and low thermal conductivity, like ceramic materials, may be a challenge due to the presence of hot spots, which may jeopardize the product quality or even the experimental apparatus. The MATLAB controller increased the electromagnetic efficiency by 25%, and a global efficiency of 54% was obtained for the titanate brown pigment. This work shows that electromagnetic heating will be a key technology in the decarbonization of the ceramic sector, as reductions of up to 98% in specific GHG emissions were obtained when compared to the conventional process. Furthermore, numerical simulation appears to be a suitable technique for the design and optimization of microwave applicators, showing high agreement with experimental data.
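
The MATLAB controller itself is not given in the abstract; as a hedged illustration of the idea (a one-dimensional search for the plunger position that maximizes electromagnetic efficiency), the Python sketch below wraps a bounded scalar optimizer around a black-box solver call. The function `cavity_efficiency` and the travel range are placeholders standing in for the coupled COMSOL evaluation.

```python
# Sketch of an automatic impedance-matching search (illustrative only).
# cavity_efficiency() is a placeholder for the coupled electromagnetic
# solve that the paper drives from a MATLAB controller.
from scipy.optimize import minimize_scalar

def cavity_efficiency(plunger_pos_mm: float) -> float:
    """Placeholder: return absorbed/input power ratio at a plunger position."""
    # A real controller would run the EM simulation here and derive the
    # efficiency from the reflected power (e.g., from S11).
    return 1.0 - abs(plunger_pos_mm - 42.0) / 100.0  # toy unimodal surrogate

# Maximize efficiency by minimizing its negative over the plunger travel
res = minimize_scalar(
    lambda x: -cavity_efficiency(x),
    bounds=(0.0, 100.0),   # assumed plunger travel range, mm
    method="bounded",
)

print(f"best plunger position: {res.x:.2f} mm, efficiency: {-res.fun:.3f}")
```

In a transient simulation, such a search would be re-run whenever the load's dielectric properties drift with temperature, so the cavity stays matched throughout the heating cycle.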

Keywords: automatic impedance matching, ceramic pigments, efficiency maximization, high-temperature microwave heating, input power control, numerical simulation

Procedia PDF Downloads 138
125 Nephrotoxicity and Hepatotoxicity Induced by Chronic Aluminium Exposure in Rats: Impact of Nutrients Combination versus Social Isolation and Protein Malnutrition

Authors: Azza A. Ali, Doaa M. Abd El-Latif, Amany M. Gad, Yasser M. A. Elnahas, Karema Abu-Elfotuh

Abstract:

Background: Exposure to aluminium (Al) has increased recently. It is found in food products, food additives, drinking water, cosmetics and medicines. Chronic consumption of Al causes oxidative stress and has been implicated in several chronic disorders. The liver is considered the major site for detoxification, while the kidney is involved in the elimination of toxic substances and is a target organ of metal toxicity. Social isolation (SI) or protein malnutrition (PM) also causes oxidative stress and has a negative impact on Al-induced nephrotoxicity as well as hepatotoxicity. Coenzyme Q10 (CoQ10) is a powerful intracellular antioxidant with mitochondrial membrane stabilizing ability; wheat grass is a natural product with antioxidant, anti-inflammatory and other protective activities; and cocoa is also a potent antioxidant that can protect against many diseases. They provide different degrees of protection from the impact of oxidative stress. Objective: To study the impact of social isolation together with protein malnutrition on nephro- and hepatotoxicity induced by chronic Al exposure in rats, and to investigate the postulated protection using a combination of CoQ10, wheat grass and cocoa. Methods: Eight groups of rats were used; four served as protected groups and four as unprotected. Rats in the Al-toxicity model groups received AlCl3 (70 mg/kg, IP) daily for five weeks, except one group, which served as control. The Al-toxicity model groups were divided into Al-toxicity alone, SI-associated PM (10% casein diet), and Al-associated SI&PM groups. Protection was induced by oral co-administration of CoQ10 (200 mg/kg), wheat grass (100 mg/kg) and cocoa powder (24 mg/kg) together with Al. Biochemical changes in total bilirubin, lipids, cholesterol, triglycerides, glucose, proteins, creatinine and urea, as well as alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP) and lactate dehydrogenase (LDH), were measured in the serum of all groups. Specimens of kidney and liver were used for the assessment of oxidative parameters (MDA, SOD, TAC, NO), inflammatory mediators (TNF-α, IL-6β, nuclear factor kappa B (NF-κB), caspase-3) and DNA fragmentation, in addition to the evaluation of histopathological changes. Results: SI together with PM severely enhanced the nephro- and hepatotoxicity induced by chronic Al exposure. The CoQ10, wheat grass and cocoa combination showed clear protection against the hazards of Al exposure, either alone or when associated with SI&PM. Their protection was indicated by the significant decrease in the Al-induced elevations of total bilirubin, lipids, cholesterol, triglycerides, glucose, creatinine and urea levels, as well as ALT, AST, ALP and LDH. The liver and kidney of the treated groups also showed a significant decrease in MDA, NO, TNF-α, IL-6β, NF-κB, caspase-3 and DNA fragmentation, together with a significant increase in total proteins, SOD and TAC. The biochemical results were confirmed by the histopathological examinations. Conclusion: SI together with PM represents a risk factor in enhancing the nephro- and hepatotoxicity induced by Al in rats. The CoQ10, wheat grass and cocoa combination provides clear protection against nephro- and hepatotoxicity, as well as the consequent degenerations induced by chronic Al exposure, even when associated with the risk of SI together with PM.

Keywords: aluminum, nephrotoxicity, hepatotoxicity, isolation and protein malnutrition, coenzyme Q10, wheatgrass, cocoa, nutrients combinations

Procedia PDF Downloads 248
124 qPCR Method for Detection of Halal Food Adulteration

Authors: Gabriela Borilova, Monika Petrakova, Petr Kralik

Abstract:

Nowadays, European producers are increasingly interested in the production of halal meat products. Halal meat has been increasingly appearing in the EU's market network, and meat products from European producers are being exported to Islamic countries. Halal criteria are mainly related to the origin of the muscle used in production, and also to the way products are obtained and processed. Although the EU has legislatively addressed the question of food authenticity, the circumstances of previous years, when products with undeclared horse or poultry meat content appeared on EU markets, raised the question of the effectiveness of control mechanisms. Replacement of expensive or unavailable types of meat with low-priced meat has been practiced on a global scale for a long time. Likewise, halal products may be contaminated (falsified) with pork or food components obtained from pigs. These components include collagen, offal, pork fat, mechanically separated pork, emulsifiers, blood, dried blood, dried blood plasma, gelatin, and others. These substances can influence the sensory properties of meat products - color, aroma, flavor, consistency and texture - or they are added for preservation and stabilization. Food manufacturers sometimes resort to these substances mainly due to their ready availability and low prices. However, the use of these substances is not always declared on the product packaging. Verification of the presence of declared ingredients, including the detection of undeclared ingredients, is among the basic control procedures for determining the authenticity of food. Molecular biology methods, based on DNA analysis, offer rapid and sensitive testing. The PCR method and its modifications can be successfully used to identify animal species in single- and multi-ingredient raw and processed foods, and qPCR is the first choice for food analysis. Like all PCR-based methods, it is simple to implement, and its greatest advantage is the absence of post-PCR visualization by electrophoresis. qPCR allows the detection of trace amounts of nucleic acids and, by comparing an unknown sample with a calibration curve, it can also provide information on the absolute quantity of individual components in the sample. Our study addresses the fact that most molecular work on the identification and quantification of animal species is based on the construction of specific primers amplifying a selected section of the mitochondrial genome. In addition, the sections amplified in conventional PCR are relatively long (hundreds of bp) and unsuitable for use in qPCR, because when DNA is fragmented, the amplification of long target sequences is quite limited. Our study focuses on finding a suitable genomic DNA target and optimizing qPCR to reduce the variability and distortion of results, which is necessary for the correct interpretation of quantification results. In halal products, the impact of falsification of meat products by the addition of components derived from pigs is all the greater because it concerns not just an economic aspect but, above all, religious and social aspects. This work was supported by the Ministry of Agriculture of the Czech Republic (QJ1530107).
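
As a hedged numeric illustration of the absolute quantification step mentioned above (comparing an unknown sample against a calibration curve), the sketch below fits a standard curve of Cq versus log10 copy number and back-calculates an unknown. All Cq values here are invented for illustration, not data from this study.

```python
# Illustrative qPCR standard-curve quantification; the Cq values are
# hypothetical, not measurements from the study.
import numpy as np

# Serial dilution standards: known copy numbers and measured Cq (invented)
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
cq = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

# Linear fit: Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)

# Amplification efficiency from the slope (ideal slope is about -3.32)
efficiency = 10 ** (-1 / slope) - 1
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")

# Quantify an unknown sample from its measured Cq
cq_unknown = 26.0
copies_unknown = 10 ** ((cq_unknown - intercept) / slope)
print(f"estimated copies in unknown: {copies_unknown:.3g}")
```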

Keywords: food fraud, halal food, pork, qPCR

Procedia PDF Downloads 247
123 Coming Closer to Communities of Practice through Situated Learning: The Case Study of Polish-English, English-Polish Undergraduate BA Level Language for Specific Purposes of Translation Class

Authors: Marta Lisowska

Abstract:

The growing trend of market specialization imposes upon translators the need for proficiency in the working knowledge of specialist discourse. The notion of specialization ranges from a broad general category to a highly specialized narrow field. Specialised discourse is used in a channel of communication based upon distinctive features typical of communities of practice, whose co-existence is codified and hermetically locked against outsiders. Consequently, any translator deprived of professional discourse competence and social skills is incapable of providing a competent translation product from the source language into the target language. In this paper, we report on research that explores pedagogical practices aiming to bridge the dichotomy between professionals and specialist translators, while accounting for the reality of the world of professional communities entered by undergraduates on two levels: the text-based generic one, and the social one. Drawing from the functional social constructivist approach, seen here as situated learning, this paper reports on the case of an English-Polish, Polish-English undergraduate BA-level LSP of law translation class run in line with a blend of the simulated, classroom-based approach and the reality-based (apprenticeship) approach. This blended method serves the purpose of introducing young trainees to the professional world. The research provides new insights into how LSP translation undergraduates become legitimized through discursive and social participation and engagement. The undergraduates, situated peripherally at the outset, experience their own transformation towards becoming members of these professional groups. Through subjective evaluation, the trainees take a stance on this dual-mode class and the development of their skills. Comparing and contrasting their own work done in line with the two models of translation teaching, authentic and near-authentic, the undergraduates answered research questions devised in a questionnaire survey. The responses take us closer to how students feel about the development of their LSP translation competence. The major findings show how the trainees perceive the benefits and hardships of their functional translation class. In terms of skills, they identified communication as the most enhanced one; they highly valued the fact of being 'exposed' to a variety of texts (cf. multiliteralism), team work, learning how to schedule work, the boost to their IT skills, and the ability to learn how to work individually. Another finding indicates that students struggled most with the specialized language and with co-working with other students. This short-term research captures the moment when the undergraduate LSP translation trainees entered the path of transformation, i.e., gained consciousness of 'how it is' to be a participant-translator in real-life communities of practice, acquiring a pragmatic grounding in the social and linguistic skills understood here as discursive competence (text > genre > discourse > professional practice). The undergraduates need to be aware of the work they have to do and the challenges they are to face before arriving at the expert level of professional translation competence.

Keywords: communities of practice in LSP translation teaching, learning LSP translation as situated experience, peripheral participation, professional discourse for LSP translation teaching, professional translation competence

Procedia PDF Downloads 95
122 Enhancing Seismic Resilience in Colombia's Informal Housing: A Low-cost Retrofit Strategy with Buckling-restrained Braces to Protect Vulnerable Communities in Earthquake-prone Regions

Authors: Luis F. Caballero-castro, Dirsa Feliciano, Daniela Novoa, Orlando Arroyo, Jesús D. Villalba-morales

Abstract:

Colombia faces a critical challenge in seismic resilience due to the prevalence of informal housing, which constitutes approximately 70% of residential structures. More than 10 million Colombians (20% of the population) live in homes susceptible to collapse in the event of an earthquake. This, combined with the fact that 83% of the population lives in intermediate and high seismic hazard areas, has brought serious consequences to the country. These consequences became evident during the 1999 Armenia earthquake, which affected nearly 100,000 properties and represented economic losses equivalent to 1.88% of that year's Gross Domestic Product (GDP). Despite previous efforts to reinforce informal housing through methods like externally reinforced masonry walls, alternatives related to seismic protection devices (SPDs), such as Buckling-Restrained Braces (BRBs), have not yet been explored in the country. BRBs are reinforcement elements capable of withstanding both compression and tension, making them effective in enhancing the lateral stiffness of structures. In this study, the use of low-cost and easily installable BRBs for the retrofit of informal housing in Colombia was evaluated, considering the economic limitations of the communities. For this purpose, a case study was selected involving an informally constructed dwelling in the country, from which field information on its structural characteristics and construction materials was collected. Based on the gathered information, nonlinear models with and without BRBs were created, and their seismic performance was analyzed and compared through incremental static (pushover) and nonlinear dynamic analyses. The first analysis identified the capacity curve, showcasing the sequence of failure events occurring from initial yielding to structural collapse. In the second case, the model underwent nonlinear dynamic analyses using a set of seismic records consistent with the country's seismic hazard. Based on the results, fragility curves were calculated to evaluate the probability of failure of the informal housing before and after the intervention with BRBs, providing essential information about their effectiveness in reducing seismic vulnerability. The results indicate that low-cost BRBs can significantly increase the capacity of informal housing to withstand earthquakes. The dynamic analysis revealed that retrofitted structures experienced lower displacements and deformations, enhancing the safety of residents and the seismic performance of informally constructed houses. In other words, the use of low-cost BRBs in the retrofit of informal housing in Colombia is a promising strategy for improving structural safety in seismic-prone areas. This study emphasizes the importance of seeking affordable and practical solutions to address seismic risk in vulnerable communities in earthquake-prone regions of Colombia, and it serves as a model for addressing similar challenges of informal housing worldwide.
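
The abstract does not give the fragility-fitting details; the sketch below shows one standard approach (a maximum-likelihood fit of a lognormal fragility function to collapse/no-collapse outcomes from dynamic analyses, in the style of Baker's fitting procedure). The intensity measures and outcomes are invented, and this may differ from the authors' exact method.

```python
# Minimal lognormal fragility fit (illustrative; data are invented).
# P(collapse | IM) = Phi( ln(IM / theta) / beta )
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Spectral accelerations (g) of the records and collapse outcomes (0/1)
im = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2])
collapsed = np.array([0, 0, 0, 1, 0, 1, 1, 1])

def neg_log_likelihood(params):
    theta, beta = params                       # median capacity, dispersion
    p = norm.cdf(np.log(im / theta) / beta)    # collapse probability per record
    p = np.clip(p, 1e-9, 1 - 1e-9)             # numerical safety
    return -np.sum(collapsed * np.log(p) + (1 - collapsed) * np.log(1 - p))

res = minimize(neg_log_likelihood, x0=[0.5, 0.4],
               bounds=[(0.01, 5.0), (0.05, 1.5)])
theta, beta = res.x
print(f"median capacity theta = {theta:.2f} g, dispersion beta = {beta:.2f}")

# Fragility: probability of collapse at a given intensity
print(f"P(collapse | Sa=0.6 g) = {norm.cdf(np.log(0.6 / theta) / beta):.2f}")
```

Fitting one curve to the as-built model and one to the BRB-retrofitted model, as the study describes, makes the vulnerability reduction directly visible as a shift of the median capacity.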

Keywords: buckling-restrained braces, fragility curves, informal housing, incremental dynamic analysis, seismic retrofit

Procedia PDF Downloads 96
121 Financial Policies in the Process of Global Crisis: Case Study Kosovo

Authors: Shpetim Rezniqi

Abstract:

The current crisis has swept the world, with particular impact on the most developed countries, those which account for most of the world's gross product and enjoy a high standard of living. Even non-experts can describe the visible consequences of the crisis, but how far this crisis will go is impossible to predict. Even the biggest experts can only conjecture, and they diverge widely, but they agree on one thing: the devastating effects of this crisis will be more severe than ever before and cannot be predicted. For a long time, the world was dominated by the economic theory of free-market laws, with the belief that the market is the regulator of all economic problems. The market, like river water, would flow to find the best course and arrive at the necessary solution. Hence, fewer state barriers to the market, less state intervention, and a market regulating itself. The free-market economy became the model of global economic development and progress; it transcended national barriers and became the law of development of the entire world economy. Globalization and global market freedom were the principles of development and international cooperation. International organizations such as the World Bank, along with economically powerful states, based their development and cooperation principles on the free-market economy and the elimination of state intervention. The less state intervention, the more freedom of action - this was the leading international principle. We live in an era of financial tragedy. Financial markets, and banking in particular, are in a dire state: US stock markets fell about 40%, making this one of the darkest moments since 1920. Ahead of it rank only the 'collapse' of Wall Street stocks in 1929, the technological collapse of 2000, the crisis of 1973 after the Yom Kippur War, when the price of oil quadrupled, and the famous collapse of 1937/38, when Europe was entering World War II. In 2000, even though it seemed like the end of the world was around the corner, the world economy survived almost intact. Of course, there were small recessions in the United States, Europe, and Japan. The situation was much more difficult in the crises of the 1930s and the 1970s; nevertheless, the world pulled through. As for the recent financial crisis, it shows every sign of being much sharper and of having more consequences. The decline in stock prices is more a byproduct of what is really happening. Financial markets began their dance of death with the credit crisis, which came as a result of the large increase in real estate prices and household debt. These last two phenomena match very well the excesses of the 1920s, a period during which people spent freely as if there were no tomorrow. The word recession is now on everyone's lips, and that is no longer sudden or abrupt. But the more the financial markets melt down, the greater the risk of a problematic economy for years to come. Thus, for example, the banking crisis in Japan proved to be much more severe than initially expected, partly because the assets on which many loans were based, especially land, kept falling in value. Land prices in Japan continued to fall for about 15 years. (Adri Nurellari, published in the newspaper 'Classifieds'.) At this moment, it is still difficult to assess to what extent the crisis has affected the economy and what the consequences of the crisis will be. What we know is that many banks will need to restrict the granting of credit for a long time to come, and since lending is a bank's primary function, this means huge losses.

Keywords: globalisation, finance, crisis, recommendation, bank, credits

Procedia PDF Downloads 389
120 Use of Artificial Intelligence and Two Object-Oriented Approaches (k-NN and SVM) for the Detection and Characterization of Wetlands in the Centre-Val de Loire Region, France

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

Nowadays, wetlands are the subject of contradictory debates opposing scientific, political and administrative points of view. Indeed, given their multiple services (drinking water, irrigation, hydrological regulation, mineral, plant and animal resources...), wetlands concentrate many socio-economic and biodiversity issues. In some regions, they can cover vast areas (>100 thousand ha) of the landscape, such as the Camargue area in the south of France, inside the Rhone delta. The high biological productivity of wetlands, the strong natural selection pressures and the diversity of aquatic environments have produced many species of plants and animals that are found nowhere else. These environments are tremendous carbon sinks and biodiversity reserves; depending on their age, composition and surrounding environmental conditions, wetlands play an important role in global climate projections. Covering more than 3% of the earth's surface, wetlands have experienced since the beginning of the 1990s a tremendous revival of interest, which has resulted in the multiplication of inventories, scientific studies and management experiments. The geographical and physical characteristics of the wetlands of the Centre-Val de Loire region conceal a large number of natural habitats that harbour a great biological diversity. These wetlands are still influenced by human activities, especially agriculture, which affects their layout and functioning. In this perspective, decision-makers need to delimit spatial objects (natural habitats) in a certain way to be able to take action. Wetlands are no exception to this rule, even if it seems a difficult exercise to delimit a type of environment whose main characteristic is often to occupy the transition between aquatic and terrestrial environments. However, it is possible to map wetlands with databases derived from the interpretation of photos and satellite images, such as the European Corine Land Cover database, which allows quantifying and characterizing the characteristic wetland types of each place. Scientific studies have shown limitations when using high spatial resolution images (SPOT, Landsat, ASTER) for the identification and characterization of small wetlands (around 1 hectare). To address this limitation, it is important to note that these wetlands generally represent spatially complex features; the use of very high spatial resolution images (>3 m) is therefore necessary to map both small and large areas. Moreover, recent developments in artificial intelligence (AI) and deep learning methods for satellite image processing have shown much better performance compared to traditional processing based only on pixel structures. Our research work is based on spectral and textural analysis of VHR images (SPOT and IRC orthoimages) using two object-oriented approaches, the nearest-neighbour approach (k-NN) and the Support Vector Machine approach (SVM). The k-NN approach gave good results for the delineation of wetlands (wet marshes and moors, ponds, artificial wetlands, water body edges, mountain wetlands, river edges and brackish marshes), with a kappa index higher than 85%.
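
The object-oriented workflow itself is not shown in the abstract; the sketch below illustrates the two classifiers and the kappa evaluation named above, training k-NN and SVM on a feature table standing in for per-object spectral/textural attributes. The data are synthetic, and the feature set and hyperparameters are assumptions, not the study's configuration.

```python
# Illustrative comparison of k-NN and SVM with Cohen's kappa; the features
# are synthetic stand-ins for per-object spectral/textural attributes
# extracted from segmented VHR imagery.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 8))                   # 8 spectral/textural features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # toy wetland / non-wetland label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "k-NN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM":  make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    kappa = cohen_kappa_score(y_te, model.predict(X_te))
    print(f"{name}: kappa = {kappa:.2f}")
```

Scaling matters for both classifiers, since k-NN distances and the RBF kernel are sensitive to feature magnitudes; hence the StandardScaler in each pipeline.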

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 72
119 Degradation of Diclofenac in Water Using FeO-Based Catalytic Ozonation in a Modified Flotation Cell

Authors: Miguel A. Figueroa, José A. Lara-Ramos, Miguel A. Mueses

Abstract:

Pharmaceutical residues are a class of emerging contaminants of anthropogenic origin that are present in a myriad of waters with which human beings interact daily, and they are starting to affect the ecosystem directly. Conventional wastewater treatment systems are not capable of degrading these pharmaceutical effluents because their designs cannot handle the intermediate products and biological effects occurring during treatment. That is why it is necessary to hybridize conventional wastewater systems with non-conventional processes. In the specific case of an ozonation process, its efficiency highly depends on a perfect dispersion of ozone, long interaction times between the gas and liquid phases, and the size of the ozone bubbles formed throughout the reaction system. In order to improve these parameters, the use of a modified flotation cell has recently been proposed as a reactive system; flotation cells are used at an industrial level to keep particles in suspension and to spread gas bubbles through the reactor volume at a high rate. The objective of the present work is the development of a mathematical model that can closely predict the kinetic rates of the reactions taking place in the flotation cell at an experimental scale, by identifying proper reaction mechanisms that take into account the modified chemical and hydrodynamic factors in the FeO-catalyzed ozonation of diclofenac aqueous solutions in a flotation cell. The methodology comprises three steps. First, an experimental phase, where a modified flotation cell reactor is used to analyze the effects of ozone concentration and catalyst loading on the degradation of diclofenac aqueous solutions; the performance is evaluated through an index of utilized ozone, which relates the amount of ozone supplied to the system per milligram of degraded pollutant. Next, a theoretical phase, where the reaction mechanisms taking place during the experiments are identified and proposed, detailing the multiple direct and indirect reactions the system goes through. Finally, a kinetic model is obtained that can mathematically represent the reaction mechanisms, with adjustable parameters that can be fitted to the experimental results and give the model a proper physical meaning. The expected result is a robust reaction rate law that can simulate the improved mineralization of diclofenac in water using the modified flotation cell reactor. By means of this methodology, the following results were obtained: a robust reaction pathway mechanism showcasing the intermediates, free radicals and products of the reaction; optimal values of the reaction rate constants, which yielded simulated Hatta numbers lower than 3 for the modeled system; degradation percentages of 100%; and a TOC (total organic carbon) removal percentage of 69.9%, requiring an optimal FeO catalyst loading of only 0.3 g/L. These results show that a flotation cell can be used as a reactor in ozonation, catalytic ozonation and photocatalytic ozonation processes, since it produces high reaction rate constants and reduces mass transfer limitations (Ha < 3) by producing microbubbles and maintaining a good catalyst distribution.
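
The full rate law is beyond the scope of an abstract; as a hedged illustration of the parameter-fitting step (adjusting rate constants to experimental concentration profiles), the sketch below fits a pseudo-first-order decay to diclofenac concentration data. The data points are invented and the pseudo-first-order form is a simplifying assumption, not the paper's full mechanism.

```python
# Illustrative kinetic fit: pseudo-first-order decay C(t) = C0 * exp(-k t).
# Concentration data are invented, not the paper's measurements.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)   # time, min
c = np.array([30.0, 21.5, 15.8, 8.1, 4.3, 1.6, 0.7])    # mg/L (hypothetical)

def first_order(t, c0, k):
    """Pseudo-first-order concentration profile."""
    return c0 * np.exp(-k * t)

(c0_fit, k_fit), _ = curve_fit(first_order, t, c, p0=[30.0, 0.05])
half_life = np.log(2) / k_fit
print(f"k = {k_fit:.3f} 1/min, half-life = {half_life:.1f} min")
```

A full mechanistic model would replace the single exponential with a system of ODEs for ozone, hydroxyl radicals, intermediates and TOC, fitted the same way against the measured profiles.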

Keywords: advanced oxidation technologies, iron oxide, emerging contaminants, AOTs intensification

Procedia PDF Downloads 112
118 Role of Toll Like Receptor-2 in Female Genital Tuberculosis Disease Infection and Its Severity

Authors: Swati Gautam, Salman Akhtar, S. P. Jaiswar, Amita Jain

Abstract:

Background: FGTB is now a major global health problem, mostly in developing countries including India. In humans, Mycobacterium tuberculosis (M.tb) is the causative agent of the infection. A high index of suspicion is required for early diagnosis due to the asymptomatic presentation of FGTB disease. Toll-Like Receptor-2 (TLR-2) on macrophages mediates the host's immune response to M.tb, and its expression on macrophages is important in determining the fate of innate immune responses to M.tb. TLR-2 plays a dual role: on one hand, its high expression on macrophages worsens the outcome of infection; on the other, it keeps M.tb in its dormant stage and avoids activation of M.tb from the latent phase. Single Nucleotide Polymorphisms (SNPs) of the TLR-2 gene play an important role in susceptibility to TB among different populations and, subsequently, in the development of infertility. Methodology: This case-control study was done in the Department of Obs and Gynae and the Department of Microbiology at King George's Medical University, U.P., Lucknow, India. A total of 300 subjects (150 cases and 150 controls) were enrolled in the study, only after fulfilling the given inclusion and exclusion criteria. Inclusion criteria: age 20-35 years, menstrual irregularities, positive on Acid-Fast Bacilli (AFB), TB-PCR, or (LJ/MGIT) culture of Endometrial Aspirate (EA). Exclusion criteria: active Koch's disease, on ATT, PCOS, endometriosis and fibroid, positive for Gonococcus and Chlamydia. Blood samples were collected in EDTA tubes from cases and healthy control women (HCW), and genomic DNA extraction was carried out by the salting-out method. Genotyping of the TLR-2 genetic variants (Arg753Gln and Arg677Trp) was performed using the amplification refractory mutation system (ARMS) PCR technique. PCR products were analyzed by electrophoresis on 1.2% agarose gel and visualized by gel-doc. Statistical analysis of the data was performed using SPSS 16.3 software, computing odds ratios (OR) with 95% CI. Linkage Disequilibrium (LD) analysis was done with the SNPStats online software. Results: For the TLR-2 (Arg753Gln) polymorphism, a significant risk of FGTB was observed with the GG homozygous mutant genotype (OR=13, CI=0.71-237.7, p=0.05) and the AG heterozygous mutant genotype (OR=13.7, CI=0.76-248.06, p=0.03); however, the G allele individually (OR=1.09, CI=0.78-1.52, p=0.67) was not associated with FGTB. For the TLR-2 (Arg677Trp) polymorphism, a significant risk of FGTB was observed with the TT homozygous mutant genotype (OR=0.020, CI=0.001-0.341, p < 0.001), the CT heterozygous mutant genotype (OR=0.53, CI=0.33-0.86, p=0.014) and the T allele (OR=0.463, CI=0.32-0.66, p < 0.001). The TT mutant genotype was found only in FGTB cases, and the frequency of CT heterozygotes was higher in the control group than in the FGTB group, so the CT genotype acted as a protective mutation against FGTB susceptibility. In the haplotype analysis of the TLR-2 genetic variants, four possible combinations, i.e., G-T, A-C, G-C, and A-T, were obtained. The frequency of the A-C haplotype was significantly higher in FGTB cases (0.32); the control group did not show the A-C haplotype, which was found only in FGTB cases. Conclusion: In conclusion, the study showed a significant association of both TLR-2 genetic variants with FGTB disease. Moreover, the presence of specific associated genotypes/alleles suggests the possibility of assessing disease severity, supports a clinical approach aimed at preventing extensive damage by the disease, and may be helpful for early detection of the disease.
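
The genotype counts behind each odds ratio are not fully reported in the abstract; the sketch below shows the standard odds-ratio-with-95%-CI computation used in such case-control comparisons (the log-odds-ratio method), applied to a hypothetical 2x2 genotype table. The counts are invented for illustration only.

```python
# Odds ratio with a 95% CI via the standard log-OR method; the 2x2 counts
# below are hypothetical, not the study's data.
import math

# "exposed" = carriers of the mutant genotype
a, b = 25, 125   # cases:    carriers, non-carriers (invented)
c, d = 10, 140   # controls: carriers, non-carriers (invented)

or_ = (a * d) / (b * c)                          # odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of ln(OR)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)  # lower 95% bound
hi = math.exp(math.log(or_) + 1.96 * se_log_or)  # upper 95% bound

print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

A CI that excludes 1 corresponds to a nominally significant association, which is how the genotype results above should be read.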

Keywords: ARMS, EDTA, FGTB, TLR

Procedia PDF Downloads 306
117 Gamification of eHealth Business Cases to Enhance Rich Learning Experience

Authors: Kari Björn

Abstract:

The introduction of games has expanded the application area of computer-aided learning tools to a wide variety of age groups of learners. Serious games engage the learners in a real-world type of simulation and potentially enrich the learning experience. The institutional background of a Bachelor's-level engineering program in Information and Communication Technology is introduced, with a detailed focus on one of its majors, Health Technology. As part of a Customer Oriented Software Application thematic semester, one particular course, 'eHealth Business and Solutions', is described and reflected upon in a gamified framework. Learning a consistent view of the vast literature on business management, strategies, marketing and finance in a very limited time enforces a selection of topics relevant to the industry. Health Technology is a novel and growing industry with a growing sector in consumer wearable devices and homecare applications. The business sector is attracting new entrepreneurs and impatient investor funds. From an engineering education point of view, the sector is driven by miniaturizing electronics, sensors and wireless applications. However, the market is highly consumer-driven, and usability, safety and data integrity requirements are extremely high. When the same technology is used in the analysis or treatment of patients, very strict regulatory measures are enforced. The paper introduces a course structure using gamification as a tool to learn what is most essential in a new market: customer value proposition design, followed by a market entry game. Students analyze the existing market size and pricing structure of the eHealth web-service market and enter the market as the steering group of their company, competing against the legacy players and against each other. The market is growing but has its rules of demand and supply balance. New products can be developed with an R&D investment and targeted to the market with unique quality and price combinations. The product cost structure can be improved by investing in enhanced production capacity. Investments can optionally be funded by foreign capital. Students make management decisions and face the dynamics of market competition in the form of an income statement and balance sheet after each decision cycle. The focus of the learning outcome is to understand customer value creation as the source of cash flow. The benefit of gamification is to enrich the learning experience regarding the structure and meaning of financial statements. The paper describes the gamification approach and discusses outcomes after two course implementations. Alongside the case description of learning challenges, some unexpected misconceptions are noted. Improvements to the game and to the semi-gamified teaching pedagogy are discussed. The case description serves as additional support for a new game coordinator, as well as helping to improve the method. Overall, the gamified approach has helped to engage engineering students in business studies in an energizing way.

Keywords: engineering education, integrated curriculum, learning experience, learning outcomes

Procedia PDF Downloads 240
116 A Review on Cyberchondria Based on Bibliometric Analysis

Authors: Xiaoqing Peng, Aijing Luo, Yang Chen

Abstract:

Background: Cyberchondria, an "emerging risk" of the information era, is an abnormal pattern characterized by excessive or repeated online searches for health-related information and escalating health anxiety, which endangers people's physical and mental health and poses a serious threat to public health. Objective: To explore and discuss the research status, hotspots and trends of Cyberchondria. Methods: Based on a total of 77 articles on "Cyberchondria" extracted from the Web of Science from the database's inception through October 2019, literature trends, countries, institutions and hotspots were analyzed bibliometrically; the concept definition of Cyberchondria, measurement instruments, relevant factors, and treatment and intervention are discussed as well. Results: Since "Cyberchondria" was first put forward in 2001, the last two decades have witnessed a noticeable increase in the literature: output during 2014-2019 (62 articles) was roughly four times that of all preceding years combined (15 articles), showing that Cyberchondria has become a new theme and hot topic in recent years. The United States was the most active contributor with the most publications (23), followed by England (11) and Australia (11), while the leading institutions were Baylor University (7) and the University of Sydney (7), followed by Florida State University (4) and the University of Manchester (4). The WoS categories "Psychiatry/Psychology" and "Computer/Information Science" were the areas of greatest influence. The concept definition of Cyberchondria is not completely unified internationally, but it is generally considered an abnormal behavioral pattern and emotional state, invoked to refer to the anxiety-amplifying effects of online health-related searches. The first and most frequently cited scale for measuring the severity of Cyberchondria, the Cyberchondria Severity Scale (CSS), was developed in 2014; it conceptualized Cyberchondria as a multidimensional construct consisting of compulsion, distress, excessiveness, reassurance, and mistrust of medical professionals, the last of which was later shown not to be necessary to the construct. Since then, Brazilian, German, Turkish, Polish and Chinese versions have been developed, improved and culturally adjusted, and the CSS was reduced to a simplified version (CSS-12) in 2019, all of which warrant further validation. The hotspots of Cyberchondria research mainly focus on relevant factors such as intolerance of uncertainty, anxiety sensitivity, obsessive-compulsive disorder, internet addiction, abnormal illness behavior, the Whiteley index, and problematic internet use, aiming to clarify the roles played by "associated factors" and "anxiety-amplifying factors" in the development of Cyberchondria and to better understand the aetiological links and pathways between hypochondriasis, health anxiety and online health-related searches. Although the treatment and intervention of Cyberchondria are still at an early, exploratory stage, there are various meaningful attempts to find effective strategies from different angles, such as online psychological treatment, network technology management, health information literacy improvement and public health services. Conclusion: Research on Cyberchondria is in its infancy and deserves more attention. A conceptual consensus on Cyberchondria, a refined assessment tool, prospective studies in various populations, and targeted treatments will be the main research directions in the near future.
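
For readers who want to reproduce the basic counts reported above, the following is a minimal sketch, assuming a hypothetical CSV export of the 77 Web of Science records with hypothetical column names; it is not the authors' actual pipeline.

```python
# A minimal sketch of the bibliometric tallies described in the abstract,
# assuming a hypothetical export file and column names.
import pandas as pd

records = pd.read_csv("cyberchondria_wos_export.csv")  # hypothetical file

# Publication trend: compare output before 2014 with 2014-2019.
per_year = records.groupby("publication_year").size()
print("before 2014:", per_year[per_year.index < 2014].sum())
print("2014-2019:", per_year[(per_year.index >= 2014) & (per_year.index <= 2019)].sum())

# Most active countries and institutions (assuming one affiliation per row).
print(records["country"].value_counts().head(3))
print(records["institution"].value_counts().head(4))
```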

Keywords: cyberchondria, hypochondriasis, health anxiety, online health-related searches

Procedia PDF Downloads 122
115 Sickle Cell Disease: Review of Managements in Pregnancy and the Outcome in Ampang Hospital, Selangor

Authors: Z. Nurzaireena, K. Azalea, T. Azirawaty, S. Jameela, G. Muralitharan

Abstract:

The aim of this study is to review the management of sickle cell disease in pregnancy and the maternal and neonatal outcomes at Ampang Hospital, Selangor. The study reviewed pregnant patients with sickle cell disease under follow-up at the Hematology Clinic, Ampang Hospital over the last seven years to assess their management and maternal-fetal outcomes. Ampang Hospital is considered the public hematology centre for sickle cell disease and successfully managed three such pregnancies over the last seven years. The patients' presentations, management and maternal-fetal outcomes were compared and reviewed for academic improvement. All three patients were seen very early in their pregnancies and were given a regime of folic acid, antibiotics and thrombo-prophylactic drugs. Close monitoring of maternal and fetal well-being was carried out by the hematologists and obstetricians. Among the patients there were multiple admissions during pregnancy, for painful sickle cell bone crises, haemolysis following infection, and anemia requiring phenotype-matched blood and exchange transfusions. Broad-spectrum antibiotic coverage during infection, hydration, pain management and venous thromboembolism prophylaxis were mandatory. The pregnancies reached near term in the third trimester, but all required emergency caesarean section for obstetric indications. All pregnancies resulted in live births with good fetal outcomes. Postpartum, all patients were nursed closely in high dependency units to watch for further complications and were discharged well. Postpartum follow-up and contraception counselling were comprehensively provided for future pregnancies. Sickle cell disease is uncommon in the East, especially in the South East Asian region, yet more cases are being seen in the current decade owing to improved medical expertise and advanced medical laboratory technologies. Pregnancy itself is a risk factor for sickle cell patients, as increased thrombotic events and risk of infection can lead to multiple crises, haemolysis, anemia and vaso-occlusive complications including eclampsia, cerebrovascular accidents and acute bone pain. Patients mostly require multiple blood product transfusions, so phenotype-matched blood is required to reduce the risk of alloimmunization. Emphasizing the risks and complications in preconception counselling and establishing a definitive pregnancy plan would probably reduce the risk of morbidity and mortality for the mother and unborn child. Early management of infection risk and thromboembolic events, together with adequate hydration, is mandatory. A holistic approach involving multidisciplinary team care between the hematologist, obstetricians, anesthetist and neonatologist, with close nursing care for both mother and baby, would ensure the best outcome. In conclusion, sickle cell disease is by itself a high-risk medical condition, and pregnancy further amplifies the risk. Thus, close monitoring with combined multidisciplinary care, counselling and patient education is crucial to achieving a safe outcome.

Keywords: anaemia, haemoglobinopathies, pregnancy, sickle cell disease

Procedia PDF Downloads 258
114 The Impact of an Improved Strategic Partnership Programme on Organisational Performance and Growth of Firms in the Internet Protocol Television and Hybrid Fibre-Coaxial Broadband Industry

Authors: Collen T. Masilo, Brane Semolic, Pieter Steyn

Abstract:

The Internet Protocol Television (IPTV) and Hybrid Fibre-Coaxial (HFC) Broadband industrial sector landscape is rapidly changing, and organisations within the industry need to stay competitive by exploring new business models so that they can offer new services and products to customers. The business challenge in this industrial sector is meeting or exceeding high customer expectations across multiple content delivery modes. The increasing challenges in the IPTV and HFC broadband industrial sector encourage service providers to form strategic partnerships with key suppliers, marketing partners, advertisers, and technology partners. The need to form enterprise collaborative networks poses a challenge for any organisation in this sector: selecting the right strategic partners who will ensure that the organisation's services and products are marketed in new markets, that customers are efficiently supported by meeting and exceeding their expectations, and that cooperation partners represent the organisation positively and contribute to improving its performance. Companies in the IPTV and HFC broadband industrial sector tend to form informal partnerships with suppliers, vendors, system integrators and technology partners. Generally, partnerships are formed without thorough analysis of the real reason a company is forming the collaboration, without proper evaluation of prospective partners against specific selection criteria, and with ineffective performance monitoring of partners to ensure that a firm gains real long-term benefits from its partners and gains competitive advantage. Similar tendencies are illustrated in the research case study, which is based on Skyline Communications, a global leader in end-to-end, multi-vendor network management and operational support systems (OSS) solutions. The organisation's flagship product is the DataMiner network management platform, used by many operators across multiple industries; it can be described as a smart system that intelligently manages complex technology ecosystems for its customers in the IPTV and HFC broadband industry. The approach of the research is to develop the most efficient business model that can be deployed to improve a strategic partnership programme in order to significantly improve the performance and growth of organisations participating in a collaborative network in the IPTV and HFC broadband industrial sector. This involves proposing and implementing a new strategic partnership model and its main features within the industry, which should bring significant benefits for all involved companies in achieving value-add and an optimal growth strategy. The proposed business model has been developed based on research into existing relationships, value chains and business requirements in this industrial sector and validated at Skyline Communications. The outputs of the business model have been demonstrated and evaluated in the research business case study of the IPTV and HFC broadband service provider Skyline Communications.

Keywords: growth, partnership, selection criteria, value chain

Procedia PDF Downloads 133
113 Operational Characteristics of the Road Surface Improvement

Authors: Iuri Salukvadze

Abstract:

Construction has played an important role in the history of mankind; there is hardly a product in our lives in which the builder's work is not materialized, since creating it requires setting up factories, roads, bridges, etc. The role of the Republic of Georgia as part of the Europe-Asia transport corridor has significantly increased. Within this transit function, a large share of cargo traffic belongs to motor transport; hence improving motor road transport infrastructure is very important and raises new, higher operational demands for existing as well as new motor roads. Construction of a durable road surface involves rather large capital outlays, but thanks to high transport-operational properties, such as higher speeds, lower fuel consumption and less tire wear, the expenses are recouped rapidly where traffic intensity is high, and income correspondingly increases. Where traffic intensity is relatively low, it is recommended to use lightened road carpet structures so that capital investment does not exceed the normative level. The road carpet is divided into the following basic types: asphaltic concrete and cement concrete. Asphaltic concrete is the most advanced type of road carpet. It is laid in two or three layers on a rigid foundation and compacted. Asphaltic concrete is an artificial building material whose layers are made of carefully selected and proportioned stone skeleton and sand, bound together by bitumen and mineral powder; a less strictly selected similar material is called a bitumen-mineral mixture. Asphaltic concrete is a non-rigid building material that withstands vertical loads well but is less resistant to horizontal forces. Cement concrete is a monolithic and durable material that withstands horizontal loads well but is less resistant to vertical loads. Cement concrete consists of strictly selected and proportioned stone material and sand, with cement as the binder. A cement concrete road carpet consists of separate slabs ranging in size from 3-5 m up to 6-8 m. The slabs are reinforced by a rather complex system. Between the slabs, joints are arranged to avoid the additional stresses that temperature fluctuations would otherwise induce along the length of the slabs. To make the separate slabs act together, they are connected by metal rods; the rods accommodate changes in slab length and distribute vertical forces and bending moments across the slabs. The foundation layers must be extremely durable, which requires high-quality stone material, cement, and metal. This qualification work aims to improve traffic conditions on motor roads by prolonging the service life of road surfaces and improving their operational characteristics. The work consists of three chapters, 80 pages, 5 tables and 5 figures. It presents general concepts as well as tests carried out by various companies using modern methods, together with their results. Chapter III presents our own tests on this issue and specific examples of improving the operational characteristics.

Keywords: asphalt, cement, cylindrical sample of asphalt, building

Procedia PDF Downloads 223
112 The Role of Temples Redevelopment for Informal Sector Business Development in India

Authors: Prashant Gupta

Abstract:

Throughout India, temples have for centuries served not only as places of worship but also as cultural centers, commerce hubs, art galleries, educational institutions, and social centers. Across the country, there are over two million temples, which are crucial economic hubs attracting devotees and tourists worldwide; India has about 53 temples per 100,000 people. As per the NSSO survey, the temple economy is worth about $40 billion, or 2.32 per cent of GDP, based on a survey of major temples that covers only the formal sector; it could be much larger, as a full estimation has not yet been done. The informal sector represents 43.1% of India's total economy. Over 10 billion domestic tourist visits are made to new destinations every year within India, and 20 per cent of the 90 million foreign tourists visited the Madurai and Mahabalipuram temples, which became the most visited tourist spots in 2022. Recently, the current central government has started revitalizing ancient Indian civilization by reconstructing and beautifying the major temples of India, i.e., the Kashi Vishwanath Corridor, Mahakaleshwara Temple, Kedarnath, Ayodhya, etc. The researcher chose Kashi as a case study because it is known as the Spiritual Capital of India and is the abode for the spread of Hinduism, Buddhism, Jainism and Sikhism, which are core Sanatan Dharmic practices. INR 17,800 million has been spent since 2019 to redevelop the Kashi Vishwanath Corridor. RESEARCH OBJECTIVES 1. To assess the historical contribution of temples to socio-economic development and the revival of Indic civilization. 2. To examine the role of temple redevelopment for informal sector businesses. 3. To identify the sub-sectors of informal sector businesses. 4. To identify the products and services of informal businesses for investigation of marketing strategies and business development. PROPOSED METHODS AND PROCEDURES This study will follow a mixed approach, employing both qualitative and quantitative methods of research. To conduct the study, data will be collected from 500 informal business owners through structured questionnaire and interview instruments. The informal business owners will be selected using a systematic random sampling technique, as sketched below. In addition, documents from government offices covering the last 10 years of tax collection will be reviewed to substantiate the study. Descriptive and econometric analysis techniques will be employed. EXPECTED CONTRIBUTION OF THE PROPOSED STUDY By studying the contribution of temple redevelopment to informal business creation and growth, the study will benefit both informal business owners and the government. For the government, it will provide scientific and empirical evidence on the contribution of temple redevelopment to informal business creation and growth, supporting evidence-based infrastructural development and improved tax collection. For informal businesses, it will give a detailed insight into the nature of their business, its future growth potential, and the alternative products and services they might supply to their customers, helping identify the key products and services that are most profitable and have the potential to multiply and grow through sound product marketing strategies and business development.
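
The systematic random sampling mentioned in the methods can be illustrated with a short sketch; the sampling frame file and its contents are hypothetical, and a real frame would come from local business registries.

```python
# A minimal sketch of systematic random sampling: a random start within the
# first interval, then every k-th entry. Assumes the frame is much larger
# than the target sample of 500.
import random

with open("informal_business_frame.txt") as f:   # hypothetical frame file
    frame = [line.strip() for line in f if line.strip()]

n = 500                       # target sample size from the abstract
k = len(frame) // n           # sampling interval
start = random.randrange(k)   # random start within the first interval
sample = frame[start::k][:n]  # every k-th owner after the random start
```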

Keywords: business development, informal sector businesses, services and products marketing, temple economics

Procedia PDF Downloads 80
111 Electrochemical Activity of NiCo-GDC Cermet Anode for Solid Oxide Fuel Cells Operated in Methane

Authors: Kamolvara Sirisuksakulchai, Soamwadee Chaianansutcharit, Kazunori Sato

Abstract:

Solid Oxide Fuel Cells (SOFCs) are considered among the most efficient large-unit power generators for household and industrial applications. The efficiency of an electrochemical cell depends mainly on the electrochemical reactions at the anode, and anode materials have been intensely studied to achieve higher kinetic rates of redox reactions and lower internal resistance. Recent studies have introduced efficient cermet (ceramic-metallic) materials for their ability in fuel oxidation and oxide conduction, which expands the reactive site known as the triple-phase boundary (TPB) and thus increases overall performance. In this study, a bimetallic catalyst Ni₀.₇₅Co₀.₂₅Oₓ was combined with Gd₀.₁Ce₀.₉O₁.₉₅ (GDC) to be used as a cermet anode (NiCo-GDC) for an anode-supported SOFC. The synthesis of Ni₀.₇₅Co₀.₂₅Oₓ was carried out by ball milling NiO and Co₃O₄ powders in ethanol and calcining at 1000 °C. The Gd₀.₁Ce₀.₉O₁.₉₅ was prepared by a urea co-precipitation method: precursors of Gd(NO₃)₃·6H₂O and Ce(NO₃)₃·6H₂O were dissolved in distilled water with the addition of urea and subsequently heated; the heated mixture was filtered, rinsed thoroughly, dried, and calcined at 800 °C and 1500 °C, respectively. The two powders were combined, followed by pelletization and sintering at 1100 °C, to form an anode support layer; an electrolyte layer and a cathode layer were then fabricated. The electrochemical performance was measured from 800 °C to 600 °C in H₂ and from 750 °C to 600 °C in CH₄. The maximum power density at 750 °C in H₂ was 13% higher than in CH₄; the difference was due to higher polarization resistance, as confirmed by the impedance spectra. According to the standard enthalpies, the dissociation energy of the C-H bonds in CH₄ is slightly higher than that of the H-H bond in H₂, so the dissociation of CH₄ could be the cause of the additional resistance within the anode material. Results at lower temperatures showed a descending trend of power density corresponding to the increased polarization resistance, due to lower conductivity as the temperature decreases. Long-term stability was measured at 750 °C in CH₄, monitored at 12-hour intervals. The maximum power density tended to increase gradually with time while the resistances were maintained, suggesting enhanced stability from charge transfer in doped ceria due to the Ce⁴⁺ ↔ Ce³⁺ transition at low oxygen partial pressure in a high-temperature atmosphere. However, the power density started to drop after 60 h, and the cell potential also dropped from 0.3249 V to 0.2850 V; these phenomena were confirmed by shifted impedance spectra indicating a higher ohmic resistance. Observation by FESEM and EDX mapping suggests degradation due to mass transport of ions in the electrolyte, while the anode microstructure was still maintained. In summary, electrochemical and stability testing for 60 h was achieved with the NiCo-GDC cermet anode. Coke deposition was not detected after operation in CH₄, confirming the superior properties of the bimetallic cermet anode over typical Ni-GDC.
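
For context on how maximum power density figures such as those quoted above are obtained, the following is a minimal sketch using invented I-V values, not the authors' measurements: power density is the product of cell voltage and current density at each point of the sweep, and the peak is reported.

```python
# A minimal sketch of extracting maximum power density from an I-V sweep.
# All numbers are illustrative placeholders.
current_density = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]        # A/cm^2, hypothetical
voltage         = [1.05, 0.95, 0.85, 0.74, 0.62, 0.48]  # V, hypothetical

power_density = [v * j for v, j in zip(voltage, current_density)]  # W/cm^2
p_max = max(power_density)
print(f"max power density: {p_max:.3f} W/cm^2")
```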

Keywords: bimetallic catalyst, ceria-based SOFCs, methane oxidation, solid oxide fuel cell

Procedia PDF Downloads 154
110 Implementation of Cord-Blood Derived Stem Cells in the Regeneration of Two Experimental Models: Carbon Tetrachloride and S. Mansoni Induced Liver Fibrosis

Authors: Manal M. Kame, Zeinab A. Demerdash, Hanan G. El-Baz, Salwa M. Hassan, Faten M. Salah, Wafaa Mansour, Olfat Hammam

Abstract:

Cord blood (CB) derived Unrestricted Somatic Stem Cells (USSCs), with their multipotentiality, hold great promise in liver regeneration. This work aims to evaluate the therapeutic potential of USSCs in two experimental models of chronic liver injury, induced either by S. mansoni infection in balb/c mice or by CCl4 injection in hamsters. Isolation, propagation, and characterization of USSCs from CB samples were performed, and USSCs were induced to differentiate into osteoblasts, adipocytes and hepatocyte-like cells. Cells of the third passage were transplanted in two models of liver fibrosis. (1) CCl4 hamster model: twenty hamsters were induced to liver fibrosis by repeated i.p. injection of 100 μl CCl4 per hamster for 8 weeks. This model was designed as follows: 10 hamsters with liver fibrosis treated with i.h. injection of 3×10⁶ USSCs (USSCs transplanted group), 10 hamsters with liver fibrosis (pathological control group), and 10 hamsters with healthy livers (normal control group). (2) Chronic murine S. mansoni model: twenty mice were induced to liver fibrosis with S. mansoni cercariae (60 cercariae/mouse) using the tail immersion method and left for 12 weeks. In this model, 10 mice with liver fibrosis were transplanted with an i.v. injection of 1×10⁶ USSCs (USSCs transplanted group); the other two groups were designed as in the hamster model. Animals were sacrificed 12 weeks after USSCs transplantation, and their liver sections were examined for human hepatocyte-like cells by immunohistochemistry staining. Moreover, liver sections were examined for fibrosis level, and fibrotic indices were calculated. Sera of the sacrificed animals were tested for liver functions. CB USSCs, with fibroblast-like morphology, expressed high levels of CD44, CD90, CD73 and CD105 and were negative for CD34, CD45, and HLA-DR. USSCs showed high expression of transcripts for Oct4 and Sox2 and were differentiated in vitro into osteoblasts and adipocytes. In both animal models, in vitro induced hepatocyte-like cells were confirmed by cytoplasmic expression of glycogen, alpha-fetoprotein, and cytokeratin 18. Livers of the USSCs transplanted groups showed engraftment with human hepatocyte-like cells, as proved by cytoplasmic expression of human alpha-fetoprotein, cytokeratin 18, and OV6, and showed less fibrosis than the pathological control groups. Liver functions, in the form of serum AST and ALT levels and serum total bilirubin level, were significantly lower in the USSCs transplanted groups than in the pathological control groups (p < 0.001). Moreover, the fibrotic index was significantly lower (p < 0.001) in the USSCs transplanted groups than in the pathological control groups. In addition, liver sections of mice given the i.v. injection of 1×10⁶ USSCs, stained with either H&E or Sirius red, showed diminished granuloma size and a relative decrease in hepatic fibrosis. Our experimental liver fibrosis models transplanted with CB-USSCs showed liver engraftment with human hepatocyte-like cells as well as signs of liver regeneration in the form of improvement in liver function assays and fibrosis level. These data introduce human CB-derived USSCs as multipotent stem cells with great potential in regenerative medicine and strengthen the concept of cellular therapy for the treatment of liver fibrosis.

Keywords: cord blood, liver fibrosis, stem cells, transplantation

Procedia PDF Downloads 309
109 Integrating Data Mining within a Strategic Knowledge Management Framework: A Platform for Sustainable Competitive Advantage within the Australian Minerals and Metals Mining Sector

Authors: Sanaz Moayer, Fang Huang, Scott Gardner

Abstract:

In the highly leveraged business world of today, an organisation's success depends on how it manages and organises its traditional and intangible assets. In the knowledge-based economy, knowledge as a valuable asset gives enduring capability to firms competing in rapidly shifting global markets. It can be argued that the ability to create unique knowledge assets by configuring ICT and human capabilities will be a defining factor for international competitive advantage in the mid-21st century. The concept of KM is recognized in the strategy literature, and increasingly by senior decision-makers (particularly in large firms, which can achieve scalable benefits), as an important vehicle for stimulating innovation and organisational performance in the knowledge economy. This thinking has been evident in professional services and other knowledge-intensive industries for over a decade. It highlights the importance of social capital and the value of the intellectual capital embedded in social and professional networks, complementing the traditional focus on the creation of intellectual property assets. Despite the growing interest in KM within professional services, there has been limited discussion in relation to multinational resource-based industries such as mining and petroleum, where the focus has been principally on global portfolio optimization with economies of scale, process efficiencies and cost reduction. The Australian minerals and metals mining industry, although traditionally viewed as capital intensive, employs a significant number of knowledge workers, notably engineers, geologists, highly skilled technicians, and legal, finance, accounting, ICT and contracts specialists working in projects or functions, representing potential knowledge silos within the organisation. This silo effect arguably inhibits knowledge sharing and retention by disaggregating corporate memory, with increased operational and project continuity risk; it may also limit the potential for process, product, and service innovation. In this paper, the strategic application of knowledge management, incorporating contemporary ICT platforms and data mining practices, is explored as an important enabler for knowledge discovery, reduction of risk, and retention of corporate knowledge in resource-based industries. With reference to the relevant strategy, management, and information systems literature, this paper highlights possible connections (currently undergoing empirical testing) between a Strategic Knowledge Management (SKM) framework incorporating supportive Data Mining (DM) practices and competitive advantage for multinational firms operating within the Australian resource sector. We also propose, based on a review of the relevant literature, that more effective management of soft and hard systems knowledge is crucial for major Australian firms in all sectors seeking to improve organisational performance through the human and technological capability captured in organisational networks.

Keywords: competitive advantage, data mining, mining organisation, strategic knowledge management

Procedia PDF Downloads 415
108 A Proposed Treatment Protocol for the Management of Pars Interarticularis Pathology in Children and Adolescents

Authors: Paul Licina, Emma M. Johnston, David Lisle, Mark Young, Chris Brady

Abstract:

Background: Lumbar pars pathology is a common cause of pain in the growing spine. It can be seen in young athletes participating in at-risk sports and can affect sporting performance and long-term health due to its resistance to traditional management. There is currently a lack of consensus on the classification and treatment of pars injuries. Previous systems used CT to stage pars defects but could not assess early stress reactions. A modified classification is proposed that incorporates MRI findings, significantly improving early treatment guidance. The treatment protocol is designed for patients aged 5 to 19 years. Method: Clinical screening identifies patients with a low, medium, or high index of suspicion for lumbar pars injury using patient age, sport participation and pain characteristics. MRI of the at-risk cohort enables augmentation of the existing CT-based classification while avoiding ionising radiation. Patients are classified into five categories based on MRI findings. A type 0 lesion (stress reaction) is present when CT is normal and MRI shows high signal change (HSC) in the pars/pedicle on T2 images. A type 1 lesion represents the 'early defect' CT classification. The group previously referred to as a 'progressive stage' defect on CT can be split into 2A and 2B categories: 2A lesions have HSC on MRI, whereas 2B lesions do not. This distinction is important with regard to healing potential. Type 3 lesions are terminal-stage defects on CT, characterised by pseudarthrosis; MRI shows no HSC. Results: Stress reactions (type 0) and acute fractures (types 1 and 2A) can heal and are treated in a custom-made hard brace for 12 weeks, initially worn 23 hours per day. At three weeks, patients commence basic core rehabilitation. At six weeks, in the absence of pain, the brace is removed for sleeping and exercises are progressed to positions of daily living. Patients with continued pain remain braced 23 hours per day without exercise progression until they become symptom-free. At nine weeks, patients commence supervised exercises out of the brace for 30 minutes each day, allowing them to re-learn muscular control without the rigid support of the brace. At 12 weeks, bracing ceases and MRI is repeated. For patients with near or complete resolution of bony oedema and healing of any cortical defect, rehabilitation focuses on strength and conditioning and sport-specific exercise for the full return to activity. The length of this final stage is approximately nine weeks but depends on factors such as development and level of sports participation. If significant HSC remains on MRI, a CT scan is considered to definitively assess cortical defect healing; for these patients, return to high-risk sports is delayed for up to three months. Chronic defects (2B and 3) cannot heal, are not braced, and follow traditional rehabilitation protocols. Conclusion: Appropriate clinical screening and imaging with MRI can identify pars pathology early. In those with potential for healing, we propose hard bracing and appropriate rehabilitation as part of a multidisciplinary management protocol. The validity of this protocol will be tested in future studies.
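
The five-category scheme described in the methods can be summarized as a small decision function. This sketch is purely illustrative, with input labels of our own choosing; it is not a clinical tool.

```python
# A minimal sketch of the proposed MRI-augmented classification as described
# in the abstract: CT stage plus MRI T2 high signal change (HSC) -> category.
def classify_pars_lesion(ct_finding: str, mri_high_signal: bool) -> str:
    if ct_finding == "normal":
        # Stress reaction is visible only on MRI.
        return "type 0" if mri_high_signal else "no lesion"
    if ct_finding == "early defect":
        return "type 1"
    if ct_finding == "progressive defect":
        # The former 'progressive stage' group is split by MRI signal.
        return "type 2A" if mri_high_signal else "type 2B"
    if ct_finding == "terminal defect":
        return "type 3"  # pseudarthrosis; no HSC expected
    raise ValueError(f"unknown CT finding: {ct_finding!r}")

# Types 0, 1 and 2A are considered to have healing potential and are braced;
# 2B and 3 follow traditional rehabilitation without bracing.
```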

Keywords: adolescents, MRI classification, pars interarticularis, treatment protocol

Procedia PDF Downloads 153
107 A Bibliometric Analysis of Ukrainian Research Articles on SARS-COV-2 (COVID-19) in Compliance with the Standards of Current Research Information Systems

Authors: Sabina Auhunas

Abstract:

Open Science is developing rapidly in Ukraine for the benefit of scientists of all branches, providing an opportunity to take a closer look at studies by foreign scientists as well as to deliver their own scientific data to national and international journals. However, when it comes to the generalization of data on the scientific activities of Ukrainian scientists, these data are often integrated into e-systems that draw on inconsistent and barely related information sources. To resolve these issues, developed countries productively use e-systems designed to store and manage research data, such as Current Research Information Systems, which enable the combination of uncompiled data obtained from different sources. An algorithm for selecting SARS-CoV-2 research articles was designed, by means of which we collected the set of papers published by Ukrainian scientists and uploaded by August 1, 2020. The resulting metadata (document type, open access status, citation count, h-index, most cited documents, international research funding, author counts, the bibliographic relationship of journals) were taken from the Scopus and Web of Science databases. The study also considered information from COVID-19/SARS-CoV-2-related documents published from December 2019 to September 2020 by authors with a territorial affiliation to Ukraine. These databases provide the information necessary for bibliometric analysis, including details such as copyright that may not be available in other databases (e.g., Science Direct). Search criteria and results for each online database were defined according to the WHO classification of the virus and the disease caused by it (Table 1). We first identified 89 research papers, which yielded the final data set after consolidation and removal of duplicates, as sketched below; 56 papers were ultimately used for the analysis. The total number of documents in the WoS database came to 21,641 (48 affiliated with Ukraine), while the Scopus database yielded 32,478 documents (41 affiliated with Ukraine). In the publication activity of Ukrainian scientists, the following areas prevailed: Education, educational research (9 documents, 20.58%); Social Sciences, interdisciplinary (6 documents, 11.76%); and Economics (4 documents, 8.82%). The highest publication activity by institution type was reported for the Ministry of Education and Science of Ukraine (36% of published scientific papers, or 7 documents), followed by Danylo Halytsky Lviv National Medical University (5 documents, 15%) and the P. L. Shupyk National Medical Academy of Postgraduate Education (4 documents, 12%). Research activities by Ukrainian scientists were funded by five entities: the Belgian Development Cooperation, the National Institutes of Health (NIH, U.S.), the United States Department of Health & Human Services, a grant from the Whitney and Betty MacMillan Center for International and Area Studies at Yale, and a grant from the Yale Women Faculty Forum. Based on the results of the analysis, we obtained a set of published articles and preprints to be assessed on a variety of features in upcoming studies, including citation count, most cited documents, the bibliographic relationship of journals, and reference linking. Further research on the development of the national scientific e-database continues using new analytical methods.
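
The consolidation and deduplication step referred to above can be sketched as follows, with hypothetical file and column names standing in for the actual Scopus and Web of Science exports.

```python
# A minimal sketch of merging two database exports and removing duplicate
# records before bibliometric analysis. File and column names are hypothetical.
import pandas as pd

scopus = pd.read_csv("scopus_ukraine_covid.csv")  # hypothetical export
wos = pd.read_csv("wos_ukraine_covid.csv")        # hypothetical export

merged = pd.concat([scopus, wos], ignore_index=True)

# Deduplicate on a normalized title; a real pipeline would prefer DOIs
# where present and fall back to fuzzy title matching.
merged["title_key"] = merged["title"].str.lower().str.strip()
deduped = merged.drop_duplicates(subset="title_key").drop(columns="title_key")
print(len(merged), "records ->", len(deduped), "after deduplication")
```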

Keywords: content analysis, COVID-19, scientometrics, text mining

Procedia PDF Downloads 115
106 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence

Authors: Gus Calderon, Richard McCreight, Tammy Schwartz

Abstract:

Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort has had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5" ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community are divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually, using repeat station imaging of fixed GPS locations, to track changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners' understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch's defensible space maps were combined with Black Swan's patented approach using 39 other risk characteristics into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
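
Spectral vegetation index maps such as those mentioned above are typically built from band ratios. One common example is NDVI, sketched here with invented reflectance values; FireWatch's actual indices and algorithms are proprietary and may differ.

```python
# A minimal sketch of the Normalized Difference Vegetation Index (NDVI),
# one widely used spectral vegetation index; band arrays are illustrative.
import numpy as np

red = np.array([[0.10, 0.30], [0.25, 0.05]])  # hypothetical red reflectance
nir = np.array([[0.60, 0.35], [0.50, 0.45]])  # hypothetical near-infrared

ndvi = (nir - red) / (nir + red + 1e-9)  # small epsilon avoids divide-by-zero

# Higher NDVI indicates denser, healthier vegetation -- i.e., more fuel
# near structures when it falls inside a defensible-space buffer.
print(ndvi)
```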

Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk

Procedia PDF Downloads 108
105 A Magnetic Hydrochar Nanocomposite as a Potential Adsorbent of Emerging Pollutants

Authors: Aura Alejandra Burbano Patino, Mariela Agotegaray, Veronica Lassalle, Fernanda Horst

Abstract:

Water pollution is of worldwide concern because water is an essential resource for life. Industrial and urban growth are anthropogenic activities that have caused an increase in undesirable compounds in water. In the last decade, emerging pollutants have become of great interest since, at very low concentrations (µg/L and ng/L), they exhibit hazardous effects on wildlife, aquatic ecosystems, and human organisms. One group of emerging pollutants under study is pharmaceuticals: their high consumption rate and inappropriate disposal have led to their detection in wastewater treatment plant influent, effluent, surface water, and drinking water. Consequently, numerous technologies have been developed to treat these pollutants efficiently. Adsorption appears to be an easy and cost-effective technology, and carbon-based materials such as hydrochars are among the most widely used adsorbents for emerging pollutant removal. This study employs a magnetic hydrochar nanocomposite as an adsorbent for diclofenac (DCF) removal; kinetic models and the adsorption efficiency in real water samples were analyzed. For this purpose, a magnetic hydrochar nanocomposite was synthesized through the hydrothermal carbonization (HTC) technique hybridized with co-precipitation to add the magnetic component, based on iron oxide nanoparticles, into the hydrochar. The hydrochar was obtained from sunflower husk residue as the precursor. TEM, TGA, FTIR, zeta potential as a function of pH, DLS, the BET technique, and elemental analysis were employed to characterize the material in terms of composition and chemical structure. Adsorption kinetics were carried out in distilled water and real water at room temperature, at pH 5.5 for distilled water and natural pH for real water samples, with a 1:1 adsorbent:adsorbate dosage ratio, contact times from 10 to 120 minutes, and a 50% dosage concentration of DCF. Results demonstrated that the magnetic hydrochar presents superparamagnetic properties with a saturation magnetization value of 55.28 emu/g. It is mesoporous, with a surface area of 55.52 m²/g, and is composed of magnetite nanoparticles incorporated into the hydrochar matrix, as proven by TEM micrographs, FTIR spectra, and zeta potential. Kinetic studies with DCF found percent removal efficiencies of up to 85.34% after 80 minutes of contact time; the sketch below shows the underlying calculation. After 120 minutes of contact time, desorption of the pollutant from active sites took place, indicating that the material became saturated after that time. In real water samples, percent removal efficiencies decreased to 57.39%, ascribable to a possible mechanism of competitive adsorption of organic or inorganic compounds and ions for the active sites of the magnetic hydrochar. The main suggested adsorption mechanisms between the magnetic hydrochar and diclofenac include hydrophobic and electrostatic interactions as well as hydrogen bonds. It can be concluded that the magnetic hydrochar nanocomposite, a valorized by-product, appears to be an efficient adsorbent for the removal of DCF as a model emerging pollutant. These results are being complemented by varying experimental parameters such as the pollutant's initial concentration, the adsorbent:adsorbate dosage ratio, and temperature. Adsorption assays of other emerging pollutants are currently being carried out.
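
As a minimal sketch of the removal-efficiency arithmetic behind the figures above (the concentrations are illustrative, chosen only to reproduce the reported 85.34%):

```python
# Percent removal efficiency: removal % = (C0 - Ct) / C0 * 100,
# where C0 is the initial concentration and Ct the concentration at time t.
def percent_removal(c0_mg_l: float, ct_mg_l: float) -> float:
    return (c0_mg_l - ct_mg_l) / c0_mg_l * 100.0

c0 = 50.0   # hypothetical initial DCF concentration, mg/L
ct = 7.33   # hypothetical concentration after 80 min, mg/L
print(f"{percent_removal(c0, ct):.2f}% removed")  # 85.34%, as reported
```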

Keywords: environmental remediation, emerging pollutants, hydrochar, magnetite nanoparticles

Procedia PDF Downloads 189
104 The Development of Congeneric Elicited Writing Tasks to Capture Language Decline in Alzheimer Patients

Authors: Lise Paesen, Marielle Leijten

Abstract:

People diagnosed with probable Alzheimer's disease suffer a gradual impairment of their language capacities that affects both their spoken and written communication. Our study aims at characterising the language decline in DAT patients with the use of congeneric elicited writing tasks, in which a descriptive text has to be written based upon images presented to the participants. A randomised set of images allows us to present the participants with a different task on every encounter, thus avoiding a recognition effect in this iterative study. This method is a revision of previous studies, in which participants were presented with a single larger picture depicting an entire scene. To create the randomised set of images, existing pictures were adapted following strict criteria (e.g., frequency, AoA, colour). The resulting data set contained 50 images belonging to several categories (vehicles, animals, humans, and objects). A pre-test was constructed to validate the created picture set; since most images had been used before in spoken picture naming tasks, the same reaction times ought to be triggered in the typed picture naming task. Once validated, the effectiveness of the descriptive tasks was assessed. First, the participants (n=60 students, n=40 healthy elderly) performed a typing task, which provided information about the typing speed of each individual. Secondly, two descriptive writing tasks were carried out, one simple and one complex. The simple task contains 4 images (1 animal, 2 objects, 1 vehicle), restricted to elements with high frequency, a young AoA (<6 years), and fast reaction times. Slow reaction times, a later AoA (≥6 years) and low frequency were criteria for the complex task, which uses 6 images (2 animals, 1 human, 2 objects and 1 vehicle). The data were collected with the keystroke logging programme Inputlog; keystroke logging tools log and time-stamp keystroke activity to reconstruct and describe text production processes. The data were analysed using a selection of writing process and product variables, such as general writing process measures, detailed pause analysis, linguistic analysis, and text length, with the intrapersonal interkey transition times from the typing task as a covariate; a sketch of the basic pause measures follows. The pre-test indicated that the new images lead to similar or even faster reaction times compared to the original images, so all images were used in the main study. The texts produced in the description tasks were significantly longer than in previous studies, providing sufficient text and process data for analyses. Preliminary analysis shows that the number of words produced differed significantly between the healthy elderly and the students, as did the mean length of production bursts, even though both groups needed the same time to produce their texts. However, the elderly took significantly more time to produce the complex task than the simple task, although the number of words per minute remained comparable between simple and complex. The pauses within and before words varied, even when personal typing abilities (obtained from the typing task) were taken into account.
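
The following is a minimal sketch of two of the process measures named above, interkey transition times and pause counts, computed over fabricated timestamps. Inputlog's real output is far richer, and the 2000 ms pause threshold is one common convention rather than a fixed standard.

```python
# A minimal sketch of basic keystroke-log measures on hypothetical data:
# interkey transitions are successive timestamp differences; pauses are
# transitions at or above a chosen threshold.
timestamps_ms = [0, 180, 350, 2600, 2790, 2950, 5200, 5400]  # hypothetical

transitions = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

PAUSE_THRESHOLD_MS = 2000  # a common convention; studies vary
pauses = [t for t in transitions if t >= PAUSE_THRESHOLD_MS]
print("mean interkey time:", sum(transitions) / len(transitions), "ms")
print("pauses over threshold:", len(pauses))
```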

Keywords: Alzheimer's disease, experimental design, language decline, writing process

Procedia PDF Downloads 274
103 A Rare Case of Dissection of Cervical Portion of Internal Carotid Artery, Diagnosed Postpartum

Authors: Bidisha Chatterjee, Sonal Grover, Rekha Gurung

Abstract:

Postpartum dissection of the internal carotid artery is a relatively rare condition and is considered an underlying aetiology in 5% to 25% of strokes under the age of 30 to 45 years. However, 86% of these cases recover completely and 14% have mild focal neurological symptoms; prognosis is generally good with early intervention. The risk quoted for a repeat carotid artery dissection in subsequent pregnancies is less than 2%. A 36-year-old Caucasian primipara presented with tachycardia on postnatal day one after a forceps delivery. In the intrapartum period she had prolonged rupture of membranes and developed intrapartum sepsis, which was treated with antibiotics. A postpartum ECG showed septal and inferior T wave inversion, and her troponin level was 19. An echocardiogram subsequently ruled out postpartum cardiomyopathy. A repeat ECG showed improvement of the previous changes, and in the absence of symptoms no intervention was warranted. On day 4 post-delivery, she developed a droopy right eyelid, pain around the right eye and itching in the right ear. On examination, she had right-sided ptosis and unequal pupils (right miotic pupil). Cranial nerve examination, reflexes, sensory examination and muscle power were normal. Apart from migraine, there was no medical or family history of note. In view of the right-sided Horner's syndrome, she underwent CT angiography and subsequently MRI/MRA and was diagnosed with dissection of the cervical portion of the right internal carotid artery. She was discharged on a course of aspirin 75 mg. By the 6-week postnatal follow-up the patient had recovered significantly, with occasional episodes of unequal pupils and tingling of the right toes that resolved spontaneously. Cervical artery dissection, including vertebral artery dissection and carotid artery dissection, is a rare complication of pregnancy, with an estimated annual incidence of 2.6–3 per 100,000 pregnancy hospitalizations. The aetiology remains unclear, though trauma from straining during labour, underlying arterial disease and preeclampsia have been implicated; the hypercoagulable state of pregnancy and the puerperium could also be an important factor. 60-90% of cases present with severe headache and neck pain, which generally precede neurological symptoms such as ipsilateral Horner's syndrome, retroorbital pain, tinnitus and cranial nerve palsy. Although the condition is rare, delayed diagnosis and management can lead to severe and permanent neurological deficits. Patients in whom there is a strong index of suspicion should undergo MRI or MRA of the head and neck. Antithrombotic and antiplatelet therapy forms the mainstay of treatment, with selected cases needing endovascular stenting. The long-term prognosis is favourable, with either complete resolution or minimal deficit if treatment is prompt. Patients should be counselled about the recurrence risk and the possibility of stroke in a future pregnancy. Carotid artery dissection is rare and treatable but needs early diagnosis and treatment: postpartum headache and neck pain with neurological symptoms should prompt urgent imaging followed by antithrombotic and/or antiplatelet therapy. Most cases resolve completely or with minimal sequelae.

Keywords: postpartum, dissection of internal carotid artery, magnetic resonance angiogram, magnetic resonance imaging, antiplatelet, antithrombotic

Procedia PDF Downloads 97
102 Implementation of Smart Card Automatic Fare Collection Technology in Small Transit Agencies for Standards Development

Authors: Walter E. Allen, Robert D. Murray

Abstract:

Many large transit agencies have adopted RFID technology and electronic automatic fare collection (AFC) or smart card systems, but small and rural agencies remain tied to obsolete manual, cash-based fare collection. Small countries and transit agencies can benefit from the implementation of smart card AFC technology, with the promise of increased passenger convenience, added passenger satisfaction and improved agency efficiency. For transit agencies, it reduces revenue loss and improves passenger flow and bus stop data; for countries, further extension into security, distribution of social services or currency transactions can provide greater benefits. However, small countries and transit agencies cannot afford the expensive proprietary smart card solutions typically offered by the major system suppliers. Deployment of the Contactless Fare Media System (CFMS) Standard eliminates the proprietary solution, ultimately lowering the cost of implementation. Acumen Building Enterprise, Inc. chose the Yuma County Intergovernmental Public Transportation Authority's (YCIPTA) existing proprietary YCAT smart card system to implement CFMS. The revised system enables the purchase of fare product online with prepaid debit or credit cards using the Payment Gateway Processor. Open and interoperable smart card standards for transit have been developed. During the 90-day pilot operation, the transit agency gathered the data from the bus AcuFare 200 Card Reader, loaded (copied) the data onto a USB thumb drive and uploaded the data to the Acumen Host Processing Center for consolidation into the transit agency master data file, as sketched below. The transition from the existing proprietary smart card data format to the new CFMS smart card data format was transparent to the transit agency cardholders. It was proven that open standards and interoperable design can work and reduce both implementation and operational costs for small transit agencies or countries looking to expand smart card technology. Acumen was able to avoid implementing the Payment Card Industry Data Security Standard (PCI DSS), which is expensive to develop and costly to operate on a continuing basis. Due to the substantial additional complexity of implementation and the variety of options presented to the transit agency cardholder, Acumen chose to implement only the Directed Autoload. To improve implementation efficiency and results in a similar undertaking, it should be considered that some passengers lack credit cards and are averse to technology. There are more than 1,300 small and rural agencies in the United States, and this number grows tenfold when considering small countries and rural locations throughout Latin America and the world. Acumen is evaluating additional countries, sites and transit agencies that could benefit from smart card systems. Payment card systems frequently require extensive security procedures for implementation; the project demonstrated the ability to purchase fare value, rides and passes with credit cards on the internet at a reasonable cost without highly complex security requirements.
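
The reader-to-master-file data flow can be sketched as follows; the record fields and file names are hypothetical illustrations, not the actual CFMS data layout.

```python
# A minimal sketch of consolidating per-reader transaction exports into a
# single master file, as in the pilot's USB-upload workflow. Each export is
# assumed to share the same hypothetical columns.
import csv
import glob

master_rows = []
for path in glob.glob("reader_export_*.csv"):   # one file per USB upload
    with open(path, newline="") as f:
        master_rows.extend(csv.DictReader(f))

# Group each cardholder's rides and loads together, in time order.
master_rows.sort(key=lambda r: (r["card_id"], r["timestamp"]))

fieldnames = ["card_id", "timestamp", "fare_product", "amount"]  # hypothetical
with open("agency_master.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(master_rows)
```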

Keywords: automatic fare collection, near field communication, small transit agencies, smart cards

Procedia PDF Downloads 283