Search results for: abstract mathematical concepts
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3881


311 Exploring the Design of Prospective Human Immunodeficiency Virus Type 1 Reverse Transcriptase Inhibitors through a Comprehensive Approach of Quantitative Structure Activity Relationship Study, Molecular Docking, and Molecular Dynamics Simulations

Authors: Mouna Baassi, Mohamed Moussaoui, Sanchaita Rajkhowa, Hatim Soufi, Said Belaaouad

Abstract:

The objective of this paper is to address the challenging task of targeting Human Immunodeficiency Virus type 1 Reverse Transcriptase (HIV-1 RT) in the treatment of AIDS. Reverse Transcriptase inhibitors (RTIs) have limitations due to the development of Reverse Transcriptase mutations that lead to treatment resistance. In this study, a combination of statistical analysis and bioinformatics tools was adopted to develop a mathematical model that relates the structure of compounds to their inhibitory activities against HIV-1 Reverse Transcriptase. Our approach was based on a series of compounds recognized for their HIV-1 RT enzymatic inhibitory activities. These compounds were designed via software, with their descriptors computed using multiple tools. The most statistically promising model was chosen, and its domain of application was ascertained. Furthermore, compounds exhibiting comparable biological activity to existing drugs were identified as potential inhibitors of HIV-1 RT. The compounds underwent evaluation based on their chemical absorption, distribution, metabolism, excretion, toxicity properties, and adherence to Lipinski's rule. Molecular docking techniques were employed to examine the interaction between the Reverse Transcriptase (Wild Type and Mutant Type) and the ligands, including a known drug available in the market. Molecular dynamics simulations were also conducted to assess the stability of the RT-ligand complexes. Our results reveal some of the new compounds as promising candidates for effectively inhibiting HIV-1 Reverse Transcriptase, matching the potency of the established drug. This necessitates further experimental validation. This study, beyond its immediate results, provides a methodological foundation for future endeavors aiming to discover and design new inhibitors targeting HIV-1 Reverse Transcriptase.
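Lipinski's rule of five, mentioned above as a screening criterion, can be sketched as a simple filter over precomputed descriptors. The descriptor values below are hypothetical illustrations, not compounds from the study, which computed descriptors with dedicated cheminformatics software:

```python
# Hypothetical sketch of a Lipinski rule-of-five filter, as used to screen
# candidate inhibitors. Descriptor values would normally come from
# cheminformatics software; here they are passed in precomputed.

def passes_lipinski(mol_weight, logp, h_donors, h_acceptors, max_violations=1):
    """Return True if the compound violates at most `max_violations` rules."""
    violations = 0
    if mol_weight > 500:      # molecular weight <= 500 Da
        violations += 1
    if logp > 5:              # octanol-water partition coefficient <= 5
        violations += 1
    if h_donors > 5:          # hydrogen-bond donors <= 5
        violations += 1
    if h_acceptors > 10:      # hydrogen-bond acceptors <= 10
        violations += 1
    return violations <= max_violations

# Example: a drug-like candidate vs. an oversized, lipophilic one
print(passes_lipinski(431.5, 3.8, 2, 7))    # True
print(passes_lipinski(720.0, 6.9, 6, 12))   # False
```

Candidates failing the filter would be dropped before the docking and dynamics stages.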

Keywords: QSAR, ADMET properties, molecular docking, molecular dynamics simulation, reverse transcriptase inhibitors, HIV type 1

Procedia PDF Downloads 64
310 “Laws Drifting off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology

Authors: Amarendar Reddy Addula

Abstract:

Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Specific applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is an innovative medium for digital business, according to a new report by Gartner. The last 10 years represent an advance period in AI's development, spurred by the confluence of several factors, including the rise of big data, advancements in compute infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. AI is extending to a broader set of use cases and users and is gaining popularity because these advances improve AI's versatility, efficiency, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is an umbrella term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and objects, but it can also be used for code generation, creating synthetic data, and designing pharmaceuticals and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects.
Frequently, the two are mixed and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, functional, concerns its relationship with the law. Both set up models of social behavior, but they are different in scope and nature. The juridical analysis is grounded on a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step toward the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or diverse nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI.

Keywords: artificial intelligence, ethics & human rights issues, laws, international laws

Procedia PDF Downloads 76
309 A Contemporary Advertising Strategy on Social Networking Sites

Authors: M. S. Aparna, Pushparaj Shetty D.

Abstract:

Nowadays, social networking sites have become so popular that producers and sellers look to these sites as one of the best options for targeting the right audience to market their products. Several tools are available to monitor and analyze social networks. Our task is to identify the right community web pages, analyze member behavior using these tools, and formulate an appropriate strategy to market products or services and achieve the set goals. Advertising becomes more effective when information about the product or service comes from a known source, and referral marketing exploits this strong buying influence within the audience. Our methodology proceeds with critical budget analysis and promotes viral influence propagation. In this context, we encompass the vital elements of budget evaluation: the number of optimal seed nodes (primary influential users) activated at the onset, an estimated coverage spread of nodes, and the maximum influence-propagation distance from an initial seed to an end node. Our Buyer Prediction mathematical model arises from the need to perform complex analysis when the probability density estimates of reliable factors are unknown or difficult to calculate. Order statistics and the Buyer Prediction mapping function guarantee the selection of optimal influential users at each level. We exercise an efficient tactic of leveraging community pages and user behavior to determine product enthusiasts on social networks. Our approach is promising and should be an elementary choice when there is little or no prior knowledge of the distribution of potential buyers on social networks. In this strategy, product news propagates to influential users in the surrounding networks. By applying the same technique, a user can search for friends who can give better advice or referrals when a product interests him.
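The seed-selection step described above (choosing optimal seed nodes under a budget) can be sketched with a standard greedy heuristic under the independent cascade model. The toy graph, propagation probability, and helper names below are illustrative assumptions, not the paper's Buyer Prediction model:

```python
import random

# Minimal sketch of budget-constrained seed selection under the independent
# cascade model. The graph, propagation probability p, and budget are
# illustrative assumptions, not values from the study.

def simulate_cascade(graph, seeds, p, rng):
    """Run one cascade from `seeds`; return the number of activated nodes."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_seeds(graph, budget, p=0.3, trials=200, seed=0):
    """Greedily pick `budget` seeds maximizing estimated expected spread."""
    rng = random.Random(seed)
    chosen = []
    for _ in range(budget):
        best, best_spread = None, -1.0
        for v in sorted(graph):
            if v in chosen:
                continue
            spread = sum(simulate_cascade(graph, chosen + [v], p, rng)
                         for _ in range(trials)) / trials
            if spread > best_spread:
                best, best_spread = v, spread
        chosen.append(best)
    return chosen

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": [], "e": []}
print(greedy_seeds(graph, budget=2))
```

A budget constraint simply caps the number of greedy iterations; richer cost models would weight each candidate seed differently.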

Keywords: viral marketing, social network analysis, community web pages, buyer prediction, influence propagation, budget constraints

Procedia PDF Downloads 238
308 Li-Ion Batteries vs. Synthetic Natural Gas: A Life Cycle Analysis Study on Sustainable Mobility

Authors: Guido Lorenzi, Massimo Santarelli, Carlos Augusto Santos Silva

Abstract:

The growth of non-dispatchable renewable energy sources in the European electricity generation mix is promoting the search for technically feasible and cost-effective solutions to make use of the excess energy produced when demand is low. The increasing intermittent renewable capacity is becoming a challenge, especially in Europe, where some countries generated more than 20% of their total 2015 electricity from wind and solar, with Denmark around 40%. However, other consumption sectors (mainly transportation) still rely considerably on fossil fuels, with a slow transition to other forms of energy. Among the opportunities for different mobility concepts, electric vehicles (EVs) and biofuel-powered vehicles (BPVs) are the options that currently appear most promising. EVs target mainly light-duty users because of their zero (full electric) or reduced (hybrid) local emissions, while BPVs encourage the use of alternative resources with the same technologies (thermal engines) used so far. The batteries applied to EVs are based on lithium ions because of their overall good performance in energy density, safety, cost, and temperature behavior. Biofuels, instead, can be various, and the major difference is in their physical state (liquid or gaseous). In this study, gaseous biofuels are considered, more specifically Synthetic Natural Gas (SNG) produced through a Power-to-Gas process consisting of an electrochemical upgrade (with solid oxide electrolyzers) of biogas with CO2 recycling. The latter process combines a first stage of electrolysis, in which syngas is produced, with a second stage of methanation, in which the product gas is turned into methane and made available for consumption. A techno-economic comparison between the two alternatives is possible, but it does not capture all the aspects involved in the two routes for the promotion of more sustainable mobility.
For this reason, a more comprehensive methodology, Life Cycle Assessment (LCA), is adopted to describe the environmental implications of using excess electricity (directly or indirectly) for new vehicle fleets. The functional unit of the study is 1 km driven, and the two options are compared in terms of overall CO2 emissions, considering both Cradle-to-Gate and Cradle-to-Grave boundaries. Showing how the production and disposal of materials affect the environmental performance of the analyzed routes broadens the perspective on the impacts that different technologies produce beyond what is emitted during the operational life. In particular, this applies to batteries, for which the decommissioning phase has a larger impact on the environmental balance than it does for electrolyzers. The energy density of Li-ion batteries is more than one order of magnitude lower than that of SNG, which implies that, for the same amount of energy used, more material resources are needed to obtain the same effect. The comparison is performed in an energy system that simulates the Western European one, in order to assess which of the two solutions is more suitable to lead the de-fossilization of the transport sector with the least resource depletion and the mildest consequences for the ecosystem.
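The functional-unit normalization described above (comparing options per km, with production and disposal emissions spread over the vehicle lifetime) amounts to simple arithmetic. All inventory figures below are hypothetical placeholders, not results from the study:

```python
# Back-of-the-envelope sketch of functional-unit normalization: total
# life-cycle CO2 is divided by lifetime distance so options are compared
# per km. All inventory numbers below are hypothetical placeholders,
# not results from the paper.

def co2_per_km(production_kg, operation_kg_per_km, disposal_kg, lifetime_km):
    """Cradle-to-grave CO2 per functional unit (1 km)."""
    embodied = production_kg + disposal_kg        # cradle-to-gate + end of life
    return operation_kg_per_km + embodied / lifetime_km

# Hypothetical EV-battery route vs. SNG route over a 150,000 km lifetime
ev = co2_per_km(production_kg=8000.0, operation_kg_per_km=0.05,
                disposal_kg=2000.0, lifetime_km=150_000)
sng = co2_per_km(production_kg=1500.0, operation_kg_per_km=0.09,
                 disposal_kg=300.0, lifetime_km=150_000)
print(round(ev, 3), round(sng, 3))
```

With these placeholder numbers the higher embodied emissions of the battery route outweigh its lower operational emissions; the study's actual inventory determines which effect dominates.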

Keywords: electrical energy storage, electric vehicles, power-to-gas, life cycle assessment

Procedia PDF Downloads 160
307 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data

Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali

Abstract:

The research paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. The integration of the SVM algorithm with real-time sensor node data offers great potential to improve the spatial and temporal resolution of ET predictions. In the model development, key input features are measured and computed using mathematical equations such as Penman-Monteith (FAO56) and the soil water balance (SWB); these include soil-environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). The one-year field data are split into training, test, and validation sets in various proportions, and kernel functions with tuned hyperparameters are used to train and improve the accuracy of the prediction model over multiple iterations. This paper also outlines existing methods and machine learning techniques for determining evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictability of the developed model on the basis of performance evaluation metrics (R2, RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships within soil and environmental parameters provides insights into its potential applications in water resource management and hydrological ecosystems.
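The evaluation metrics named in the abstract (R2, RMSE, MAE) can be written out directly so their definitions are explicit; the observed/predicted values below are illustrative, not the study's ET data:

```python
import math

# Sketch of the evaluation metrics named in the abstract (R2, RMSE, MAE),
# implemented directly. The sample values are illustrative, not the study's
# ET measurements.

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

obs = [3.1, 4.0, 2.7, 5.2]    # e.g. daily ET in mm/day (illustrative)
pred = [3.0, 4.2, 2.9, 5.0]
print(r2(obs, pred), rmse(obs, pred), mae(obs, pred))
```

An SVM regressor's predictions on the held-out validation set would be scored with exactly these three functions.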

Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors

Procedia PDF Downloads 43
306 Polarization as a Proxy of Misinformation Spreading

Authors: Michela Del Vicario, Walter Quattrociocchi, Antonio Scala, Ana Lucía Schmidt, Fabiana Zollo

Abstract:

Information, rumors, and debates may shape and heavily impact public opinion. In recent years, several concerns have been expressed about social influence on the Internet and the outcome that online debates might have on real-world processes. Indeed, on online social networks users tend to select information that is coherent with their system of beliefs and to form groups of like-minded people, i.e., echo chambers, where they reinforce and polarize their opinions. In this way, the potential benefits of exposure to different points of view may be reduced dramatically, and individuals' views may become more and more extreme. Such a context fosters the spread of misinformation, which has always represented a socio-political and economic risk. The persistence of unsubstantiated rumors, e.g., the hypothetical and hazardous link between vaccines and autism, suggests that social media do have the power to misinform, manipulate, or control public opinion. Current approaches, such as debunking efforts or algorithmic solutions based on the reputation of the source, seem to prove ineffective against collective superstition. Indeed, experimental evidence shows that confirmatory information gets accepted even when it contains deliberately false claims, while dissenting information is mainly ignored, influences users' emotions negatively, and may even increase group polarization. Moreover, confirmation bias has been shown to play a pivotal role in information cascades, posing serious warnings about the efficacy of current debunking efforts. Nevertheless, mitigation strategies have to be adopted. To generalize the problem and better understand the social dynamics behind information spreading, in this work we rely on a tight quantitative analysis to investigate the behavior of more than 300M users with respect to news consumption on Facebook over a time span of six years (2010-2015).
Through a massive analysis of 920 news outlet pages, we are able to characterize the anatomy of news consumption on a global and international scale. We show that users tend to focus on a limited set of pages (selective exposure), eliciting a sharp and polarized community structure among news outlets. Moreover, we find similar patterns around the Brexit debate (the British referendum to leave the European Union), where we observe the spontaneous emergence of two well-segregated and polarized groups of users around news outlets. Our findings provide interesting insights into the determinants of polarization and the evolution of core narratives in online debates. Our main aim is to understand and map the information space on online social media by identifying non-trivial proxies for the early detection of massive informational cascades. Furthermore, by combining users' traces, we are finally able to draft the main concepts and beliefs of the core narrative of an echo chamber and its related perceptions.
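One common way to quantify the user polarization discussed above is rho = 2*sigma - 1, where sigma is the fraction of a user's activity devoted to one of two content categories; this sketch is a generic illustration of that measure, not the paper's exact pipeline:

```python
# Illustrative sketch (not the paper's exact pipeline): a user's polarization
# can be scored as rho = 2*sigma - 1, where sigma is the fraction of the
# user's likes going to one of two content categories (e.g. science vs.
# conspiracy pages). rho near -1 or +1 means full polarization; 0 is neutral.

def polarization(likes_cat_a, likes_cat_b):
    """Polarization score in [-1, 1] from like counts in two categories."""
    total = likes_cat_a + likes_cat_b
    if total == 0:
        return 0.0   # no activity: treat as unpolarized
    sigma = likes_cat_a / total
    return 2 * sigma - 1

users = [(48, 2), (3, 47), (10, 10)]   # strongly A, strongly B, neutral
print([round(polarization(a, b), 2) for a, b in users])
```

A sharply bimodal distribution of rho across a population is the signature of the segregated echo chambers described in the abstract.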

Keywords: information spreading, misinformation, narratives, online social networks, polarization

Procedia PDF Downloads 270
305 Fuzzy Logic-Based Approach to Predict Fault in Transformer Oil Based on Health Index Using Dissolved Gas Analysis

Authors: Kharisma Utomo Mulyodinoto, Suwarno, Ahmed Abu-Siada

Abstract:

Transformer insulating oil is a key component that can be utilized to detect incipient faults within operating transformers without taking them out of service. Dissolved gas-in-oil analysis has been widely accepted as a powerful technique to detect such incipient faults. While the measurement of dissolved gases within transformer oil samples has been standardized over the past two decades, analysis of the results is not always straightforward, as it depends on personnel expertise more than on mathematical formulas. In analyzing such data, the generation rate of each dissolved gas is of more concern than the absolute value of the gas. As such, the history of dissolved gases within a particular transformer should be archived for future comparison; lack of such history may lead to misinterpretation of the obtained results. The IEEE C57.104-2008 standard classifies the health condition of the transformer into four conditions, based on the absolute value of individual dissolved gases along with the total dissolved combustible gas (TDCG) within transformer oil. While the technique is easy to implement, it is considered very conservative and is not widely accepted as a reliable interpretation tool. Moreover, measured gases for the same oil sample can fall within the limits of several different conditions, and hence misinterpretation of the data is expected. To overcome this limitation, this paper introduces a fuzzy logic approach to predict the health condition of the transformer oil based on the IEEE C57.104-2008 standard along with the Roger ratio and IEC ratio methods. DGA results of 31 oil samples, chosen from 469 transformer oil samples of normal transformers and transformers with pre-known fault types collected from the Indonesian electrical utility company PT. PLN (Persero), covering different voltage ratings (500/150 kV, 150/20 kV, and 70/20 kV), different capacities (500 MVA, 60 MVA, 50 MVA, 30 MVA, 20 MVA, 15 MVA, and 10 MVA), and different lifespans, are used to test and establish the fuzzy logic model. Results show that the proposed approach is of good accuracy and can be considered a platform toward the standardization of the dissolved gas interpretation process.
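The fuzzy-membership idea behind the proposed approach can be sketched with triangular membership functions. The hydrogen thresholds below are hypothetical placeholders, not the IEEE C57.104-2008 limits; a real implementation would encode the standard's condition bands for each gas and combine them with fuzzy rules:

```python
# Minimal sketch of fuzzy membership for a single dissolved gas. The ppm
# thresholds are hypothetical placeholders, NOT the IEEE C57.104-2008
# limits; a full model would encode the standard's bands for each gas.

def trimf(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def h2_condition(ppm):
    """Fuzzy degrees of membership in three health conditions (illustrative)."""
    return {
        "normal":  trimf(ppm, -1, 0, 150),
        "caution": trimf(ppm, 100, 300, 700),
        "fault":   trimf(ppm, 500, 1000, 10_000),
    }

print(h2_condition(120))   # partly "normal", partly "caution"
```

The overlap between bands is exactly what lets a sample sit partly in two conditions, instead of being forced into one as in the crisp standard.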

Keywords: dissolved gas analysis, fuzzy logic, health index, IEEE C57.104-2008, IEC ratio method, Roger ratio method

Procedia PDF Downloads 137
304 Sustainability in Space: Implementation of Circular Economy and Material Efficiency Strategies in Space Missions

Authors: Hamda M. Al-Ali

Abstract:

The ultimate aim of space exploration has centered on the possibility of life on other planets in the solar system, driven by the detrimental effects that climate change could have on human survival on Earth in the future. This drives humans to search for feasible solutions to increase environmental and economic sustainability on Earth and to evaluate and explore the ability of humans to survive on other planets such as Mars. To do that, frequent space missions are required to meet these ambitious goals, which means that reliable and affordable access to space is needed; this could largely be achieved through the use of reusable spacecraft. Therefore, materials and resources must be used wisely to meet the increasing demand. Space missions are currently extremely expensive to operate. However, reusing materials, and hence spacecraft, can potentially reduce overall mission costs as well as the negative impact on both the space and Earth environments, because reusing materials leads to less waste generated per mission and therefore fewer landfill sites. Reusing materials reduces resource consumption, material production, and the need to process new and replacement spacecraft and launch vehicle parts. Consequently, this will ease and facilitate human access to outer space, as it will reduce the demand for scarce resources and boost material efficiency in the space industry. Material efficiency expresses the extent to which resources are consumed in the production cycle and how far the waste produced by the industrial process is minimized. The strategies proposed in this paper to boost material efficiency in the space sector are the introduction of key performance indicators able to measure material efficiency, as well as the introduction of clearly defined policies and legislation that can be easily implemented within the general practices of the space industry.
Another strategy to improve material efficiency is to amplify energy and resource efficiency through reuse. The circularity of various spacecraft materials such as Kevlar, steel, and aluminum alloys could be maximized by reusing them directly or after galvanizing them with another layer of material to act as a protective coat. This research paper aims to investigate and discuss how to improve material efficiency in space missions using circular economy concepts, so that space and Earth become more economically and environmentally sustainable. The circular economy is a transition from a make-use-waste linear model to a closed-loop socio-economic model, which is regenerative and restorative in nature. The implementation of a circular economy will reduce waste and pollution by maximizing material efficiency, ensuring that businesses can thrive sustainably. The extent to which reusable launch vehicles reduce space mission costs is also discussed, along with the environmental and economic implications for the space sector and the environment. This has been examined through research and an in-depth literature review of published reports, books, scientific articles, and journals. Keywords such as material efficiency, circular economy, reusable launch vehicles, and spacecraft materials were used to search for relevant literature.

Keywords: circular economy, key performance indicator, material efficiency, reusable launch vehicles, spacecraft materials

Procedia PDF Downloads 102
303 Parking Service Effectiveness at Commercial Malls

Authors: Ahmad AlAbdullah, Ali AlQallaf, Mahdi Hussain, Mohammed AlAttar, Salman Ashknani, Magdy Helal

Abstract:

We study the effectiveness of the parking service provided at Kuwaiti commercial malls and explore potential problems and feasible improvements. Commercial malls are important to Kuwaitis as entertainment and shopping centers, given the lack of alternatives. The difficulty and relatively long time wasted in finding a parking spot at the mall are real annoyances. We applied queuing analysis to one of the major malls that offers paid parking (1040 parking spots) in addition to free parking. Patrons of the mall usually complained of traffic jams and delays when entering the paid parking (the average delay to park exceeds 15 minutes for about 62% of the patrons, while the average time spent in the mall is about 2.6 hours). However, the analysis showed acceptable service levels at the check-in gates of the parking garage. A detailed review of vehicle movement at the gateways indicated that arriving and departing cars had to share parts of the gateway to the garage, which caused the traffic jams and delays. A simple comparison indicated that the largest commercial mall in Kuwait does not suffer such parking issues, while other smaller yet important malls, including the one we studied, do. This suggests that the well-designed inlets and outlets of that gigantic mall permit smooth parking despite parking being totally free and the mall being the first choice of most people for entertainment and shopping. A simulation model is being developed for further analysis and verification; simulation can overcome the mathematical difficulty of using non-Poisson queuing models. The simulation model is used to explore potential changes to the parking garage entrance layout, and with the inclusion of drivers' behavior inside the parking garage, effectiveness indicators can be derived to address the economic feasibility of extending the parking capacity and increasing service levels. Outcomes of the study are planned to be generalized, as appropriate, to other commercial malls in Kuwait.
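The check-in-gate analysis described above can be sketched with standard M/M/c queuing formulas (the Erlang C delay probability). The arrival and service rates below are illustrative assumptions, not the measured values from the mall:

```python
import math

# Sketch of M/M/c queuing metrics of the kind used in the gate analysis.
# Arrival rate lam and service rate mu are illustrative, not measured values.

def erlang_c(c, a):
    """Probability an arrival must wait, for c servers and offered load a = lam/mu.
    Requires a < c (stable queue)."""
    inv = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / (math.factorial(c) * (1 - a / c))
    return top / (inv + top)

def avg_wait(lam, mu, c):
    """Mean time in queue (same time unit as the rates); requires lam < c*mu."""
    a = lam / mu
    return erlang_c(c, a) / (c * mu - lam)

# e.g. 3 check-in gates, 100 cars/hour arriving, each gate serving 40/hour
print(round(avg_wait(lam=100, mu=40, c=3), 4))   # mean queueing delay, in hours
```

With these assumed rates the mean gate delay is only a couple of minutes, consistent with the finding that the gates themselves were not the bottleneck; non-Poisson arrivals are where the abstract's simulation model takes over.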

Keywords: commercial malls, parking service, queuing analysis, simulation modeling

Procedia PDF Downloads 322
302 Advertising Campaigns for a Sustainable Future: The Fight against Plastic Pollution in the Ocean

Authors: Mokhlisur Rahman

Abstract:

The ocean hosts one of the most complex ecosystems on the planet; it regulates the Earth's climate and weather, provides food through the many ways of life that depend on it, supports transportation by accommodating the world's biggest carriers, offers recreation in its many moods, and is home to countless species. Seeking these forms of recreation, consumers choose to be close to the ocean while enjoying many fun activities, and afterwards they throw their waste into it, threatening marine life and the environment. Most of this waste is plastic, which floats on the ocean and breaks into thousands of micro pieces that are hard to see with the naked eye but easily ingested by sea species. That interferes with the natural feeding of these living species and makes them sick. This is not known by most consumers who visit the sea or seashore occasionally, nor is it widely discussed, which creates an information gap among consumers. Advertising, however, is a powerful tool to educate people about ocean pollution. This abstract analyzes three major ocean-saving advertisement campaigns that use innovative and advanced technology to get maximum exposure. The study collects data from the selected campaigns' websites and retrieves all available content related to messages, videos, and images. First, the SeaLegacy campaign uses stunning images to create awareness among people, along with social media content, videos, and other educational content. It creates content and strategies that build an emotional connection with consumers and encourage them to act, and all the messages in the campaign empower consumers through powerful words.
Second, the Ocean Conservancy campaign uses social media marketing, events, and educational content to protect the ocean from various threats, including plastics, climate change, and overfishing. It uses powerful images and videos of marine life; its mission is to create evidence-based solutions toward a healthy ocean, and its message addresses local communities along with sea species. Third, The Ocean Cleanup is a campaign that applies strategies using innovative technologies to remove plastic waste from the ocean; it uses social media, digital, and email marketing to reach people and raise awareness, along with images and videos that evoke an emotional response and prompt action. These three advertisements use realistic images, powerful words, and the presence of living species in their imagery, which are eye-catching and can build an emotional connection with consumers. Identifying the effectiveness of the messages these advertisements carry, and of their strategies, highlights the general public's knowledge gap between real pollution and its consequences and makes the message more accessible to a mass audience. This study aims to provide insights into the effectiveness of ocean-saving advertisement campaigns and their impact on the public's awareness of ocean conservation; the findings help shape future campaigns.

Keywords: advertising-campaign, content-creation, images ocean-saving technology, videos

Procedia PDF Downloads 56
301 Endometrial Ablation and Resection Versus Hysterectomy for Heavy Menstrual Bleeding: A Systematic Review and Meta-Analysis of Effectiveness and Complications

Authors: Iliana Georganta, Clare Deehan, Marysia Thomson, Miriam McDonald, Kerrie McNulty, Anna Strachan, Elizabeth Anderson, Alyaa Mostafa

Abstract:

Context: A meta-analysis of randomized controlled trials (RCTs) comparing hysterectomy versus endometrial ablation and resection in the management of heavy menstrual bleeding (HMB). Objective: To evaluate the clinical efficacy, satisfaction rates, and adverse events of hysterectomy compared to more minimally invasive techniques in the treatment of HMB. Evidence Acquisition: A literature search was performed for all RCTs and quasi-RCTs comparing hysterectomy with endometrial ablation, endometrial resection, or both. The search had no language restrictions and was last updated in June 2020, using MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, PubMed, Google Scholar, PsycINFO, ClinicalTrials.gov, and the EU Clinical Trials Register. In addition, a manual search of the abstract databases of the European Haemophilia Conference on Women's Health was performed, and further studies were identified from the references of acquired papers. The primary outcomes were patient-reported and objective reduction in heavy menstrual bleeding up to 2 years and after 2 years. Secondary outcomes included satisfaction rates, pain, short- and long-term adverse events, quality of life and sexual function, further surgery, duration of surgery and hospital stay, and time to return to work and normal activities. Data were analysed using RevMan software. Evidence Synthesis: 12 studies and a total of 2028 women were included (hysterectomy: n = 977 women vs endometrial ablation or resection: n = 1051 women). Hysterectomy was compared with endometrial ablation only in five studies (Lin, Dickersin, Sesti, Jain, Cooper), with endometrial resection only in five studies (Gannon, Schulpher, O’Connor, Crosignani, Zupi), and with a mixture of ablation and resection in two studies (Elmantwe, Pinion). Of the 12 studies, 10 reported women’s perception of bleeding symptoms as improved.
Meta-analysis showed that women in the hysterectomy group were more likely to show improvement in bleeding symptoms when compared with endometrial ablation or resection up to 2-year follow-up (RR 0.75, 95% CI 0.71 to 0.79, I² = 95%). Objective outcomes of improvement in bleeding also favoured hysterectomy. Patient satisfaction was higher after hysterectomy within the 2-year follow-up (RR: 0.90, 95% CI: 0.86 to 0.94, I²: 58%); however, there was no significant difference between the two groups at more than 2 years of follow-up. Sepsis (RR: 0.03, 95% CI: 0.002 to 0.56; 1 study), wound infection (RR: 0.05, 95% CI: 0.01 to 0.28, I²: 0%, 3 studies), and urinary tract infection (UTI) (RR: 0.20, 95% CI: 0.10 to 0.42, I²: 0%, 4 studies) all favoured the hysteroscopic techniques. Fluid overload (RR: 7.80, 95% CI: 2.16 to 28.16, I²: 0%, 4 studies) and perforation (RR: 5.42, 95% CI: 1.25 to 23.45, I²: 0%, 4 studies), however, favoured hysterectomy in the short term. Conclusions: This meta-analysis has demonstrated that endometrial ablation and endometrial resection are both viable options when compared with hysterectomy for the treatment of heavy menstrual bleeding. Hysteroscopic procedures had better outcomes in the short term, with fewer adverse events including wound infection, UTI, and sepsis. Hysterectomy performed better on longer-term measures such as recurrence of symptoms, overall satisfaction at two years, and the need for further treatment or surgery.
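The pooled estimates quoted above have the standard risk-ratio form: RR with a 95% CI computed as exp(ln RR ± 1.96·SE). This sketch shows the calculation for a single hypothetical 2×2 table, not the review's data:

```python
import math

# Sketch of a risk ratio with 95% CI from 2x2 counts, the form of the
# estimates quoted in the abstract (log-RR +/- 1.96 * SE on the log scale).
# The counts below are illustrative, not the review's data.

def risk_ratio_ci(e1, n1, e2, n2, z=1.96):
    """events/total in group 1 vs group 2 -> (RR, lower, upper)."""
    rr = (e1 / n1) / (e2 / n2)
    # standard error of ln(RR) for independent binomial samples
    se = math.sqrt(1 / e1 - 1 / n1 + 1 / e2 - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 12/200 events after procedure A vs 30/210 after procedure B
rr, lo, hi = risk_ratio_ci(12, 200, 30, 210)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A CI lying entirely below 1 favours group 1, as with the infection outcomes above; a CI entirely above 1 favours group 2, as with fluid overload and perforation.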

Keywords: menorrhagia, hysterectomy, ablation, resection

Procedia PDF Downloads 137
300 Digital Architectural Practice as a Challenge for Digital Architectural Technology Elements in the Era of Digital Design

Authors: Ling Liyun

Abstract:

In contemporary architecture, buildings of complex form continue to emerge around the world, together with new terminology: digital architecture, parametric design, algorithmic generation, building information modeling, CNC construction and so on. Architects have gradually mastered new skills: mathematical logic in formal exploration, virtual simulation, and coordination of the entire design and construction process. Digital construction technology offers a greater degree of control over construction and ensures its accuracy, creating a series of new construction techniques. The use of digital technology thus both improves and expands the practice of architectural design. We worked by reading and analyzing information on the development of digital architecture, a large number of cases, and the architectural design and construction process as a whole. Current developments are accordingly introduced and discussed in our paper, including architectural discourse, design theory, digital design models and techniques, material selection, and artificial-intelligence-assisted space design. Our paper also examines three representative cases of digital design and construction experiments at length, to expound high informatization, high-reliability intelligence, and high technique in constructing humane space that copes with the rapid development of urbanization. We conclude that the shift presents both opportunities and challenges for architectural paradigms, in the cooperation methods, theories, models, technologies and techniques currently employed in digital design research and digital praxis. We also find that the innovative use of space can gradually change the way people learn, talk, and control information.
Over the past two decades, digital technology has radically broken the technical constraints of industrial products and dissolved the dominance of any particular architectural style (era doctrine). People should not have to adapt to the machine; rather, the machine should be made to work for its users.

Keywords: artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction

Procedia PDF Downloads 117
299 Price Compensation Mechanism with Unmet Demand for Public-Private Partnership Projects

Authors: Zhuo Feng, Ying Gao

Abstract:

Public-private partnership (PPP), as an innovative way to provide infrastructure through the private sector, is widely used throughout the world. Compared with the traditional mode, PPP has emerged largely for its merits of relieving public budget constraints and improving the efficiency of infrastructure supply by involving private funds. However, PPP projects are characterized by large scale, high investment, long payback periods, and long concession periods. These characteristics make PPP projects risky. One of the most important risks faced by the private sector is demand risk, because many factors affect real demand. If real demand falls far below forecast demand, the private sector faces serious difficulty, because operating revenue is its main means of recouping the investment and earning a profit. It is therefore important to study how the government should compensate the private sector when demand risk materializes, in order to achieve a Pareto improvement. This research focuses on the price compensation mechanism, an ex-post compensation mechanism, and analyzes, by mathematical modeling, its impact on the payoff of the private sector and on consumer surplus for PPP toll road projects. The research first investigates whether price compensation mechanisms can achieve a Pareto improvement and, if so, explores the boundary conditions for this mechanism. The results show that the price compensation mechanism can realize a Pareto improvement under certain conditions. In particular, for the mechanism to accomplish a Pareto improvement, the renegotiation costs of the government and the private sector should be lower than a certain threshold, determined by the marginal operating cost and the distortionary cost of the tax. In addition, the compensation percentage should match the price cut of the private investor when demand drops.
This research aims to provide theoretical support for the government when determining compensation scope under the price compensation mechanism. Moreover, some policy implications can also be drawn from the analysis for better risk-sharing and sustainability of PPP projects.
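As a purely illustrative companion to the modeling described above, the sketch below checks, for one invented parameterization (linear demand, a per-unit subsidy, a distortionary cost of taxation `lam`, and a lump-sum renegotiation cost), whether a price-compensation deal leaves the private investor, consumers, and overall welfare at least as well off. None of the functional forms or numbers are taken from the authors' model.

```python
# Toy numerical check of a Pareto improvement under price compensation.
# Every functional form and number here is purely illustrative, NOT the
# model developed in the abstract.

def outcomes(price, subsidy_per_unit, renegotiation_cost=0.0,
             a=100.0, b=1.0, c=10.0, lam=0.2):
    q = max(a - b * price, 0.0)                      # linear demand
    profit = (price + subsidy_per_unit - c) * q - renegotiation_cost
    consumer_surplus = 0.5 * q * q / b               # triangle under demand
    tax_burden = (1.0 + lam) * subsidy_per_unit * q  # lam = distortion cost
    welfare = profit + consumer_surplus - tax_burden
    return profit, consumer_surplus, welfare

no_deal = outcomes(price=60.0, subsidy_per_unit=0.0)
deal = outcomes(price=50.0, subsidy_per_unit=8.0, renegotiation_cost=50.0)
pareto_improvement = all(d >= n for d, n in zip(deal, no_deal))
```

In this toy setting the deal is Pareto-improving; raising `renegotiation_cost` or `lam` far enough flips the verdict, mirroring the threshold condition the abstract derives.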

Keywords: infrastructure, price compensation mechanism, public-private partnership, renegotiation

Procedia PDF Downloads 155
298 Development of an Improved Paradigm for the Tourism Sector in the Department of Huila, Colombia: A Theoretical and Empirical Approach

Authors: Laura N. Bolivar T.

Abstract:

The importance of tourism for regional development is mainly highlighted by the collaborative, cooperative and competitive relationships of the agents involved. The fostering of associativity processes, in particular the cluster approach, emphasizes the beneficial outcomes of the concentration of enterprises, where innovation and entrepreneurship flourish and shape the dynamics of tourism empowerment. The department of Huila is located in the south-west of Colombia and holds the largest coffee production in the country, although it contributes little to the national GDP. Its economic development strategy is therefore seeking more dynamism, and Huila could be consolidated as a leading destination for cultural, ecological and heritage tourism if, at a minimum, the public policy-making processes for the tourism management of La Tatacoa Desert, San Agustin Park and Bambuco’s National Festival were implemented more efficiently. Accordingly, this study addresses the potential restrictions on, and beneficial factors for, the consolidation of the tourism sector of Huila, Colombia as a cluster, and how this could impact regional development. A set of theoretical frameworks, such as the Tourism Routes Approach, the Tourism Breeding Environment and the Community-Based Tourism Method, among others, together with a collection of international experiences describing tourism clustering processes and their most salient problems, is analyzed to draw out learning points, procedures and success factors to be contrasted with the local characteristics of Huila, the region under study.
This characterization involves primary and secondary information collection methods and covers the South American and Colombian context, together with the identification of the actors involved and their roles, the main interactions among them, the major tourism products and their infrastructure, the visitors’ perspective on the situation, and a recap of the related needs and benefits for the host community. By comparing the umbrella concepts, the theoretical and empirical approaches, and the local specificities of the tourism sector in Huila, an array of shortcomings is analytically constructed, and a series of guidelines is proposed to overcome them and, simultaneously, to raise economic development and positively impact Huila’s well-being. This non-exhaustive bundle of guidelines focuses on fostering cooperative linkages in the actors’ network; dealing with innovations in Information and Communication Technologies; reinforcing the supporting infrastructure; promoting the destinations, including the lesser-known places; designing an information system that enables the tourism network to assess the situation based on reliable data; increasing competitiveness; developing participative public policy-making processes; and making the host community aware of its touristic richness. On this basis, cluster dynamics would drive the tourism sector toward articulation and joint effort, and the agents involved and local particularities would be adequately assisted to cope with the current changing environment of globalization and competition.

Keywords: innovative strategy, local development, network of tourism actors, tourism cluster

Procedia PDF Downloads 123
297 Towards Visual Personality Questionnaires Based on Deep Learning and Social Media

Authors: Pau Rodriguez, Jordi Gonzalez, Josep M. Gonfaus, Xavier Roca

Abstract:

Image sharing in social networks has increased exponentially in recent years. Officially, there are 600 million Instagrammers uploading around 100 million photos and videos per day. Consequently, there is a need for new tools to understand the content expressed in shared images, which will greatly benefit social media communication and enable broad and promising applications in education, advertisement, entertainment, and psychology. Following these trends, our work takes advantage of the relationship between text and personality, already demonstrated by multiple researchers, to show that a relationship also exists between images and personality. To achieve this goal, we note that images posted on social networks are typically conditioned on specific words, or hashtags, so any relationship between text and personality should also be observable in the posted images. Our proposal uses the most recent image-understanding models based on neural networks to process the vast amount of data generated by social users and determine the images most correlated with personality traits. The final aim is to train a weakly supervised image-based model for personality assessment that can be used even when textual data are not available, an increasingly common situation. The procedure is as follows: we explore the images publicly shared by users under the accompanying texts or hashtags most strongly related to the personality traits described by the OCEAN model. These images are used for personality prediction since they have the potential to convey more complex ideas, concepts, and emotions than text. As a result, the use of images in personality questionnaires will provide a deeper understanding of respondents than words alone.
In other words, from the images posted with specific tags, we train a deep learning model based on neural networks that learns to extract a personality representation from a picture and uses it to automatically find the personality that best explains that picture. A deep neural network model is thus learned from thousands of images associated with hashtags correlated to OCEAN traits. We then analyze the network activations to identify the pictures that maximally activate the neurons: the most characteristic visual features per personality trait emerge, since the filters of the convolutional layers are learned to be optimally activated depending on each personality trait. For example, among the pictures that maximally activate the high Openness trait, we see pictures of books, the moon, and the sky. For high Conscientiousness, most of the images are photographs of food, especially healthy food. The high Extraversion output is mostly activated by pictures of large groups of people. High Agreeableness images are mostly flower pictures. Lastly, for the Neuroticism trait, the high score is maximally activated by pet animals such as cats or dogs. In summary, despite the huge intra-class and inter-class variability of the images associated with each OCEAN trait, we found consistencies between the visual patterns of the images whose hashtags are most correlated to each trait.
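The activation analysis described above can be caricatured in a few lines: given a matrix of final-layer activations over a batch of images, list the images that maximally activate each trait output. The trained network itself is omitted here and the activations are random placeholders, so this illustrates only the bookkeeping, not the authors' model.

```python
import numpy as np

# Toy stand-in for the activation analysis: activations of a
# (hypothetical, already-trained) network over 1000 images, one output
# per OCEAN trait. Random placeholders, not real network outputs.
rng = np.random.default_rng(0)
traits = ["O", "C", "E", "A", "N"]              # OCEAN trait outputs
activations = rng.random((1000, len(traits)))   # (n_images, n_traits)

def top_images_per_trait(activations, k=5):
    """Return, per trait, the indices of the k most-activating images."""
    order = np.argsort(activations, axis=0)     # ascending per column
    return order[-k:][::-1].T                   # (n_traits, k), descending

top = top_images_per_trait(activations, k=5)
# top[0] indexes the images most associated with the Openness output, etc.
```

With a real model, inspecting the images at these indices is what surfaces the books-and-sky pictures for Openness or the food pictures for Conscientiousness.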

Keywords: emotions and effects of mood, social impact theory in social psychology, social influence, social structure and social networks

Procedia PDF Downloads 172
296 The Senior Traveler Market as a Competitive Advantage for the Luxury Hotel Sector in the UK Post-Pandemic

Authors: Feyi Olorunshola

Abstract:

Over the last few years, the senior travel market has been noted for its potential in the wider tourism industry. The tourism sector includes hotels and hospitality, travel, transportation, and several other subdivisions that make it economically viable. In particular, hotels attract a substantial part of tourism expenditure: when people plan to travel, suitable accommodation for relaxation, dining, entertainment and so on is paramount to their decision-making. The global retail value of hotels as of 2018 was significant for tourism. Yet despite the hotel sector's importance to the tourism industry at large, very few empirical studies are available to establish how this sector can leverage the senior demographic to achieve competitive advantage. Studies of the mature market have predominantly focused on destination tourism, with limited investigation of hotels despite their significant contribution to tourism. Also, although several scholarly studies have demonstrated the importance of the senior travel market to hotels, very little empirical research has explored the driving factors that will become the accepted new normal for this niche segment post-pandemic. Given that hotels already operate in a highly saturated business environment, and that on top of this pre-existing challenge the ongoing global health outbreak has put the sector in a vulnerable position, hotels, especially in the full-service luxury category, must evolve rapidly to survive. Hotels can no longer rely on corporate travellers to generate higher revenue: since the unprecedented wake of the pandemic in 2020, many organizations have adopted a different approach of conducting their business online, and hotels therefore need to anticipate a significant drop in business travellers.
However, the rooms and the rest of the facilities must be occupied to keep the business operating. The way forward for hotels lies in the leisure sector, and the question now is which demographic of travellers to focus on; in this case, seniors, who have repeatedly been recognized as a lucrative market because of increased discretionary income, availability of time and global population trends. To achieve the study objectives, a mixed-method approach will be utilized, drawing on qualitative (netnography) and quantitative (survey) methods, cognitive and decision-making theories (means-end chain) and competitive theories, to identify the salient drivers explaining senior hotel choice and its influence on their decision-making. The target population is senior repeat travellers aged 65 years and over who are UK residents, together with travellers from the top tourist markets to the UK (USA, Germany, and France). Structural equation modelling will be employed to analyze the datasets. The theoretical contribution is the development of new concepts using a robust research design, as well as the advancement of existing frameworks in hotel studies. Practically, it will provide hotel management with up-to-date information to design competitive marketing strategies and activities targeting the mature market post-pandemic and over the long term.

Keywords: competitive advantage, covid-19, full-service hotel, five-star, luxury hotels

Procedia PDF Downloads 106
295 Re-Framing the Resilience Turn in Risk Management with an Anti-Positivistic Perspective on Holling's Early Work

Authors: Jose Canizares

Abstract:

In recent decades, resilience has received much attention in relation to understanding and managing new forms of risk, especially in the context of urban adaptation to climate change. There are abundant concerns, however, about how best to interpret resilience and related ideas, and about whether they can guide ethically appropriate risk-related or adaptation efforts. Narrative creation and framing are critical steps in shaping public discussion and policy in large-scale interventions, since they favor or inhibit early decision and interpretation habits, which can be morally sensitive and then persist over time. This article contributes to that framing process by contesting a conventional narrative on resilience and offering an alternative one. Conventionally, present ideas on resilience are traced to the work of ecologist C. S. Holling, especially to his article 'Resilience and Stability of Ecological Systems'. This article is usually portrayed as a contribution of complex-systems thinking to theoretical ecology, in which Holling appeals to resilience in order to challenge received views on ecosystem stability and the diversity-stability hypothesis. In this reading, resilience is construed as a 'purely scientific', precise and descriptive concept, denoting a complex property that allows ecosystems to persist, or to maintain functions, after disturbance. Yet these formal features of resilience supposedly changed with Holling's later work in the 90s, where, it is argued, Holling began to use resilience as a more pragmatic 'boundary term', aimed at unifying transdisciplinary research about risks, ecological or otherwise, and at articulating public debate and governance strategies on the issue. In the conventional story, increased vagueness and degrees of normativity are the price to pay for this conceptual shift, which has made the term more widely usable, but also incompatible with scientific purposes and morally problematic (if not completely objectionable).
This paper builds on a detailed analysis of Holling’s early work to propose an alternative narrative. The study will show that the “complexity turn” has often entangled theoretical and pragmatic aims. Accordingly, Holling’s primary aim was to fight what he termed “pathologies of natural resource management” or “pathologies of command and control management”, and so, the terms of his reform of ecosystem science are partly subordinate to the details of his proposal for reforming the management sciences. As regards resilience, Holling used it as a polysemous, ambiguous and normative term: sometimes, as an instrumental value that is closely related to various stability concepts; other times, and more crucially, as an intrinsic value and a tool for attacking efficiency and instrumentalism in management. This narrative reveals the limitations of its conventional alternative and has several practical advantages. It captures well the structure and purposes of Holling’s project, and the various roles of resilience in it. It helps to link Holling’s early work with other philosophical and ideological shifts at work in the 70s. It highlights the currency of Holling’s early work for present research and action in fields such as risk and climate adaptation. And it draws attention to morally relevant aspects of resilience that the conventional narrative neglects.

Keywords: resilience, complexity turn, risk management, positivistic, framing

Procedia PDF Downloads 143
294 Metal Extraction into Ionic Liquids and Hydrophobic Deep Eutectic Mixtures

Authors: E. E. Tereshatov, M. Yu. Boltoeva, V. Mazan, M. F. Volia, C. M. Folden III

Abstract:

Room-temperature ionic liquids (RTILs) are a class of liquid organic salts with melting points below 20 °C that are considered environmentally friendly 'designer' solvents. Pure hydrophobic ILs are known to extract metallic species from aqueous solutions. The closest analogues of ionic liquids are deep eutectic solvents (DESs), eutectic mixtures of at least two compounds with a melting point lower than that of each individual component. DESs are acknowledged to be attractive for organic synthesis and metal processing. These non-volatile and less toxic compounds are thus of interest for critical-metal extraction. The US Department of Energy and the European Commission consider indium a key metal. Its chemical homologue, thallium, is also an important material for some applications and for environmental safety. The aim of this work is to systematically investigate In and Tl extraction from aqueous solutions into pure fluorinated ILs and hydrophobic DESs. The dependence of the Tl extraction efficiency on the structure and composition of the ionic liquid ions, the metal oxidation state, and the initial metal and aqueous acid concentrations has been studied. The extraction efficiency of the anionic species TlXz^(3−z) (where X = Cl– and/or Br–) is greater for ionic liquids with more hydrophobic cations. Unexpectedly high distribution ratios (> 10³) of Tl(III) were determined even with a pure ionic liquid as the receiving phase. An improved mathematical model based on ion-exchange and ion-pair formation mechanisms has been developed to describe the co-extraction of two different anionic species, and the relative contributions of each mechanism have been determined. The first evidence of indium extraction into new quaternary ammonium- and menthol-based hydrophobic DESs from hydrochloric and oxalic acid solutions, with distribution ratios up to 10³, is also provided.
The data obtained allow us to interpret the mechanisms of thallium and indium extraction into IL and DES media. Understanding the chemical behavior of Tl and In in these new media is imperative for further improving the separation and purification of these elements.
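For readers unfamiliar with the distribution ratios quoted above, the standard solvent-extraction bookkeeping is sketched below; the concentrations are illustrative, not measured values from this work.

```python
# Standard solvent-extraction bookkeeping behind distribution ratios
# such as the D > 10^3 values reported above; the concentrations below
# are illustrative, not measured values from this work.

def distribution_ratio(c_org, c_aq):
    """D = metal concentration in the organic (IL/DES) phase divided by
    the concentration left in the aqueous phase at equilibrium."""
    return c_org / c_aq

def extraction_percent(D, v_aq=1.0, v_org=1.0):
    """Percent extracted in a single contact at volume ratio v_aq/v_org."""
    return 100.0 * D / (D + v_aq / v_org)

D = distribution_ratio(c_org=9.99, c_aq=0.01)   # D = 999
E = extraction_percent(D)                       # about 99.9 % at equal volumes
```

A distribution ratio above 10³ therefore corresponds to essentially quantitative (> 99.9 %) extraction in a single equal-volume contact.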

Keywords: deep eutectic solvents, indium, ionic liquids, thallium

Procedia PDF Downloads 220
293 Cross-cultural Training in International Cooperation Efforts

Authors: Shawn Baker-Garcia, Janna O. Schaeffer

Abstract:

As global and national communities and governments strive to address ongoing and evolving threats to humanity and pervasive or emerging 'shared' global priorities in the environmental, economic, political, and security domains, it is more urgent than ever to understand each other, communicate effectively with one another, and identify models of cooperation that yield improved, mutually reinforcing outcomes across and within cultures. It is against this backdrop that the presentation examines whether cultural training as we have approached it in recent decades sufficiently meets our current needs and what changes might foster better, more productive and sustainable intercultural interactions. Domestic and global relations face multiple challenges to peaceable cooperation. The last two years, in particular, have been defined by a travel-restricted COVID-19 pandemic yielding increased intercultural interaction over virtual platforms, polarized politics dividing nations and regions, and the commensurate rise in weaponized social and traditional media communication. These societal and cultural fissures are noticeably challenging our collective and individual abilities to interact constructively both at home and abroad. It is within this pressure-cooker environment that the authors believe it is time to re-examine existing and broadly accepted inter- and cross-cultural training approaches and concepts to determine their effectiveness in setting conditions for optimal human understanding and relationships in both national and international contexts.
In order to better understand the amount and type of intercultural training that practitioners professionally engaged in international partnership building have received throughout their careers, and its perceived effectiveness, a survey was designed and distributed to US and international professionals presently engaged in diplomacy, the military, academia, and international business. The survey questions were designed to address the two primary research questions the investigators posed in this exploratory study. The research questions aimed to examine practitioners' views of the role and effectiveness of current and traditional cultural training and education as a means of fostering improved communication, interaction, understanding, and cooperation among inter-, cross-, or multi-cultural communities and efforts. Responses were then collected and analyzed for themes present in the participants' reflections. In their responses, the practitioners identified areas of improvement and desired outcomes regarding intercultural training and awareness-raising curricular approaches. They also raised issues pertaining directly and indirectly to the role of foreign-language proficiency in intercultural interactions and the need for a solid grasp of cultural and regional issues (regional expertise) to facilitate such interactions. Respondents indicated knowledge, skills, abilities, and capabilities that they were not trained on but learned through ad hoc personal and professional intercultural interactions, which they found most valuable and wished they had acquired prior to the intercultural experience.

Keywords: cultural training, improved communication, intercultural competence, international cooperation

Procedia PDF Downloads 109
292 Psychogeographic Analysis of Spatial Appropriation within Walking Practice: The City Centre versus University Campus in the Case of Van, Turkey

Authors: Yasemin Ilkay

Abstract:

Urban spatial patterns interact with the minds and bodies of citizens and influence their perception and attitudes, which leads to a two-fold map of the same space: a physical map and a psychogeographic map. Psychogeography is a field of inquiry, rooted in literature and fiction, investigating how the environment affects the feelings and behaviors of individuals. The term was posed in the 1950s by Guy Debord of the Situationist International movement; in the course of time, the artistic framework evolved into a political one, especially through the term dérive, which indicates 'drift' and 'resistance' to the existing spatial reality. The term dérive appeared on the track of the flâneur a hundred years later and became a political tool to transform everyday urban life. The three main concepts of psychogeography [walking, dérive, and palimpsest] construct the epistemological framework for a psychogeographic spatial analysis. Mental representations investigated within this framework allow a designer to capture the invisible layers of the gap between 'how a space is conceived' and 'how the same space is perceived and experienced.' This gap is a neglected but critical issue in the planning discipline, and psychogeography provides methodological inputs to cover the interrelation between top-down designs of urban patterning and bottom-up reproductions of 'the soul' of urban space, at the intersection of geography and psychology. City centers and university campuses exemplify opposite poles of spatial organization and walking practice, which may result in differentiated forms of spatial appropriation. Van has a traditional city center, located at its core, with a dense population and several activities, but not connected to Lake Van, the largest lake in the country.
On the other hand, the university campus is located on the periphery, and although it has a promenade along the lake's coast and a regional hospital, it offers a limited walking experience with ambiguous forms of spatial appropriation. The city center hosts a vivid everyday urban life, whereas the campus presents a relatively natural life far from the center. This paper aims to reveal the differentiated psychogeographic maps of spatial appropriation in the city center versus the university campus, which lies on the periphery of the city along the coast of the largest lake in Turkey. The main question of the paper is how the psychogeographic maps of spatial appropriation within the walking experience differ between the city center and the university campus in Van, with reference to the two-fold map assumption. The experiential maps of a core group of 15 planning students will be created using mental mapping, photographing, and narratives gathered through attentive walks conducted together on selected routes; in addition to these attentive walks, 30 further in-depth interviews will be conducted by the core group. The narrative of the psychogeographic mapping of spatial appropriation at the two spatial poles would display the conflicting soul of the city with reference to sub-behavioural regions of walking, differentiated forms of dérive, and layers of palimpsest.

Keywords: attentive walk, body, cognitive geography, derive, experiential maps, psychogeography, Van, Turkey

Procedia PDF Downloads 58
291 Disaster Management Supported by Unmanned Aerial Systems

Authors: Agoston Restas

Abstract:

Introduction: This paper describes several initiatives and recent practical examples of using Unmanned Aerial Systems (UAS) to support disaster management. Since operating manned aircraft at disasters is usually expensive and often impossible, managers frequently forgo aerial activity altogether. UAS can be an alternative and cost-effective solution for supporting disaster management. Methods: This article uses a thematic division of UAS applications based on two key elements: the time flow of managing disasters and its tactical requirements. Logically, UAS can be used in pre-disaster activity, in activity immediately after the occurrence of a disaster, and in activity after the primary disaster response. The paper addresses different disasters, such as dangerous material releases, floods, earthquakes, forest fires and human-induced disasters. The research used function analysis, practical experiments, mathematical formulas, economic analysis and expert estimation. The author gathered international examples and drew on his own experience in this field as well. Results and discussion: An earthquake is a rapidly escalating disaster in which aerial reconnaissance is often the only way to achieve a rapid damage assessment. For special rescue teams, UAS can greatly help in quickly locating sites where enough space remained for victims to survive. Floods are a typical slow-onset disaster; managing them, in contrast, is a very complex and difficult task, requiring continuous monitoring of dykes and of flooded and threatened areas. UAS can substantially help managers keep an area under observation. Forest fires are disasters for which the tactical application of UAS is already well developed: it can be used for fire detection, intervention monitoring and post-fire monitoring.
In the case of a nuclear accident or hazardous material leakage, UAS can be a very effective, or even the only, tool for supporting disaster management. The paper also shows some efforts to use UAS to avoid human-induced disasters in low-income countries as part of health cooperation.

Keywords: disaster management, floods, forest fires, Unmanned Aerial Systems

Procedia PDF Downloads 209
290 Study of the Adsorptive Properties of X Zeolites Exchanged with the Cations Cu2+ and/or Zn2+

Authors: H. Hammoudi, S. Bendenia, I. Batonneau-Gener, A. Khelifa

Abstract:

The growing application of zeolites stems from their intrinsic physicochemical properties: a regular porous structure generating a large free volume, a high specific surface area, acidic properties at the origin of their activity, and energetic and dimensional selectivity leading to a sieving phenomenon, hence the name 'molecular sieves' generally attributed to them. Most of the special properties of zeolites have been exploited in direct applications such as ion exchange, adsorption, separation and catalysis. Owing to their stable crystalline structure, their large pore volume and their high cation content, X zeolites are widely used in adsorption and separation processes. The acidic properties of X zeolites, and the interesting selectivity conferred on them by their porous structure, also make them potential catalysts. The study presented in this manuscript is devoted to the chemical modification of an X zeolite by cation exchange. Ion exchange of zeolite NaX by Zn2+ and/or Cu2+ cations is conducted gradually, following the evolution of some of its characteristics: crystallinity by XRD and micropore volume by nitrogen adsorption. Once characterized, the different samples are used for the adsorption of propane and propylene. Particular attention is then paid to the modeling of the adsorption isotherms. In this vein, various isotherm equations for localized and mobile adsorption, some taking into account adsorbate-adsorbate interactions, are used to describe the experimental isotherms. We also used the Toth equation, a mathematical model with three parameters whose adjustment requires nonlinear regression. The last part is dedicated to the study of the acid properties of Cu(x)X, Zn(x)X and CuZn(x)X by adsorption-desorption of pyridine followed by IR. The effect of substituting Na+ by Cu2+ and/or Zn2+ cations at different rates on the crystallinity and on the textural properties is treated.
Some results on the morphology of the crystallites and the thermal effects during a temperature rise, obtained by scanning electron microscopy and DTA-TGA thermal analyzer, respectively, are also reported. The acidity of our different samples was also studied. Thus, the nature and strength of each type of acidity are estimated. The evaluation of these various features will provide a comparison between Cu (x) X, Zn (x) X and CuZn (x) X. One study on adsorption of C3H8 and C3H6 in NaX, Cu (x) X , Zn (x) x and CuZn (x) x has been undertaken.
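As an illustration of the kind of three-parameter adjustment the abstract mentions, the sketch below fits the Toth isotherm by nonlinear regression. The uptake data, initial guesses, and units are hypothetical placeholders, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def toth(p, qm, b, t):
    # Toth isotherm: q = qm * b * p / (1 + (b*p)**t)**(1/t)
    # qm: saturation capacity, b: affinity constant, t: heterogeneity exponent
    return qm * b * p / (1.0 + (b * p) ** t) ** (1.0 / t)

# Hypothetical propane uptake data (pressure in kPa, loading in mmol/g)
p = np.array([1, 5, 10, 25, 50, 100, 200], dtype=float)
q = np.array([0.35, 1.10, 1.55, 2.05, 2.35, 2.55, 2.70])

# Nonlinear regression of the three parameters from an initial guess
popt, _ = curve_fit(toth, p, q, p0=[3.0, 0.1, 0.7], maxfev=10000)
qm, b, t = popt
print(f"qm = {qm:.3f} mmol/g, b = {b:.4f} 1/kPa, t = {t:.3f}")
```

The same fit would be repeated for each sample (NaX, Cu(x)X, Zn(x)X, CuZn(x)X) and each adsorbate to compare the fitted parameters.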

Keywords: adsorption, acidity, ion exchange, zeolite

Procedia PDF Downloads 179
289 Study of Properties of Concretes Made of Local Building Materials and Containing Admixtures, and Their Further Introduction in Construction Operations and Road Building

Authors: Iuri Salukvadze

Abstract:

The development of the Georgian economy largely depends on the effective use of its potential as a transit country. The value of Georgia as part of the Europe-Asia corridor has increased, raising the interest of western and eastern countries in Georgia as a country lying on the transit axis; this implies the creation and development of transit infrastructure in Georgia. It is important to use compacted concrete with additives in the modern road construction industry. Even in the 21st century, concrete remains the main constructive building material, and innovative, economical, and environmentally sound technologies are therefore needed. The Georgian construction market requires the use of new-generation concrete and the adaptation of nanotechnologies to local realities, which will make it possible to create multifunctional, nanotechnological, highly effective materials. It is highly important to study their physical and mechanical states. The study of compacted concrete with additives is necessary for its future use in road construction and for increasing the durability of roads in Georgia. The aim of the research is to study the physical-mechanical properties of compacted concrete with additives based on local materials. Any experimental study needs, on the one hand, a large number of experiments in order to achieve high accuracy and, on the other hand, an optimal number of experiments with minimal expense in the shortest period of time. To solve this problem in practice, statistical and mathematical methods of experiment planning can be used. For the study of material properties, we will use the hypothesis that measurement results are distributed according to the normal law, under which divergence of the obtained results is caused by the error of the method and the inhomogeneity of the object.
As a result of the study, we will obtain durable compacted concrete with additives for motor roads, which will improve road infrastructure and yield savings during the construction and exploitation of roads.

Keywords: construction, seismic protection systems, soil, motor roads, concrete

Procedia PDF Downloads 216
288 Designing Disaster Resilience Research in Partnership with an Indigenous Community

Authors: Suzanne Phibbs, Christine Kenney, Robyn Richardson

Abstract:

The Sendai Framework for Disaster Risk Reduction called for the inclusion of indigenous people in the design and implementation of all hazard policies, plans, and standards. Ensuring that indigenous knowledge practices were included alongside scientific knowledge about disaster risk was also a key priority. Indigenous communities have specific knowledge about climate and natural hazard risk that has been developed over an extended period of time. However, research within indigenous communities can be fraught with issues such as power imbalances between the researcher and researched, the privileging of researcher agendas over community aspirations, as well as appropriation and/or inappropriate use of indigenous knowledge. This paper documents the process of working alongside a Māori community to develop a successful community-led research project. Research Design: This case study documents the development of a qualitative community-led participatory project. The community research project utilizes a kaupapa Māori research methodology which draws upon Māori research principles and concepts in order to generate knowledge about Māori resilience. The research addresses a significant gap in the disaster research literature relating to indigenous knowledge about collective hazard mitigation practices as well as resilience in rurally isolated indigenous communities. The research was designed in partnership with the Ngāti Raukawa Northern Marae Collective as well as Ngā Wairiki Ngāti Apa (a group of Māori sub-tribes who are located in the same region) and will be conducted by Māori researchers utilizing Māori values and cultural practices. The research project aims and objectives, for example, are based on themes that were identified as important to the Māori community research partners. The research methodology and methods were also negotiated with and approved by the community. 
Kaumātua (Māori elders) provided cultural and ethical guidance over the proposed research process and will continue to provide oversight over the conduct of the research. Purposive participant recruitment will be facilitated with support from local Māori community research partners, utilizing collective marae networks and snowballing methods. It is envisaged that Māori participants’ knowledge, experiences and views will be explored using face-to-face communication research methods such as workshops, focus groups and/or semi-structured interviews. Interviews or focus groups may be held in English and/or Te Reo (Māori language) to enhance knowledge capture. Analysis, knowledge dissemination, and co-authorship of publications will be negotiated with the Māori community research partners. Māori knowledge shared during the research will constitute participants’ intellectual property. New knowledge, theory, frameworks, and practices developed by the research will be co-owned by Māori, the researchers, and the host academic institution. Conclusion: An emphasis on indigenous knowledge systems within the Sendai Framework for Disaster Risk Reduction risks the appropriation and misuse of indigenous experiences of disaster risk identification, mitigation, and response. The research protocol underpinning this project provides an exemplar of collaborative partnership in the development and implementation of an indigenous project that has relevance to policymakers, academic researchers, other regions with indigenous communities and/or local disaster risk reduction knowledge practices.

Keywords: community resilience, indigenous disaster risk reduction, Maori, research methods

Procedia PDF Downloads 106
287 An Evaluation of the Auxiliary Instructional App Amid Learning Chinese Characters for Children with Specific Learning Disorders

Authors: Chieh-Ning Lan, Tzu-Shin Lin, Kun-Hao Lin

Abstract:

Chinese handwriting is one of the basic skills of school-age children in Taiwan and supports their learning in most academic subjects. Unlike alphabetic language systems, the Chinese written language is a logographic script with a complicated two-dimensional character structure as a morpheme. Visuospatial ability plays a great role in Chinese handwriting, which requires maintaining good proportion and alignment among interwoven strokes. In Taiwan, school-age students face the challenge of recognizing and writing Chinese characters, especially children with written expression difficulties (CWWDs). In this study, we developed an instructional app to help CWWDs practice Chinese handwriting skills, and we aimed to apply a mobile-assisted language learning (MALL) system to clinical writing strategies. To understand the feasibility of and satisfaction with this auxiliary instructional writing app, we investigated the perceptions and evaluations of both school-age students and clinical therapists, i.e., the target users and the experts. A group of 8 elementary school children, as well as 8 clinical therapists, was recruited. The school-age students went through a paper-based instruction and were asked to score the visual presentations based on their graphic preferences; the clinical therapists watched an introductory video of the instructional app and completed an online formative questionnaire. The results show, from the perspective of user interface design, that school-age students were more attracted to cartoon-like pictures than to line drawings or vivid photos. Moreover, compared to text, pictures with higher semantic transparency were more commonly chosen by children. In the quantitative survey, the clinical therapists were highly satisfied with the auxiliary instructional writing app, including concepts such as its visual design, teaching contents, and positive reinforcement system.
Furthermore, the qualitative results provided consistently positive feedback on the teaching contents and on the feasibility of integrating the app into clinical treatments. Interestingly, the clinical therapists showed high agreement in approving CWWDs' ability to write using orthographic knowledge; however, in the qualitative section, they pointed out that CWWDs usually have relatively insufficient background knowledge of Chinese orthographic rules, because these are not a key point of conventional handwriting instruction. Previous studies have also indicated that conventional Chinese reading and writing instruction lacks visual-spatial arrangement strategies. Based on the experiences shared by all participants, we identified several topics worth pursuing in the future. Improvements and revisions will be applied to the design of this ongoing app system to establish a better and more useful instructional system for CWWDs within their treatment; informed by the opinions on learning content, the importance of orthographic knowledge in Chinese character recognition should be thoroughly discussed and incorporated into interventions for CWWDs in the future.

Keywords: auxiliary instructional app, children with writing difficulties, Chinese handwriting, orthographic knowledge

Procedia PDF Downloads 153
286 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. Expressing the Earth's surface as a mathematical model would require an infinite number of point measurements. Because this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM is generated from them. Hitherto, classical measurement techniques and photogrammetry have been widely used in the construction of DTMs. At present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications; LiDAR technology creates a 3D point cloud by acquiring numerous point measurements. More recently, developments in image mapping methods and the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors, such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, the random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by a random algorithm, representing 75, 50, 25, and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method.
The results show that the random data reduction method can reduce image-based point cloud data sets to the 50% density level while still maintaining the quality of the DTM.
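A minimal sketch of the random reduction step on a synthetic point cloud follows; linear gridding stands in for the Kriging interpolation used in the study, and the terrain function, grid, and 50% level are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(42)

# Synthetic image-based point cloud over a 100 m x 100 m area
n = 5000
xy = rng.uniform(0.0, 100.0, size=(n, 2))
z = np.sin(xy[:, 0] / 15.0) * 5.0 + 0.05 * xy[:, 1]  # smooth "terrain"

# Regular evaluation grid, kept inside the convex hull of the points
gx, gy = np.meshgrid(np.linspace(5, 95, 50), np.linspace(5, 95, 50))

def dtm(points, values):
    # Interpolate scattered points onto the grid (linear here; Kriging in the study)
    return griddata(points, values, (gx, gy), method="linear")

reference = dtm(xy, z)

# Random reduction to 50% of the original point density
keep = rng.choice(n, size=n // 2, replace=False)
reduced = dtm(xy[keep], z[keep])

# Quality loss of the reduced-data DTM relative to the full-data DTM
rmse = float(np.sqrt(np.nanmean((reduced - reference) ** 2)))
print(f"RMSE of 50% DTM vs. full DTM: {rmse:.3f} m")
```

Repeating the reduction at 75, 25, and 5% and recomputing the RMSE would reproduce the shape of the comparison reported above.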

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging

Procedia PDF Downloads 135
285 Reduced General Dispersion Model in Cylindrical Coordinates and Isotope Transient Kinetic Analysis in Laminar Flow

Authors: Masood Otarod, Ronald M. Supkowski

Abstract:

This abstract discusses a method that reduces the general dispersion model in cylindrical coordinates to a second-order linear ordinary differential equation with constant coefficients, so that it can be used to conduct kinetic studies in packed-bed tubular catalytic reactors over a broad range of Reynolds numbers. The model was tested by 13CO isotope transient tracing of CO adsorption in the Boudouard reaction in a differential reactor at an average Reynolds number of 0.2 over a Pd-Al2O3 catalyst. Detailed experimental results have provided evidence for the validity of the theoretical framing of the model, and the estimated parameters are consistent with the literature. Solving the general dispersion model requires knowledge of the radial distribution of the axial velocity, which is not always available. Hence, until now, the implementation of the dispersion model has been largely restricted to the plug-flow regime. But ideal plug flow is impossible to achieve, and flow regimes approximating plug flow leave much room for debate as to the validity of the results. The reduction of the general dispersion model follows from the application of a factorization theorem. The factorization theorem is derived from the observation that a cross section of a catalytic bed consists of a solid phase across which the reaction takes place and a void or porous phase across which no significant measure of reaction occurs. The disparity in flow and the heterogeneity of the catalytic bed cause the concentrations of the reacting compounds to fluctuate radially. These variabilities signify the existence of radial positions at which the radial gradient of concentration is zero. Succinctly, the factorization theorem states that a concentration function of the axial and radial coordinates in a catalytic bed is factorable as the product of the mean radial cup-mixing function and a contingent dimensionless function.
The concentrations of adsorbed compounds are also factorable, since they are piecewise continuous functions and suffer the same variability, but in the reverse order of the concentrations of the mobile-phase compounds. Factorability is a property of packed beds that transforms the general dispersion model into an equation in terms of the measurable mean radial cup-mixing concentrations of the mobile-phase compounds and the mean cross-sectional concentrations of the adsorbed species. The reduced model does not require knowledge of the radial distribution of the axial velocity. Instead, it is characterized by new transport parameters, denoted Ωc, Ωa, and Ωr, which are respectively denominated the convection coefficient cofactor, the axial dispersion coefficient cofactor, and the radial dispersion coefficient cofactor. These cofactors adjust the dispersion equation to compensate for the unavailability of the radial distribution of the axial velocity. Together with the rest of the kinetic parameters, they can be determined from experimental data via an optimization procedure. Our data showed that the estimated parameters Ωc, Ωa, and Ωr are monotonically correlated with the Reynolds number, as expected from the theoretical construct of the model. Computer-generated simulations of the methanation reaction on nickel provide additional support for the utility of the newly conceptualized dispersion model.
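The practical payoff of the reduction can be illustrated generically: a second-order linear ODE with constant coefficients is solved directly from its characteristic roots. The coefficients below (dispersion, velocity, rate) are arbitrary placeholders, not parameters from the study.

```python
import numpy as np

# Generic steady-state balance of the reduced form:
#   D * C''(z) - u * C'(z) - k * C(z) = 0
D, u, k = 1e-4, 0.01, 0.05  # hypothetical dispersion, velocity, rate constants

# Characteristic roots of D*r**2 - u*r - k = 0
disc = u**2 + 4.0 * D * k
r1 = (u + np.sqrt(disc)) / (2.0 * D)  # growing root
r2 = (u - np.sqrt(disc)) / (2.0 * D)  # decaying root

# For a long bed with C(0) = C0 and C bounded as z -> infinity,
# only the decaying root survives: C(z) = C0 * exp(r2 * z)
C0 = 1.0
z = np.linspace(0.0, 0.5, 6)
C = C0 * np.exp(r2 * z)
print(f"decaying root r2 = {r2:.3f} 1/m")
```

Once the equation is in this constant-coefficient form, the cofactors and kinetic parameters can be folded into D, u, and k and fitted to transient tracer data by optimization.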

Keywords: factorization, general dispersion model, isotope transient kinetic, partial differential equations

Procedia PDF Downloads 246
284 Transmission Dynamics of Lumpy Skin Disease in Ethiopia

Authors: Wassie Molla, Klaas Frankena, Mart De Jong

Abstract:

Lumpy skin disease (LSD) is a severe viral disease of cattle, which often occurs in epidemic form. It is caused by lumpy skin disease virus, of the genus Capripoxvirus in the family Poxviridae. Mathematical models play an important role in the study of infectious disease epidemiology: they help to explain the dynamics and to understand the transmission of an infectious disease within a population. Understanding the transmission dynamics of lumpy skin disease between animals is important for implementing effective prevention and control measures against the disease. This study was carried out in the central and north-western parts of Ethiopia with the objectives of understanding LSD outbreak dynamics, quantifying transmission between animals and between herds, and estimating the reproduction ratio of the disease in dominantly crop-livestock mixed and commercial herd types. Field observation and a follow-up study were undertaken, and the transmission parameters were estimated based on an SIR epidemic model, in which individuals are susceptible (S), infected and infectious (I), or recovered and immune or dead (R), using the final-size and generalized linear model methods. The results showed higher morbidity in infected crop-livestock mixed production system herds (24.1%) than in infected commercial production system herds (17.5%), whereas mortality was higher in the intensive (4.0%) than in the crop-livestock (1.5%) system, and the differences were statistically significant. The transmission rates among animals and between herds were 0.75 and 0.68 per week, respectively, in the dominantly crop-livestock production system. The transmission study undertaken in the dominantly crop-livestock production system highlighted a statistically significant seasonal difference in LSD transmission among animals.
The reproduction numbers of LSD in the dominantly crop-livestock production system were 1.06 among animals and 1.28 between herds, whereas they varied from 1.03 to 1.31 among animals in the commercial production system. Although the reproduction numbers estimated for LSD in the different production systems and localities are greater than 1, their magnitude is low, implying that the disease can be controlled by implementing appropriate measures.
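For a homogeneous SIR epidemic, the final-size relation links the fraction ultimately infected to the reproduction number. The sketch below applies it to a morbidity figure like those reported, purely as an illustration of the final-size method; the study's GLM-based estimates account for more structure than this.

```python
import math

def reproduction_number_from_final_size(attack_fraction):
    # SIR final-size relation in a fully susceptible, homogeneous population:
    #   z = 1 - exp(-R * z)   =>   R = -ln(1 - z) / z
    z = attack_fraction
    if not 0.0 < z < 1.0:
        raise ValueError("attack fraction must be in (0, 1)")
    return -math.log(1.0 - z) / z

# Hypothetical within-herd attack fraction, similar to the 24.1% morbidity
# reported for crop-livestock herds
R = reproduction_number_from_final_size(0.241)
print(f"Estimated R: {R:.2f}")
```

An attack fraction of 24.1% gives R just above 1, in line with the low reproduction numbers reported above; R near 1 means even modest control measures can push the epidemic below the threshold.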

Keywords: commercial, crop-livestock, Ethiopia, LSD, reproduction number, transmission

Procedia PDF Downloads 273
283 Participatory Monitoring Strategy to Address Stakeholder Engagement Impact in Co-creation of NBS Related Project: The OPERANDUM Case

Authors: Teresa Carlone, Matteo Mannocchi

Abstract:

In the last decade, a growing number of international organizations have been pushing toward green solutions for adaptation to climate change. This is particularly true in the fields of Disaster Risk Reduction (DRR) and land planning, where Nature-Based Solutions (NBS) have been sponsored through funding programs and planning tools. Stakeholder engagement and co-creation of NBS are growing as a practice and a research field in environmental projects, fostering the consolidation of a multidisciplinary socio-ecological approach to addressing hydro-meteorological risk. Even though research and financial interest are constantly spreading, the NBS mainstreaming process is still at an early stage, as innovative concepts and practices are difficult for a multitude of different actors to fully accept and adopt in a way that produces wide-scale societal change. Monitoring and evaluating the impact of stakeholders' participation in these processes is a crucial aspect and should be seen as a continuous and integral element of the co-creation approach. However, setting up a fit-for-purpose monitoring strategy for different contexts is not an easy task, and multiple challenges emerge. In this scenario, the Horizon 2020 OPERANDUM project, designed to address the major hydro-meteorological risks that negatively affect European rural and natural territories through the co-design, co-deployment, and assessment of Nature-Based Solutions, represents a valid case study for testing a monitoring strategy from which to derive a broader, general, and scalable monitoring framework. Applying a participative monitoring methodology based on a selected list of indicators that combines quantitative and qualitative data developed within the activities of the project, the paper proposes an experimental in-depth analysis of the impact of stakeholder engagement on the co-creation process of NBS.
The main focus will be to identify and analyze the factors that increase knowledge, social acceptance, and mainstreaming of NBS, and to promote an experience-based guideline that could be integrated with the stakeholder engagement strategy in current and future strongly collaborative environmental projects such as OPERANDUM. Measurement will be carried out through surveys submitted at different times to the same sample of stakeholders (policy makers, businesses, researchers, interest groups). Changes will be recorded and analyzed through focus groups in order to highlight causal explanations and to assess the proposed list of indicators for steering similar activities in other projects and/or contexts. The aim of the paper is to contribute to the construction of a more structured and shared corpus of indicators that can support the evaluation of activities involving various levels of stakeholders in the co-production, planning, and implementation of NBS to address climate change challenges.

Keywords: co-creation and collaborative planning, monitoring, nature-based solution, participation & inclusion, stakeholder engagement

Procedia PDF Downloads 95
282 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet and slow development give way to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector: the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, offer a satisfactory explanation of these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as the natural origin and development of life, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and of the biosphere's intelligence. The intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere.
Random mutations trigger this software, as stipulated by Darwinian evolutionary theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or global memory storage volume reaches its limit, and b) biosphere computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of the creation and evolution of life. It logically resolves many puzzling problems of the current state of evolutionary theory: speciation, as a result of GM purposeful design; the vector of evolutionary development, as a need for growing global intelligence; punctuated equilibrium, happening when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, happening when more intelligent species replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 143