Search results for: information warfare techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16297

9757 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh

Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi

Abstract:

Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain, in the Zagros mountainous region of Iran, was studied in terms of its glacial activity rate. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain. We processed Landsat 8 imagery for optical data and Sentinel-1a imagery for radar data. We used methods that are commonly applied to measure glacier surface movements, such as cross-correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed time-series glacier surface displacement using our modified method, Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, reduction of atmospheric noise, and removal of the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity of 32 mm/yr in the Zardkooh mountainous areas.
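
As a concrete illustration of the image cross-correlation step, the sketch below estimates the pixel offset between two co-registered acquisitions by phase correlation. The window size, pixel spacing, and one-year baseline are invented for the example; this is not the authors' Landsat 8/Sentinel-1a processing chain.

```python
import numpy as np

def pixel_offset(ref, sec):
    """Phase correlation: the peak of the normalized cross-power
    spectrum gives the shift of `sec` relative to `ref`."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(sec)
    cross = np.conj(F1) * F2
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the window to negative values
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Toy pair: a 64x64 scene displaced by 3 rows and 5 columns
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
sec = np.roll(ref, shift=(3, 5), axis=(0, 1))
dy, dx = pixel_offset(ref, sec)
# With, say, 10 m pixels and a 1-year pair, displacement -> mm/yr
print(dy, dx, np.hypot(dy, dx) * 10 * 1000, "mm/yr")
```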

Keywords: active rock glaciers, landsat 8, sentinel-1a, zagros mountainous region

Procedia PDF Downloads 75
9756 Towards Long-Range Pixels Connection for Context-Aware Semantic Segmentation

Authors: Muhammad Zubair Khan, Yugyung Lee

Abstract:

Deep learning has recently achieved an enormous response in semantic image segmentation. Previously developed U-Net-inspired architectures operate with successive stride and pooling operations, leading to spatial information loss. These methods also fail to establish long-range pixel connections that preserve contextual knowledge and reduce spatial loss in prediction. This article develops an encoder-decoder architecture with bi-directional LSTMs embedded in long skip connections and densely connected convolution blocks. The network non-linearly combines the feature maps across encoder-decoder paths to find dependencies and correlations between image pixels. Additionally, the densely connected convolutional blocks are kept in the final encoding layer to reuse features and prevent redundant data sharing. The method applies batch normalization to reduce internal covariate shift in the data distributions. The empirical evidence shows a promising response for our method compared with other semantic segmentation techniques.
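
A minimal PyTorch sketch of the central idea, a bidirectional LSTM inside a long skip connection so each pixel's features mix with distant pixels in the same row before fusion with the decoder path, is shown below. The layer sizes and the toy network around it are invented for illustration; this is not the authors' architecture.

```python
import torch
import torch.nn as nn

class BiLSTMSkip(nn.Module):
    """Run a bidirectional LSTM along the width of a feature map so each
    pixel's features mix with distant pixels in the same row."""
    def __init__(self, channels):
        super().__init__()
        self.lstm = nn.LSTM(channels, channels // 2,
                            batch_first=True, bidirectional=True)

    def forward(self, x):                                  # x: (B, C, H, W)
        b, c, h, w = x.shape
        seq = x.permute(0, 2, 3, 1).reshape(b * h, w, c)   # rows as sequences
        out, _ = self.lstm(seq)                            # (B*H, W, C)
        return out.reshape(b, h, w, c).permute(0, 3, 1, 2)

class TinySegNet(nn.Module):
    def __init__(self, in_ch=3, n_classes=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1),
                                 nn.BatchNorm2d(32), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(32, 32, 3, padding=1),
                                 nn.BatchNorm2d(32), nn.ReLU())
        self.skip = BiLSTMSkip(32)        # long skip with pixel mixing
        self.up = nn.Upsample(scale_factor=2, mode="bilinear",
                              align_corners=False)
        self.head = nn.Conv2d(64, n_classes, 1)

    def forward(self, x):
        e = self.enc(x)
        d = self.up(self.mid(self.pool(e)))
        return self.head(torch.cat([self.skip(e), d], dim=1))

logits = TinySegNet()(torch.randn(1, 3, 64, 64))   # -> (1, 2, 64, 64)
```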

Keywords: deep learning, semantic segmentation, image analysis, pixels connection, convolution neural network

Procedia PDF Downloads 99
9755 Synthesis of Polyvinyl Alcohol Encapsulated Ag Nanoparticle Film by Microwave Irradiation for Reduction of P-Nitrophenol

Authors: Supriya, J. K. Basu, S. Sengupta

Abstract:

Silver nanoparticles have attracted a lot of attention because of their unique physical and chemical properties. Silver nanoparticles embedded in a polyvinyl alcohol (PVA/Ag) free-standing film have been prepared by microwave irradiation in a few minutes. PVA performs as the reducing agent, the stabilizing agent, and the support for the silver nanoparticles. UV-Vis spectrometry, scanning electron microscopy (SEM), and transmission electron microscopy (TEM) techniques confirmed the reduction of silver ions to silver nanoparticles in the polymer matrix. The effects of irradiation time, concentration of PVA, and concentration of silver precursor on the synthesis of the silver nanoparticles have been studied. The particle size of the silver nanoparticles decreases with increasing irradiation time, while their concentration increases with increasing concentration of silver precursor. Good dispersion of the silver nanoparticles in the film has been confirmed by TEM analysis, and the particle size has been found to be in the range of 2-10 nm. The catalytic activity of the prepared silver nanoparticles as a heterogeneous catalyst has been studied in the reduction of p-nitrophenol (a water pollutant), with >98% conversion. From the experimental results, it can be concluded that the PVA-encapsulated Ag nanoparticle film as a catalyst shows good efficiency and reusability in the reduction of p-nitrophenol.
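
Catalytic reduction of p-nitrophenol is conventionally followed by the decay of its ~400 nm absorbance and, with the reductant in large excess, reported as a pseudo-first-order rate constant. The short sketch below shows that arithmetic on invented absorbance values; it does not use the paper's data.

```python
import numpy as np

# Illustrative absorbance of p-nitrophenolate at ~400 nm vs. time (s);
# values are made up for the sketch, not measured data from the paper.
t = np.array([0, 60, 120, 180, 240, 300], dtype=float)
A = np.array([1.20, 0.62, 0.33, 0.17, 0.09, 0.05])

# Excess reductant -> pseudo-first-order: ln(A0/At) = k_app * t
k_app = np.polyfit(t, np.log(A[0] / A), 1)[0]
conversion = 100 * (1 - A[-1] / A[0])
print(f"k_app = {k_app:.4f} 1/s, conversion = {conversion:.1f}%")
```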

Keywords: biopolymer, microwave irradiation, silver nanoparticles, water pollutant

Procedia PDF Downloads 284
9754 The Use of Geographic Information System Technologies for Geotechnical Monitoring of Pipeline Systems

Authors: A. G. Akhundov

Abstract:

Obtaining unbiased data on the status of pipeline systems for oil and oil-product transportation becomes especially important when pipelines are laid and operated under severe natural and climatic conditions. Essential attention is paid here to researching exogenous processes and their impact on the linear facilities of the pipeline system. Reliable operation of pipelines under severe natural and climatic conditions, and timely planning and implementation of compensating measures, are only possible if the operating conditions of the pipeline systems are regularly monitored and changes in permafrost soil and hydrological conditions are accounted for. One of the main reasons for emergency situations is the geodynamic factor. Experience shows that emergency situations occur within areas characterized by certain environmental conditions and develop according to similar scenarios depending on the active processes. The analysis of the natural and technical systems of main pipelines at different stages of monitoring makes it possible to forecast the dynamics of change. The integration of GIS technologies, traditional means of geotechnical monitoring (in-line inspection, geodetic methods, field observations), and remote methods (aerial visual inspection, aerial photography, airborne and ground laser scanning) provides the most efficient solution to the problem. A unified geographic information system (GIS) environment is a convenient way to implement the monitoring system on main pipelines, since it provides the means to describe a complex natural and technical system, and every element thereof, with any set of parameters. Such a GIS enables convenient modelling of main pipelines (both in 2D and 3D), the analysis of situations, and the selection of recommendations to prevent negative natural or man-made processes and to mitigate their consequences. The specifics of such systems include multi-dimensional modelling of the facilities in the pipeline system, mathematical modelling of the processes to be observed, and the use of efficient numerical algorithms and software packages for forecasting and analysis. One of the most interesting possibilities for using the monitoring results is the generation of up-to-date 3D models of a facility and the surrounding area on the basis of airborne laser scanning, aerial photography data, and data from in-line inspection and instrument measurements. The resulting 3D model serves as the basis of an information system providing the means to store and process geotechnical observation data with references to the facilities of the main pipeline, to plan compensating measures, and to control their implementation. The use of GISs for geotechnical monitoring of pipeline systems is aimed at improving the reliability of their operation, reducing the probability of negative events (accidents and disasters), and mitigating their consequences should they occur.

Keywords: databases, 3D GIS, geotechnical monitoring, pipelines, laser scanning

Procedia PDF Downloads 187
9753 A Metaheuristic Approach for Optimizing Perishable Goods Distribution

Authors: Bahare Askarian, Suchithra Rajendran

Abstract:

Maintaining the freshness and quality of perishable goods during distribution is a critical challenge for logistics companies. This study presents a comprehensive framework for optimizing the distribution of perishable goods through a mathematical model of the Transportation Inventory Location Routing Problem (TILRP). The model incorporates the impact of product age on customer demand, addressing the complexities associated with inventory management and routing. To tackle this problem, we develop both simple and hybrid metaheuristic algorithms. The hybrid algorithm combines Biogeography-Based Optimization (BBO) with local search techniques to enhance performance in small- and medium-scale scenarios, and extends the approach to larger-scale challenges. Through extensive numerical simulations and sensitivity analyses across various scenarios, the performance of the proposed algorithms is evaluated, assessing their effectiveness in reaching optimal solutions. The results demonstrate that our algorithms significantly enhance distribution efficiency, offering valuable insights for logistics companies striving to improve their perishable goods supply chains.
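
To make the hybrid concrete, here is a compact, single-objective sketch of BBO-style migration followed by a greedy local search around the incumbent solution, run on a stand-in objective. Population size, rates, and the objective are all illustrative; the actual TILRP model carries routing and inventory constraints that are not modelled here.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                      # stand-in objective; a TILRP cost would go here
    return float(np.sum(x ** 2))

def bbo_local_search(f, dim=10, pop=30, iters=200, lo=-5.0, hi=5.0):
    P = rng.uniform(lo, hi, (pop, dim))
    fit = np.array([f(x) for x in P])
    for _ in range(iters):
        order = np.argsort(fit)                 # best habitats first
        mu = np.linspace(1.0, 0.0, pop)         # emigration rates by rank
        lam = 1.0 - mu                          # immigration rates
        newP = P.copy()
        for rank, i in enumerate(order):
            for d in range(dim):
                if rng.random() < lam[rank]:    # immigrate this feature
                    j = order[rng.choice(pop, p=mu / mu.sum())]
                    newP[i, d] = P[j, d]
                if rng.random() < 0.02:         # small random mutation
                    newP[i, d] = rng.uniform(lo, hi)
        P, fit = newP, np.array([f(x) for x in newP])
        b = np.argmin(fit)                      # greedy local search on the best
        for _ in range(5):
            cand = np.clip(P[b] + rng.normal(0, 0.1, dim), lo, hi)
            if f(cand) < fit[b]:
                P[b], fit[b] = cand, f(cand)
    b = np.argmin(fit)
    return P[b], fit[b]

best_x, best_f = bbo_local_search(sphere)
print(best_f)
```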

Keywords: perishable goods, meta-heuristic algorithm, vehicle routing problem, inventory models

Procedia PDF Downloads 1
9752 Electrochemical Reduction of Carbon-dioxide Using Metal Nano-particles Supported on Nano-Materials

Authors: Mulatu Kassie Birhanu

Abstract:

Electrochemical reduction of CO₂ is an emerging approach for converting it into valuable products while helping to keep its atmospheric level within the permissible limit. Among the many electrocatalysts, gold and copper are efficient and effective; both were synthesized and applied in this research work. The two metal catalysts were prepared in an inert environment with different compositions through a co-reduction process from their corresponding precursors, with multi-walled carbon nanotubes added as a support to enhance conductivity. The catalytic performance in CO₂ reduction was measured for each composition, showing outstanding catalytic activity with a high current density (70 mA/cm² at 0.91 V vs. RHE) and a relatively small onset potential. The catalytic performance, compositions, morphologies, structures, and geometric arrangements were evaluated by electrochemical analysis (LSV, impedance, chronoamperometry, and Tafel plots), EDS, SEM, and XAS, respectively. The composite metals showed better product selectivity and faradaic efficiencies due to the synergistic effects of the combined nanoparticles, in addition to the impact of grain size on the reduction of CO₂. Carbon monoxide, hydrogen, formate, and ethanol are the reduction products, detected and quantified by chromatographic techniques appropriate to the physical state of each product.
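
Product selectivity in CO₂ electrolysis is usually reported as faradaic efficiency, the fraction of the total passed charge consumed by a given product, FE = z·n·F/Q. A worked example with invented numbers (not the paper's measurements):

```python
F = 96485.0  # C per mol of electrons (Faraday constant)

def faradaic_efficiency(n_product_mol, z_electrons, charge_C):
    """FE = z * n * F / Q: share of total charge consumed by one product."""
    return 100.0 * z_electrons * n_product_mol * F / charge_C

# Illustrative numbers: 2 electrons per CO molecule,
# 1.5 umol of CO detected after passing 0.5 C of charge.
print(faradaic_efficiency(1.5e-6, 2, 0.5))  # ~57.9 %
```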

Keywords: carbon dioxide, faradaic efficiency, electrocatalyst, current density

Procedia PDF Downloads 51
9751 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons justify the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of success of the treatments involved, and to differing assessments of the value placed on success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by "best option", i.e., knowing what criteria should guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand the differences in medical practices and facilitates the search for consensus. Here, there are three types of situations: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequences of each decision are known with certainty. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences whose probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data considered in the problem, and second, by setting out a few basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assist the patient and doctor in their choices.
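
For the "risky" case, the standard operational tool is expected utility: weight the utility of each consequence by its known probability and choose the option with the highest sum. A minimal sketch with hypothetical treatments, probabilities, and patient-elicited utilities:

```python
# Hypothetical decision under risk: two treatments, each with known
# outcome probabilities and patient-elicited utilities (0-100 scale).
treatments = {
    "surgery":    [(0.70, 90), (0.25, 40), (0.05, 0)],   # (probability, utility)
    "medication": [(0.50, 80), (0.45, 60), (0.05, 20)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(treatments, key=lambda t: expected_utility(treatments[t]))
for name, outcomes in treatments.items():
    print(name, expected_utility(outcomes))   # surgery 73.0, medication 68.0
print("choose:", best)
```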

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 575
9750 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows

Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican

Abstract:

This paper outlines the design of a simulator that allows for the optimisation of clinical workflows through a pathology laboratory and improves the laboratory’s efficiency in the processing, testing, and analysis of specimens. Pathologists often have difficulty pinpointing and anticipating issues in the clinical workflow until tests are running late or in error; it can be difficult to pinpoint the cause and even more difficult to predict any issues which may arise. For example, they often have no indication of how many samples are going to be delivered to the laboratory that day or at a given hour. If we could model scenarios using past information and known variables, it would be possible for pathology laboratories to initiate resource preparations, e.g. the printing of specimen labels or the activation of a sufficient number of technicians. This would expedite the clinical workload and processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e. the clinical tests being ordered, the specimens arriving, current tests being performed, results being validated and reports being issued. The simulator depicts the movement of specimens through this process, as well as the number of specimens at each stage. This movement is visualised using an animated flow diagram that is updated in real time. A traffic-light colour-coding system is used to indicate the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow). This allows pathologists to clearly see where there are issues and bottlenecks in the process. Graphs are also used to indicate the status of specimens at each stage of the process. For example, a graph could show the percentage of specimen tests that are on time, potentially late, running late and in error. Clicking on potentially late samples displays more detailed information about those samples, the tests that still need to be performed on them, and their urgency level. This allows any issues to be resolved quickly; in the case of potentially late samples, it could help to ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory. JavaScript will be used to program the logic, animate the movement of samples through each of the stages, and generate the status graphs in real time. This live information will be extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. 'Bots' would be used to control the flow of specimens through each step of the process. Like existing software agent technology, these bots would be configurable in order to simulate different situations which may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at that step of the process, for example validating test results.
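
The flow-and-threshold logic is easy to prototype independently of the web front end. The Python sketch below (the actual simulator is a JavaScript single-page application) moves toy specimens through invented stages and applies a traffic-light rule to each queue; stage names and thresholds are illustrative only.

```python
import random
from collections import deque

# Toy stand-in for the simulator's flow logic; stage names and
# traffic-light thresholds are invented for the sketch.
STAGES = ["ordered", "arrived", "testing", "validation", "reported"]
queues = {s: deque() for s in STAGES}

def traffic_light(n_waiting, slow=5, critical=10):
    return "red" if n_waiting >= critical else (
           "orange" if n_waiting >= slow else "green")

random.seed(42)
for specimen in range(30):          # specimens enter the workflow
    queues["ordered"].append(specimen)

for tick in range(8):               # one tick = one time step
    # move specimens one stage downstream, last stage first so no
    # specimen jumps two stages in a single tick
    for src, dst in reversed(list(zip(STAGES[:-1], STAGES[1:]))):
        for _ in range(random.randint(0, 4)):
            if queues[src]:
                queues[dst].append(queues[src].popleft())
    print(tick, {s: (len(q), traffic_light(len(q))) for s, q in queues.items()})
```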

Keywords: laboratory-process, optimization, pathology, computer simulation, workflow

Procedia PDF Downloads 283
9749 An Unsupervised Domain-Knowledge Discovery Framework for Fake News Detection

Authors: Yulan Wu

Abstract:

With the rapid development of social media, the issue of fake news has gained considerable prominence, drawing the attention of both the public and governments. The widespread dissemination of false information poses a tangible threat across multiple domains of society, including politics, the economy, and health. However, much research has concentrated on supervised models trained within specific domains, and their effectiveness diminishes when applied to identifying fake news across multiple domains. To solve this problem, some approaches based on domain labels have been proposed: by assigning news items to their specific domain in advance, judgments within the corresponding field may be more accurate. However, these approaches disregard the fact that news records can pertain to multiple domains, resulting in a significant loss of valuable information. In addition, the datasets used for training must all be domain-labeled, which creates unnecessary complexity. To solve these problems, an unsupervised domain-knowledge discovery framework for fake news detection is proposed. First, to effectively retain the multi-domain knowledge of the text, a low-dimensional domain-embedding vector is generated for each news text. Subsequently, a feature extraction module utilizing the unsupervisedly discovered domain embeddings extracts the comprehensive features of the news. Finally, a classifier is employed to determine the authenticity of the news. To verify the proposed framework, tests are conducted on existing, widely used datasets, and the experimental results demonstrate that this method improves detection performance for fake news across multiple domains. Moreover, even on datasets that lack domain labels, this method can still effectively transfer domain knowledge, which can reduce the time consumed by tagging without sacrificing detection accuracy.
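
The general shape of such a pipeline can be sketched with off-the-shelf components: an unsupervised factorization provides soft, multi-domain proportions per document, which are concatenated with the text features before classification. This is only an illustration of the pipeline shape, not the paper's model; the toy texts and labels are invented.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression

texts = ["vaccine rumor spreads online", "stocks crash amid panic",
         "election fraud claim debunked", "new drug trial shows promise"]
labels = [1, 1, 0, 0]                      # 1 = fake, 0 = real (toy data)

tfidf = TfidfVectorizer().fit_transform(texts)
# Soft domain mixture per document: an unsupervised "domain embedding"
domain_emb = NMF(n_components=2, init="nndsvda",
                 random_state=0).fit_transform(tfidf)
X = np.hstack([tfidf.toarray(), domain_emb])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X))
```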

Keywords: fake news, deep learning, natural language processing, multiple domains

Procedia PDF Downloads 88
9748 Development of Colorimetric Based Microfluidic Platform for Quantification of Fluid Contaminants

Authors: Sangeeta Palekar, Mahima Rana, Jayu Kalambe

Abstract:

In this paper, a microfluidic-based platform for the quantification of contaminants in water is proposed. The proposed system uses microfluidic channels with an embedded environment for contaminant detection in water. Microfluidics-based platforms represent a clear stage of innovation for fluid analysis, with different applications benefiting from minimal cost and simplicity of fabrication. A polydimethylsiloxane (PDMS)-based microfluidic channel is fabricated using a soft-lithography technique, and vertical and horizontal connections for fluid dispensing with the microfluidic channel are explored. The principle of colorimetry, which incorporates the use of Griess reagent for the detection of nitrite, has been adopted. Nitrite has high water solubility and retention, due to which it has a greater potential to stay in groundwater, endangering aquatic life along with human health; it is hence taken as a case study in this work. The developed platform also compares detection methodologies, using photodetectors to measure absorbance and image sensors to measure color change for the quantification of contaminants like nitrite in water. The utilization of image processing techniques offers the advantage of operational flexibility, as the same system can be used to identify other contaminants present in water by introducing minor software changes.
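
Colorimetric quantification of this kind typically rests on a calibration curve: absorbance (or a pseudo-absorbance derived from an image channel) is fitted against known nitrite standards, and an unknown sample is read back off the line. The sketch below uses invented intensity values, not the platform's measurements.

```python
import numpy as np

# Illustrative calibration: mean green-channel intensity of the Griess
# reaction spot for known nitrite standards (values are made up).
conc_mgL  = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
intensity = np.array([240., 205., 176., 130., 72.])

# Pseudo-absorbance from channel intensity (blank as I0), Beer-Lambert style
A = -np.log10(intensity / intensity[0])
slope, intercept = np.polyfit(conc_mgL, A, 1)

def quantify(sample_intensity):
    A_s = -np.log10(sample_intensity / intensity[0])
    return (A_s - intercept) / slope

print(f"{quantify(150.):.2f} mg/L nitrite")
```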

Keywords: colorimetric, fluid contaminants, nitrite detection, microfluidics

Procedia PDF Downloads 194
9747 Finite Element Method (FEM) Simulation, Design and 3D Printing of a Novel Highly Integrated PV-TEG Device with Improved Solar Energy Harvest Efficiency

Authors: Jaden Lu, Olivia Lu

Abstract:

Despite the remarkable advancement of solar cell technology, the challenge of optimizing total solar energy harvest efficiency persists, primarily due to significant heat loss. This excess heat not only diminishes solar panel output efficiency but also curtails its operational lifespan. A promising approach to addressing this issue is the conversion of surplus heat into electricity. In recent years, there has been growing interest in the use of thermoelectric generators (TEGs) as a potential solution; the integration of efficient TEG devices holds the promise of augmenting overall energy harvest efficiency while prolonging the longevity of solar panels. While certain research groups have proposed the integration of solar cells and TEG devices, a substantial gap between conceptualization and practical implementation remains, largely attributable to the low thermal energy conversion efficiency of TEG devices. To bridge this gap and meet the requisites of practical application, a feasible strategy involves incorporating a substantial number of p-n junctions within a confined unit volume. However, the manufacturing of high-density TEG p-n junctions presents a formidable challenge: the prevalent solution often leads to large device sizes to accommodate enough p-n junctions, consequently complicating integration with solar cells. Recently, the adoption of 3D printing technology has emerged as a promising way to address this challenge by fabricating high-density p-n arrays, although further developmental efforts are necessary. Presently, the primary focus is on the 3D printing of vertically layered TEG devices, wherein p-n junction density remains constrained by spatial limitations and the constraints of 3D printing techniques. This study proposes a novel device configuration featuring horizontally arrayed p-n junctions of Bi2Te3. The structural design of the device is subjected to simulation through the Finite Element Method (FEM) in COMSOL Multiphysics software, and various device configurations are simulated to identify the optimal device structure. Based on the simulation results, a new TEG device is fabricated utilizing 3D selective laser melting (SLM) printing technology; Fusion 360 facilitates the translation of the COMSOL device structure into a 3D print file. The horizontal design offers a unique advantage, enabling the fabrication of densely packed, three-dimensional p-n junction arrays. The fabrication process entails printing a single row of horizontal p-n junctions in one layer using the 3D SLM printing technique; successive rows of p-n junction arrays are then printed within the same layer, interconnected by thermally conductive copper. This sequence is replicated across multiple layers, separated by thermally insulating glass. The result is a highly compact three-dimensional TEG device with high-density p-n junctions. The fabricated TEG device is then attached to the bottom of the solar cell using thermal glue. The whole device is characterized, with output data closely matching the COMSOL simulation results. Future research endeavors will encompass the refinement of thermoelectric materials, including the advancement of high-resolution 3D printing techniques tailored to diverse thermoelectric materials, along with the optimization of material microstructures such as porosity and doping. The objective is to achieve an optimal and highly integrated PV-TEG device that can substantially increase solar energy harvest efficiency.

Keywords: thermoelectric, finite element method, 3d print, energy conversion

Procedia PDF Downloads 65
9746 Structural Analysis of Polymer Thin Films at Single Macromolecule Level

Authors: Hiroyuki Aoki, Toru Asada, Tomomi Tanii

Abstract:

The properties of a spin-cast film of a polymer material are different from those of the bulk material because the polymer chains are frozen in an un-equilibrated state due to the rapid evaporation of the solvent. However, there has been little information on the un-equilibrated conformation and dynamics in a spin-cast film at the single-chain level. The real-space observation of individual chains would provide direct information for discussing the morphology and dynamics of single polymer chains. The recent development of super-resolution fluorescence microscopy methods allows the conformational analysis of single polymer chains. In the current study, the conformation of a polymer chain in a spin-cast film was examined by super-resolution microscopy. Poly(methyl methacrylate) (PMMA) with a molecular weight of 2.2 x 10^6 was spin-cast onto a glass substrate from toluene and chloroform. For the super-resolution fluorescence imaging, a small amount of PMMA labeled with a rhodamine spiroamide dye was added. The radius of gyration (Rg) was evaluated from the super-resolution fluorescence image of each PMMA chain. The root-mean-square Rg was 48.7 and 54.0 nm in the spin-cast films prepared from the toluene and chloroform solutions, respectively. On the other hand, the chain dimension in the bulk state (a thermally annealed 10-μm-thick sample) was observed to be 43.1 nm. This indicates that the PMMA chain in a spin-cast film takes an expanded conformation compared to the unperturbed chain and that the chain dimension depends on the solvent quality. In a good solvent, the PMMA chain has an expanded conformation due to the excluded volume effect; the rapid solvent evaporation freezes the chain before it can relax from the un-equilibrated, expanded conformation to an unperturbed one.
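
Given the set of localization coordinates recovered for one chain, Rg follows directly from its definition, Rg^2 = <|r - r_cm|^2>. A small sketch on synthetic localizations (the spread is chosen to mimic the ~50 nm scale reported; no real imaging data is used):

```python
import numpy as np

def radius_of_gyration(xy):
    """Rg from single-molecule localization coordinates (nm):
    Rg^2 = mean squared distance from the center of mass."""
    r = xy - xy.mean(axis=0)
    return np.sqrt((r ** 2).sum(axis=1).mean())

# Toy chain image: 500 localizations with ~50 nm overall spread
rng = np.random.default_rng(0)
xy = rng.normal(scale=50 / np.sqrt(2), size=(500, 2))
print(f"Rg = {radius_of_gyration(xy):.1f} nm")
```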

Keywords: chain conformation, polymer thin film, spin-coating, super-resolution optical microscopy

Procedia PDF Downloads 281
9745 Continuous Blood Pressure Measurement from Pulse Transit Time Techniques

Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen

Abstract:

Blood pressure (BP) is one of the vital signs and an index that helps determine the stability of life. In this respect, some spinal cord injury patients need to take the tilt table test; during the test, posture changes abruptly and may cause a patient’s BP to change abnormally. This may cause patients to feel discomfort, and even feel as though their life is threatened. Therefore, if a continuous non-invasive BP assessment system were built, it could help to alert health care professionals during rehabilitation when the BP value moves out of range. In our research, BP assessment by the pulse transit time (PTT) technique was developed. In the system, we use a self-made photoplethysmograph (PPG) sensor and filter circuit to detect two PPG signals and calculate the time difference, and the BP can immediately be assessed from the trend line. According to the results of this study, the relationship between systolic BP and PTT shows a strongly negative linear correlation (R² = 0.8). Further, we used the trend line to assess the BP value and compared it to a commercial sphygmomanometer (Omron MX3); the error rate of the system was found to be in the range of ±10%, which is within the permissible error range of a commercial sphygmomanometer. Continuous blood pressure measurement by the pulse transit time technique may have the potential to become a convenient method for clinical rehabilitation.
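
The trend-line step is a plain linear regression of systolic BP against PTT. A sketch with invented, negatively correlated values (not the study's measurements):

```python
import numpy as np

# Illustrative PTT (ms) vs. systolic BP (mmHg) pairs; negatively
# correlated, as the study reports (values are made up).
ptt = np.array([180., 190., 200., 210., 220., 230.])
sbp = np.array([135., 128., 122., 115., 110., 103.])

slope, intercept = np.polyfit(ptt, sbp, 1)     # trend line: SBP = a*PTT + b
r2 = np.corrcoef(ptt, sbp)[0, 1] ** 2

est = slope * 205. + intercept                 # estimate SBP from a new PTT
err = 100 * abs(est - 118.) / 118.             # vs. a cuff reading of, say, 118
print(f"R^2 = {r2:.2f}, SBP = {est:.0f} mmHg, error = {err:.1f}%")
```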

Keywords: continuous blood pressure measurement, PPG, pulse transit time, transit velocity

Procedia PDF Downloads 349
9744 Data Structure Learning Platform to Aid in Higher Education IT Courses (DSLEP)

Authors: Estevan B. Costa, Armando M. Toda, Marcell A. A. Mesquita, Jacques D. Brancher

Abstract:

The advances in technology over the last five years have allowed improvements in the educational area, such as the increased development of educational software. One of the techniques that emerged in this period is gamification, the utilization of video game mechanics outside their original bounds. Recent studies involving this technique have provided positive results when applying these concepts in areas such as marketing, health and education. In the last area, there are studies covering everything from elementary to higher education, with many variations to suit educators' methodologies. Within higher education, focusing on IT courses, data structures are an important subject taught in many of these courses, as they are the basis for many systems. Based on the above, this paper presents the development of an interactive web learning environment, called DSLEP (Data Structure Learning Platform), to aid students in higher education IT courses. The system includes the basic concepts of the subject, such as stacks, queues, lists, arrays and trees, and was implemented to ease the insertion of new structures. It was also implemented with gamification concepts, such as points, levels, and leaderboards, to engage students in the search for knowledge and stimulate self-learning.
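
As an illustration of how such a platform can couple a data-structure exercise to game mechanics, here is a toy sketch; the class names, exercise, and point values are invented, and DSLEP's real implementation is a web application.

```python
# Toy DSLEP-style logic: the student submits the expected state of a
# structure, the platform checks it and awards points toward a level.
class Player:
    def __init__(self, name):
        self.name, self.points, self.level = name, 0, 1

    def award(self, pts):
        self.points += pts
        self.level = self.points // 100 + 1   # 100 points per level

def check_stack_exercise(student_answer):
    """Exercise: push 1, 2, 3 onto a stack, pop once -> expected state."""
    stack = []
    for v in (1, 2, 3):
        stack.append(v)     # push
    stack.pop()             # pop
    return student_answer == stack

p = Player("ana")
if check_stack_exercise([1, 2]):
    p.award(50)
print(p.name, p.points, p.level)
```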

Keywords: gamification, interactive learning environment, data structures, e-learning

Procedia PDF Downloads 488
9743 Isolation and Identification of Diacylglycerol Acyltransferase Type-2 (DGAT2) Genes from Three Egyptian Olive Cultivars

Authors: Yahia I. Mohamed, Ahmed I. Marzouk, Mohamed A. Yacout

Abstract:

The aim of this work was to study the genetic basis for oil accumulation in olive fruit by tracking the DGAT2 (diacylglycerol acyltransferase type-2) gene in three Egyptian olive cultivars, namely Toffahi, Hamed and Maraki, using molecular marker techniques and bioinformatics tools. The results illustrate that, first, a specific genomic band of the Maraki cultivar was identified as DGAT2 and is identical to this gene in Olea europaea, with 100% similarity. Second, a differential genomic band of the Maraki cultivar produced by the RAPD fingerprinting technique reflected a predicted, distinguished sequence identified as DGAT2 in Fragaria vesca subsp. vesca, with 76% sequence similarity. Third and finally, a specific genomic band of the Hamed cultivar was identified as matching two fragments: (1) Olea europaea cultivar Koroneiki diacylglycerol acyltransferase type 2 mRNA, complete cds, with two matching regions at 99% similarity, or (2) PREDICTED: Fragaria vesca subsp. vesca diacylglycerol O-acyltransferase 2-like (LOC101313050), mRNA, with 86% similarity.

Keywords: Olea europaea, fingerprinting, diacylglycerol acyltransferase type-2 (DGAT2), Egypt

Procedia PDF Downloads 496
9742 Gariep Dam Basin Management for Satisfying Ecological Flow Requirements

Authors: Dimeji Abe, Nonso Okoye, Gideon Ikpimi, Prince Idemudia

Abstract:

Multi-reservoir operation optimization has been a critical issue in river basin management. Water, as a scarce resource, is in high demand, and the problems associated with the reservoir as its storage facility are enormous. The complexity of balancing the supply and demand of this prime resource has created the need to examine the best way to solve the problem using optimization techniques. The objective of this study is to evaluate the performance of a multi-objective metaheuristic algorithm for the operation of Gariep Dam while satisfying ecological flow requirements. This study uses an evolutionary algorithm called the backtracking search algorithm (BSA) to determine how best to optimise the dam's operations for hydropower production, flood control, and water supply without compromising the environmental flow requirement needed for the survival of aquatic life downstream of the dam. To achieve this objective, the operations of the dam that correspond to different trade-offs between the objectives are optimized. The results indicate the best model from the algorithm satisfies all the objectives without any constraint violation. It is expected that hydropower generation will be improved and more water will be available for ecological flow requirements with the use of the algorithm, which also provides farmers with more irrigation water to improve their business.
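
For readers unfamiliar with BSA, the sketch below is a simplified, single-objective version of its core loop (a historical population, scaled mutation toward it, a crossover map, and greedy selection) run on a stand-in objective. The study itself applies a multi-objective variant under reservoir constraints that are not modelled here.

```python
import numpy as np

rng = np.random.default_rng(0)

def bsa(f, dim=8, pop=30, iters=300, lo=-10.0, hi=10.0, mixrate=1.0):
    """Simplified single-objective backtracking search algorithm sketch."""
    P = rng.uniform(lo, hi, (pop, dim))
    oldP = rng.uniform(lo, hi, (pop, dim))      # historical population
    fit = np.array([f(x) for x in P])
    for _ in range(iters):
        if rng.random() < 0.5:                  # Selection-I: refresh memory
            oldP = P.copy()
        oldP = oldP[rng.permutation(pop)]       # shuffle history
        F = 3.0 * rng.standard_normal()         # scale factor
        M = P + F * (oldP - P)                  # mutation toward history
        # crossover map: a random subset of dimensions takes mutant values
        mask = rng.random((pop, dim)) < (mixrate * rng.random())
        T = np.clip(np.where(mask, M, P), lo, hi)
        tfit = np.array([f(x) for x in T])
        better = tfit < fit                     # greedy Selection-II
        P[better], fit[better] = T[better], tfit[better]
    b = np.argmin(fit)
    return P[b], fit[b]

best_x, best_f = bsa(lambda x: np.sum(x ** 2))
print(best_f)
```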

Keywords: BSA evolutionary algorithm, metaheuristics, optimization, river basin management

Procedia PDF Downloads 241
9741 Nano-Particle of π-Conjugated Polymer for Near-Infrared Bio-Imaging

Authors: Hiroyuki Aoki

Abstract:

Molecular imaging, which visualizes biological molecules, cells, tissue, and so on, has attracted much attention recently. Among the various in vivo imaging techniques, fluorescence imaging has been widely employed as a useful modality for small animals in pre-clinical research. However, higher signal intensity is needed for highly sensitive in vivo imaging. The objective of the current study is the development of a fluorescence imaging agent with high brightness for tumor imaging in a mouse. The strategy for enhancing the fluorescence signal of a bio-imaging agent is to increase the absorption of the excitation light and the fluorescence conversion efficiency. We developed a nano-particle fluorescence imaging agent consisting of a π-conjugated polymer emitting a fluorescence signal in the near-infrared region. A large absorption coefficient and high emission intensity in the near-infrared optical window for biological tissue enabled highly sensitive in vivo imaging with tumor-targeting ability via the EPR (enhanced permeability and retention) effect. The signal intensity from the π-conjugated fluorescence imaging agent is larger by two orders of magnitude compared to a quantum dot, which has been known as the brightest imaging agent. The π-conjugated polymer nano-particle would be a promising candidate for the in vivo imaging of small animals.

Keywords: fluorescence, conjugated polymer, in vivo imaging, nano-particle, near-infrared

Procedia PDF Downloads 473
9740 Poland and the Dawn of the Right to Education and Development: Moving Back in Time

Authors: Magdalena Zabrocka

Abstract:

The terror experienced by women throughout the governance of Poland's current populist ruling party, PiS, has been the subject of heated debate, alongside the issues of minorities' rights, the rule of law, and democracy in the country. The challenges that women and other vulnerable groups currently face, however, come down to more than just a lack of comprehensive equality laws, severely limited reproductive rights, hateful slogans and messages propagated by the central authority and its sympathisers, or a common disregard for women's fundamental rights. Many sources and media reports are available only in Polish, while international rapporteurs fail to acknowledge the whole picture of the tragedy happening in the country and the variety of factors affecting it. It starts with the authorities' and the Polish Catholic Church's propaganda concerning CEDAW and the Istanbul Convention on action against violence against women and domestic violence, spreading strategic disinformation that they codify 'gender ideology' and 'anti-Christian values' in order to convince the electorate that the legal instruments should be 'abandoned'. Alongside severely restricting abortion rights, bullying medical professionals who help women exercise their reproductive rights, violating women's privacy by introducing a mandatory registry of pregnancies (so that one's pregnancy or its 'loss' can be tracked and traced), restricting access to the 'day after' pill and real sex education at schools (most schools teach a subject called 'knowledge of living in a family'), introducing prison sentences for teachers accused of spreading 'sex education', and many other measures, the current tyrannical government has now decided to target the youngest with its misinformation and indoctrination, via strategically designed textbooks and curricula. Biology books have seen a significant reduction in the chapters devoted to evolution, the reproductive system, and sexual health. Approved religion books (taught two to three times a week, compared with once a week for the sciences) now cover false information about Darwin's theory and arguments 'against it'. Most recently, however, the public has spoken up against the absurd messages contained in the politically rewritten history books, where the material about figures not liked by the governing party has already been manipulated. In the recently approved changes to the history textbook, one can find a variety of strongly biased and politically charged views representative of conservatives in the United States, most notably the equating of 'gender ideology' and feminism with Nazism. Thus, this work, employing a human rights approach, focuses on the right to education and development, as well as the considerable obstacles to young people's access to scientific information.

Keywords: Poland, right to education, right to development, authoritarianism, access to information

Procedia PDF Downloads 96
9739 Comparison of Sensitivity and Specificity of Pap Smear and Polymerase Chain Reaction Methods for Detection of Human Papillomavirus: A Review of Literature

Authors: M. Malekian, M. E. Heydari, M. Irani Estyar

Abstract:

Human papillomavirus (HPV) is one of the most common sexually transmitted infections and may lead to cervical cancer, of which it is the main cause. With early diagnosis and treatment in health care services, cervical cancer and its complications are considered preventable. This study aimed to compare the efficiency, sensitivity, and specificity of the Pap smear and the polymerase chain reaction (PCR) in detecting HPV. A literature search was performed in the Google Scholar, PubMed and SID databases using the keywords 'human papillomavirus', 'pap smear' and 'polymerase chain reaction' to identify studies comparing the Pap smear and PCR methods for detection. No restrictions were applied. Ten studies were included in this review. All samples that were positive by Pap smear were also positive by PCR. However, there were positive samples detected by PCR that were negative by Pap smear, and in all studies many positive samples were missed by the Pap smear technique. Although the Pap smear had high specificity, PCR-based HPV detection was the more sensitive method, with the highest sensitivity. To improve the quality of detection and achieve the best possible results, PCR diagnostic methods are needed in addition to the Pap smear, and the Pap smear method should be combined with PCR techniques, given the high error rate of the Pap smear in detection.
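
Sensitivity and specificity are computed from the 2x2 table of test results against the reference method. A worked toy example, with PCR taken as the reference standard and counts invented to mirror the pattern the review reports (high Pap smear specificity, lower sensitivity):

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 table: the Pap smear misses infections (false
# negatives) but rarely flags PCR-negative samples.
tp, fn, fp, tn = 40, 25, 2, 133
se, sp = sens_spec(tp, fn, fp, tn)
print(f"Pap smear vs PCR: sensitivity {se:.0%}, specificity {sp:.0%}")
```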

Keywords: human papillomavirus, cervical cancer, pap smear, polymerase chain reaction

Procedia PDF Downloads 127
9738 The Changes in Motivations and the Use of Translation Strategies in Crowdsourced Translation: A Case Study on Global Voices’ Chinese Translation Project

Authors: Ya-Mei Chen

Abstract:

Online crowdsourced translation, an innovative translation practice enabled by Web 2.0 technologies and the democratization of information, has become increasingly popular in the Internet era. Carried out by grass-roots internet users, crowdsourced translation has fundamentally different features from its offline, traditional counterpart, such as voluntary participation and parallel collaboration. To better understand such a participatory and collaborative nature, this paper uses the online Chinese translation project of Global Voices as a case study to investigate the following issues: (1) the changes in volunteer translators’ and reviewers’ motivations for participation, (2) translators’ and reviewers’ use of translation strategies and (3) the correlations of translators’ and reviewers’ motivations and strategies with the organizational mission, the translation style guide, the translator-reviewer interaction, the mediation of the translation platform and the various types of capital within the translation field. With an aim to systematically explore these three issues, this paper collects both quantitative and qualitative data and then draws upon Engeström’s activity theory and Bourdieu’s field theory as a theoretical framework to analyze the data in question. An online anonymous questionnaire will be conducted to obtain the quantitative data. The questionnaire will contain questions related to volunteer translators’ and reviewers’ backgrounds, participation motivations, translation strategies and mutual relations, as well as the operation of the translation platform. Concerning the qualitative data, they will come from (1) a comparative study between some English news texts published on Global Voices and their Chinese translations, (2) an analysis of the online discussion forum associated with Global Voices’ Chinese translation project and (3) the information about the project’s translation mission and guidelines. It is hoped that this research, through a detailed sociological analysis of a cause-driven crowdsourced translation project, can enable translation researchers and practitioners to adequately meet the translation challenges appearing in the digital age.

Keywords: crowdsourced translation, global voices, motivation, translation strategies

Procedia PDF Downloads 367
9737 Medical Image Augmentation Using Spatial Transformations for Convolutional Neural Network

Authors: Trupti Chavan, Ramachandra Guda, Kameshwar Rao

Abstract:

The lack of data is a painful problem in medical image analysis using convolutional neural networks (CNNs). This work uses various spatial transformation techniques to address the medical image augmentation issue for knee detection and localization using an enhanced single shot detector (SSD) network. Spatial transforms such as the negative, histogram equalization, power law, sharpening, averaging, and Gaussian blurring help to generate more samples, serve as pre-processing methods, and highlight the features of interest. The experimentation is done on the OpenKnee dataset, which is a collection of knee images from openly available online sources. The CNN, an enhanced single shot detector (SSD), is utilized for the detection and localization of the knee joint from a given X-ray image. It is an enhanced version of the well-known SSD network, modified in such a way that it reduces the number of prediction boxes at the output side, and it consists of a classification network (VGGNet) and an auxiliary detection network. The performance is measured in mean average precision (mAP), and 99.96% mAP is achieved using the proposed enhanced SSD with spatial transformations. It is also seen that the localization boundary is comparatively more refined and closer to the ground truth with spatial augmentation, giving better detection and localization of knee joints.
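
The listed transforms are one-liners in common imaging libraries. A sketch using OpenCV on a stand-in grayscale image; the kernel sizes and the gamma value are illustrative, not the paper's settings.

```python
import cv2
import numpy as np

def augment(gray):
    """Intensity/spatial transforms of the kind listed in the abstract;
    parameter values here are illustrative."""
    out = {
        "negative":  255 - gray,
        "hist_eq":   cv2.equalizeHist(gray),
        "power_law": (255 * (gray / 255.0) ** 0.6).astype(np.uint8),
        "average":   cv2.blur(gray, (5, 5)),
        "gaussian":  cv2.GaussianBlur(gray, (5, 5), 0),
    }
    sharpen_kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]],
                              dtype=np.float32)
    out["sharpen"] = cv2.filter2D(gray, -1, sharpen_kernel)
    return out

xray = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # stand-in image
samples = augment(xray)          # six extra training samples per input
print(list(samples))
```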

Keywords: data augmentation, enhanced SSD, knee detection and localization, medical image analysis, OpenKnee, spatial transformations

Procedia PDF Downloads 148
9736 Estimating View-Through Ad Attribution from User Surveys Using Convex Optimization

Authors: Yuhan Lin, Rohan Kekatpure, Cassidy Yeung

Abstract:

In digital marketing, robust quantification of view-through attribution (VTA) is necessary for evaluating channel effectiveness. VTA occurs when a product purchase is aided by an ad but without an explicit click (e.g. a TV ad). The lack of a tracking mechanism makes VTA estimation challenging. The most prevalent VTA estimation techniques rely on post-purchase, in-product user surveys. User surveys enable the calculation of channel multipliers, which are the ratio of the view-attributed to the click-attributed purchases of each marketing channel. Channel multipliers thus provide a way to estimate the unknown VTA for a channel from its known click attribution. In this work, we use convex optimization to compute channel multipliers in a way that enables a mathematical encoding of the expected channel behavior. Large fluctuations in channel attributions often result from overfitting the calculations to user surveys; casting channel attribution as a convex optimization problem allows the introduction of constraints that limit such fluctuations. The result of our study is a distribution of channel multipliers across the entire marketing funnel, with important implications for marketing spend optimization. Our technique can be broadly applied to estimate ad effectiveness in a privacy-centric world that increasingly limits user tracking.
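
A toy version of such a program is sketched below in CVXPY: the multipliers are fitted to survey-implied values while a smoothness penalty and an upper bound encode the expected channel behavior. All data, weights, and bounds are invented for illustration and do not reproduce the authors' formulation.

```python
import cvxpy as cp
import numpy as np

# Survey-implied view/click multipliers for 5 funnel channels (toy data)
survey_mult = np.array([1.8, 2.6, 1.2, 3.1, 0.9])
clicks_attr = np.array([100., 40., 250., 15., 300.])   # click-attributed sales

m = cp.Variable(5, nonneg=True)
fit = cp.sum_squares(m - survey_mult)          # stay close to the surveys
smooth = cp.sum_squares(cp.diff(m))            # limit funnel-wise jumps
objective = cp.Minimize(fit + 2.0 * smooth)
constraints = [m <= 4.0]                       # encode expected behavior
cp.Problem(objective, constraints).solve()

vta = m.value * clicks_attr                    # view-through estimates
print(np.round(m.value, 2), np.round(vta, 1))
```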

Keywords: digital marketing, survey analysis, operational research, convex optimization, channel attribution

Procedia PDF Downloads 188
9735 Novel Marketing Strategy To Increase Sales Revenue For SMEs Through Social Media

Authors: Kruti Dave

Abstract:

Social media marketing is an essential component of 21st-century business. Social media platforms enable small and medium-sized businesses to enhance brand recognition and generate leads and sales. However, research on social media marketing is still fragmented and focused on specific topics, such as effective communication techniques. Given the various ways in which social media impacts individuals and companies alike, the authors of this article focus on the origin, impact, and current state of social media, emphasizing its significance as a customer empowerment agent. The article illustrates its potential and current responsibilities as part of corporate business strategy and also suggests several methods of engaging it as a marketing tool. The focus of social media marketing ranges from defenders to explorers, its culture encompasses the poles of conservatism and modernity, its frameworks lie between hierarchies and networks, and its management runs from autocracy to anarchy. This research proposes an integrative framework for small and medium-sized businesses through social media, and the influence of that framework will be measured. This strategy will help industry experts to understand this new era. We propose an axiom: social media is always a function of marketing as a revenue generator.

Keywords: social media, marketing strategy, media marketing, brand awareness, customer engagement, revenue generator, brand recognition

Procedia PDF Downloads 184
9734 Horizontal Cooperative Game Theory in Hotel Revenue Management

Authors: Ririh Rahma Ratinghayu, Jayu Pramudya, Nur Aini Masruroh, Shi-Woei Lin

Abstract:

This research studies pricing strategy in a cooperative setting of a hotel duopoly selling a perishable product under a fixed capacity constraint, from the perspective of managers. In hotel revenue management, the competitor’s average room rate and occupancy rate should be taken into the manager’s consideration when determining a pricing strategy to generate optimum revenue. This information is not provided by business intelligence or available on the competitor’s website; thus, information sharing (IS) among players might result in improved performance of the pricing strategy. IS is widely adopted in the logistics industry, but IS within the hospitality industry has not been well studied. This research treats IS as one of the cooperative game schemes, besides the mutual price setting (MPS) scheme. In the off-peak season, hotel managers arrange pricing strategies offering promotion packages and various kinds of discounts of up to 60% of the full price to attract customers. A competitor selling a homogeneous product will react in the same way, triggering a price war. A price war, which generates lower revenue, may be avoided by creating collaboration in pricing strategy to optimize the payoff for both players. In the MPS cooperative game, players collaborate to set a room rate that applies to both players; a cooperative game may thus avoid the unfavorable payoffs caused by a price war. Research on horizontal cooperative games in logistics shows better performance and payoffs for the players; however, the horizontal cooperative game in hotel revenue management has not been demonstrated. This paper aims to develop hotel revenue management models under duopoly cooperative schemes (IS and MPS), which are compared to models under a non-cooperative scheme. Each scheme has five models: a Capacity Allocation Model, a Demand Model, a Revenue Model, an Optimal Price Model, and an Equilibrium Price Model. The Capacity Allocation and Demand Models employ the hotel’s own and the competitor’s full and discounted prices as predictors under a non-linear relation. The optimal price is obtained by assuming a revenue-maximization motive, and the equilibrium price is observed by interacting the hotel’s and the competitor’s optimal prices through reaction equations. The equilibrium is analyzed using a game theory approach, and the same sequence applies to all three schemes; the MPS scheme differs in aiming to optimize the total players’ payoff. The case study to which the theoretical models are applied observes two hotels offering a homogeneous product in Indonesia over one year. The Capacity Allocation, Demand, and Revenue Models are built using multiple regression and statistically tested for validation. The case study data confirm that price behaves non-linearly within the demand model, and that the IS models represent the actual demand and revenue data better than the non-IS models. Furthermore, IS enables hotels to earn significantly higher revenue; thus, duopoly hotel players in general might have reasonable incentives to share information horizontally. During the off-peak season, the MPS models are able to predict the optimal equal price for both hotels. However, a Nash equilibrium may not always exist, depending on the actual payoff of adhering to or betraying the mutual agreement. To optimize performance, a horizontal cooperative game may be chosen over a non-cooperative game. The mathematical models can be used to detect collusion among business players, and empirical testing can serve as policy input for market regulators in preventing unethical business practices that potentially harm societal welfare.
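
The equilibrium-price step can be pictured as a fixed point of the two reaction equations: starting from any prices, each hotel repeatedly best-responds to the other until the rates stop moving. A stylized sketch with invented linear reaction functions (the case study's fitted, non-linear models are not reproduced):

```python
import numpy as np

# Toy reaction function: each hotel's revenue-maximizing price rises
# with the competitor's price (coefficients are illustrative only).
def best_response(p_rival, a=40.0, b=0.45):
    return a + b * p_rival

p1 = p2 = 60.0                       # initial room rates (arbitrary units)
for _ in range(100):                 # iterate reactions to a fixed point
    p1, p2 = best_response(p2), best_response(p1)

# Analytic check: the symmetric Nash equilibrium solves p = a + b*p
print(p1, p2, 40.0 / (1 - 0.45))     # all ~72.7
```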

Keywords: horizontal cooperative game theory, hotel revenue management, information sharing, mutual price setting

Procedia PDF Downloads 286
9733 Competitor Integration with Voice of Customer Ratings in QFD Studies Using Geometric Mean Based on AHP

Authors: Zafar Iqbal, Nigel P. Grigg, K. Govindaraju, Nicola M. Campbell-Allen

Abstract:

Quality Function Deployment (QFD) is a structured approach that has been used to improve the quality of products and processes in a wide range of fields. Using this systematic tool, practitioners normally rank Voice of Customer ratings (VoCs) in order to produce Improvement Ratios (IRs), which become the basis for prioritising process/product design or improvement activities. In one matrix of the House of Quality (HOQ), competitors are rated. The method of obtaining Improvement Ratios (IRs) does not always integrate the competitors’ ratings in a systematic way that fully utilises the competitor rating information. This can have the effect of diverting QFD practitioners’ attention from a potentially important VoC to a less important one. In order to enhance QFD analysis, we present a more systematic method for integrating competitor ratings, utilising the geometric mean of the customer rating matrix. In this paper, we develop a new approach, based on the Analytic Hierarchy Process (AHP), in which we generate a matrix of multiple comparisons of all competitors and derive a geometric mean for each competitor. For each VoC, an improved IR is derived which, we argue herein, enhances the initial VoC importance ratings by integrating more information about competitor performance. In this way, our method can help overcome one of the possible shortcomings of QFD. We then use a published QFD example from the literature as a case study to demonstrate the use of the new AHP-based IRs, and show how these can be used to re-rank existing VoCs to, arguably, better achieve the goal of customer satisfaction in relation to VoC ratings and competitors’ rankings. We demonstrate how the two-dimensional, AHP-based geometric mean derived from the multiple-competitor comparison matrix can be useful for analysing competitors’ rankings. Our method utilises an established methodology (AHP) applied within an established application (QFD), but in an original way (through the competitor analysis matrix), to achieve a novel improvement.
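
The geometric-mean step itself is compact: take the row-wise geometric mean of the pairwise comparison matrix and normalize it into a priority vector. A sketch on a toy Saaty-scale matrix; how the resulting weights adjust each IR is the paper's contribution and is not reproduced here.

```python
import numpy as np

# Toy pairwise comparison matrix for three competitors (Saaty scale);
# entry [i, j] says how strongly competitor i outperforms competitor j.
C = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

gm = C.prod(axis=1) ** (1.0 / C.shape[1])   # row geometric means
weights = gm / gm.sum()                     # AHP priority vector
print(np.round(weights, 3))                 # e.g. [0.648 0.230 0.122]
```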

Keywords: quality function deployment, geometric mean, improvement ratio, AHP, competitors ratings

Procedia PDF Downloads 363
9732 Long-Term Climate Patterns in Eastern and Southeastern Ethiopia

Authors: Messay Mulugeta, Degefa Tolossa

Abstract:

The purpose of this paper is to scrutinize trends in climate risks in the eastern and southeastern parts of Ethiopia. This part of the country appears severely affected by recurrent droughts, erratic rainfall, and increasing temperatures. In particular, erratic rains and moisture stress have threatened the people over many decades, coupled with unproductive policy frameworks and weak institutional setups. These menaces have been more severe in the dry lowlands, where rainfall is more erratic and scarce. Long-term climate data from nine weather stations in eastern and southeastern Ethiopia were obtained from the National Meteorological Agency of Ethiopia (NMA). As issues related to climate risks are very intricate, different techniques and indices were applied to address the objectives of the study. It is concluded that erratic rainfall, moisture scarcity, and rising temperatures have been the main challenges in eastern and southeastern Ethiopia. These risks can, in fact, be eased by putting in place efficient and integrated rural development strategies, environmental rehabilitation plans of action in overexploited areas, proper irrigation and water-harvesting practices, and well-thought-out and genuine resettlement schemes.
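
One of the indices named in the keywords, the Precipitation Concentration Index (PCI), is simple to compute from twelve monthly totals: values near the minimum of about 8.3 indicate a uniform monthly distribution, while higher values flag increasingly concentrated (erratic) rainfall. A sketch with an invented station year, not NMA data:

```python
import numpy as np

def pci(monthly_mm):
    """Precipitation Concentration Index (Oliver, 1980):
    PCI = 100 * sum(p_i^2) / (sum(p_i))^2 over 12 monthly totals."""
    p = np.asarray(monthly_mm, dtype=float)
    return 100.0 * np.sum(p ** 2) / np.sum(p) ** 2

# Illustrative monthly rainfall (mm) for one station year
rains = [5, 8, 30, 90, 60, 20, 10, 15, 55, 85, 25, 6]
print(f"PCI = {pci(rains):.1f}")   # ~14.5: seasonally concentrated
```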

Keywords: rainfall variability, erratic rains, precipitation concentration index (PCI), climatic pattern, Ethiopia

Procedia PDF Downloads 231
9731 A Sensitive Uric Acid Electrochemical Sensing in Biofluids Based on Ni/Zn Hydroxide Nanocatalyst

Authors: Nathalia Florencia Barros Azeredo, Josué Martins Gonçalves, Pamela De Oliveira Rossini, Koiti Araki, Lucio Angnes

Abstract:

This work demonstrates the electroanalysis of uric acid (UA) at a very low working potential (0 V vs. Ag/AgCl) directly in body fluids such as saliva and sweat, using electrodes modified with mixed Ni0.75Zn0.25(OH)2 nanoparticles exhibiting stable electrocatalytic responses from alkaline down to weakly acidic media (pH 14 to 3). These materials were prepared for the first time and fully characterized by TEM, XRD, and spectroscopic techniques. The electrochemical properties of the modified electrodes were evaluated in a fast and simple procedure for uric acid analysis based on cyclic voltammetry and chronoamperometry, pushing down the detection and quantification limits (2.3 x 10^-8 and 7.6 x 10^-8 mol L^-1, respectively) with good repeatability (RSD = 3.2% for 30 successive analyses at pH 14). Finally, the possibility of real application was demonstrated by the realization of unexpectedly robust and sensitive modified FTO (fluorine-doped tin oxide) glass and screen-printed sensors for the measurement of uric acid directly in real saliva and sweat samples, with no significant interference from the usual concentrations of ascorbic acid, acetaminophen, lactate and glucose present in those body fluids.
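
Detection and quantification limits of this kind are conventionally derived from a calibration line as 3.3·σ/slope and 10·σ/slope, where σ is the standard deviation of the blank. A sketch with invented calibration data of a similar order of magnitude (not the paper's measurements):

```python
import numpy as np

# Illustrative chronoamperometric calibration: current vs. uric acid
# concentration (toy numbers)
conc = np.array([0.0, 2e-7, 4e-7, 8e-7, 1.6e-6])        # mol/L
curr = np.array([0.02, 0.25, 0.49, 0.98, 1.95])         # uA

slope, intercept = np.polyfit(conc, curr, 1)
sigma_blank = 0.008                                      # uA, blank std dev

lod = 3.3 * sigma_blank / slope    # limit of detection
loq = 10.0 * sigma_blank / slope   # limit of quantification
print(f"LOD = {lod:.1e} mol/L, LOQ = {loq:.1e} mol/L")
```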

Keywords: nickel hydroxide, mixed catalyst, uric acid sensors, biofluids

Procedia PDF Downloads 126
9730 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques

Authors: Ved Kulkarni, Karthik Kini

Abstract:

This experiment aims to understand how the media affect the power markets in the mainland United States and to study the reaction time between news updates and actual price movements. We take into account electric utility companies trading on the NYSE and exclude companies that are more politically involved and move with higher sensitivity to politics. The scraper checks for any news related to predefined keywords stored for each specific company. Based on this, the classifier allocates the effect into five categories: positive, negative, highly optimistic, highly negative, or neutral. The effect on the respective price movement is then studied to understand the response time. Based on the response time observed, neural networks would be trained to understand and react to changing market conditions, achieving the best strategy in every market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market response time to each price movement.
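
As a stand-in for the classification step, the sketch below scores headlines against keyword lists and maps the score to the five categories. The real system would presumably use a trained model; the keywords and thresholds here are invented.

```python
# Toy five-way sentiment bucketing for utility-sector headlines
POS = {"beat", "growth", "upgrade", "record", "approval"}
NEG = {"outage", "lawsuit", "downgrade", "fine", "recall"}

def classify(headline):
    words = {w.strip(",.!?") for w in headline.lower().split()}
    score = len(words & POS) - len(words & NEG)
    if score >= 2:
        return "highly optimistic"
    if score == 1:
        return "positive"
    if score == 0:
        return "neutral"
    if score == -1:
        return "negative"
    return "highly negative"

print(classify("utility wins rate approval, record quarterly growth"))
```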

Keywords: data mining, language processing, artificial neural networks, sentiment analysis

Procedia PDF Downloads 7
9729 Surgical Planning for the Removal of Cranial Spheno-orbital Meningioma by Using Personalized Polymeric Prototypes Obtained with Additive Manufacturing Techniques

Authors: Freddy Patricio Moncayo-Matute, Pablo Gerardo Peña-Tapia, Vázquez-Silva Efrén, Paúl Bolívar Torres-Jara, Diana Patricia Moya-Loaiza, Gabriela Abad-Farfán

Abstract:

This study describes a clinical case and the results of applying additive manufacturing to surgical planning for the removal of a cranial spheno-orbital meningioma. It is verified that the use of personalized anatomical models and cutting guides helps to manage the approach to cranial anomalies. The application of the additive manufacturing technology Fused Deposition Modeling (FDM), as a low-cost alternative, enables the printing of a test anatomical model, which in turn favors the reduction of surgery time as well as the morbidity rate. The printing of the personalized cutting guide likewise provides a valuable aid to the surgeon, improving the precision of the intervention and reducing its invasive effect during the craniotomy. As part of the results, post-surgical follow-up is included as an instrument to verify the patient's recovery and the validity of the procedure.

Keywords: surgical planning, additive manufacturing, rapid prototyping, fused deposition modeling, custom anatomical model

Procedia PDF Downloads 89
9728 Opportunity Development and Entrepreneurial Process

Authors: Abosede Mosunmola Odeseye

Abstract:

The sustainability of nations' economies has today proven unrealistic in a constantly changing world without appropriate regard for the role of entrepreneurship and its processes. This role has proven to be a product of the opportunities available to and discoverable by an individual/organisation in any pattern – innovation, discovery, diffusion, or imitation – amidst possible challenges. In light of this, this paper examined the relationship between opportunity development and entrepreneurial processes, as well as the factors determining an individual's opportunity development and the success of entrepreneurial processes. A systematic review method was adopted for selecting relevant academic materials. The theoretical base of this paper was anchored on Schumpeter's entrepreneurial innovation model and Drucker and Stevenson's opportunity-based entrepreneurship theory. Based on the reviewed literature, it was discovered that a rough business idea ('opportunity') in any form – technique or product – encounters various obstacles on the way to its development, acceptability and sustainability. In essence, the findings revealed that the birth of every opportunity results from individual/organisational and environmental factors that enable it to scale through the whole process successfully. In view of these outcomes, it is recommended that organisations and government endeavour to create an enabling environment for a rough business idea to come to life amidst the hurdles of the entrepreneurial process.

Keywords: entrepreneurial process, entrepreneurship, opportunity, opportunity development, organisation, sustainability

Procedia PDF Downloads 236