Search results for: time domain reflectometry (TDR)
17670 Distributed Cost-Based Scheduling in Cloud Computing Environment
Authors: Rupali, Anil Kumar Jaiswal
Abstract:
Cloud computing is one of the prominent technologies that lets a user configure, change, and access services online. It is a computing paradigm that saves users both cost and time, and its practical applications span fields such as education, health, and banking. Because cloud computing is an internet-dependent technology, Cloud Service Providers (CSPs) carry the main responsibility for taking care of the data users store at their data centers. Scheduling plays a vital role in a cloud computing environment: to achieve maximum utilization and user satisfaction, cloud providers need to schedule resources effectively. This work analyzes job scheduling for cloud computing; CloudSim 3.0.3 is used to model the tasks and evaluate the distributed scheduling methods. The work discusses job scheduling for a distributed computing environment and, by exploring this issue, shows that the proposed approach operates with minimum time and lower cost. Two load balancing techniques are employed, ‘Throttled stack adjustment policy’ and ‘Active VM load balancing policy’, together with two brokerage services, ‘Advanced Response Time’ and ‘Reconfigure Dynamically’, to evaluate VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
Keywords: physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model
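As a rough illustration of why the dispatch policy matters, the contrast between round-robin assignment and load-aware assignment can be sketched in a few lines. This is a toy stand-in, not CloudSim: the VM count, the job lengths, and the simplified "least-loaded" policy used here in place of the Active VM load balancing policy are all invented for illustration.

```python
# Toy comparison of two VM dispatch policies (not CloudSim; all numbers invented).

def round_robin(jobs, n_vms):
    """Assign job durations to VMs cyclically; return per-VM total load."""
    loads = [0.0] * n_vms
    for i, d in enumerate(jobs):
        loads[i % n_vms] += d
    return loads

def least_loaded(jobs, n_vms):
    """Always send the next job to the currently least-loaded VM."""
    loads = [0.0] * n_vms
    for d in jobs:
        loads[loads.index(min(loads))] += d
    return loads

jobs = [5, 1, 1, 1, 4, 1, 1, 1]            # job lengths in arbitrary time units
rr_makespan = max(round_robin(jobs, 2))    # finish time of the busiest VM
ll_makespan = max(least_loaded(jobs, 2))
```

Even in this toy setting, the load-aware policy shortens the makespan (8 vs. 11 time units here); the Response Time and Data Processing Time metrics above capture the same kind of effect at much finer granularity.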
Procedia PDF Downloads 168
17669 A Non-Linear Damage Model for the Annulus of the Intervertebral Disc under Cyclic Loading, Including Recovery
Authors: Shruti Motiwale, Xianlin Zhou, Reuben H. Kraft
Abstract:
Military and sports personnel are often required to wear heavy helmets for extended periods of time. This leads to excessive cyclic loads on the neck and an increased chance of injury. Computational models offer one approach to understanding and predicting the time progression of disc degeneration under severe cyclic loading. In this paper, we have applied an analytic non-linear damage evolution model to estimate damage evolution in an intervertebral disc due to cyclic loads over decade-long time periods. We have also proposed a novel strategy for the inclusion of recovery in the damage model. Our results show that damage grows by only 20% in the initial 75% of the life and then grows exponentially in the remaining 25%. The analysis also shows that it is crucial to include recovery in a damage model.
Keywords: cervical spine, computational biomechanics, damage evolution, intervertebral disc, continuum damage mechanics
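The flavor of such a damage-evolution rule, slow growth early and acceleration near failure with partial recovery during rest, can be sketched numerically. This is a hypothetical model: the increment `dD0`, the exponent `p`, and the recovery fraction below are invented and do not reproduce the paper's calibrated law.

```python
# Hypothetical non-linear damage evolution with recovery (illustrative constants).

def evolve(cycles, dD0=1e-4, p=3.0, recovery=0.02, rest_every=1000):
    """Accumulate damage D in [0, 1] over loading cycles; return (cycles run, D)."""
    D = 0.0
    for n in range(1, cycles + 1):
        D += dD0 / (1.0 - D) ** p        # growth accelerates as D approaches 1
        if n % rest_every == 0:
            D *= (1.0 - recovery)        # partial recovery during a rest period
        if D >= 1.0:
            return n, 1.0                # failure: report cycles to failure
    return cycles, D

n_fail, D_end = evolve(50_000)
```

With these invented constants the trajectory shows the qualitative behavior reported above: damage stays small for most of the life and then runs away to failure over a short final fraction of cycles.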
Procedia PDF Downloads 568
17668 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements
Authors: Ebru Turgal, Beyza Doganay Erdogan
Abstract:
Machine learning aims to model the relationship between a response and features. Medical decision-making researchers would like to make decisions about patients’ course and treatment by examining repeated measurements over time. The boosting approach is now being used in machine learning as an influential tool for these aims. The aim of this study is to demonstrate the use of multivariate tree boosting in this field. The main reason for utilizing this approach in decision-making is the ease with which it models complex relationships. To show how the multivariate tree boosting method can be used to identify important features and feature-time interactions, we used data collected retrospectively from Ankara University Chest Diseases Department records. The dataset includes repeated PF-ratio measurements, with a planned follow-up time of 120 hours. A set of different models was tested. In conclusion, classification by a weighted combination of classifiers is a reliable method, as has been shown several times in simulations. Furthermore, time-varying variables can be taken into consideration within this framework, making it possible to make accurate decisions in regression and survival problems.
Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data
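The core boosting idea, repeatedly fitting a weak tree to the current residuals and adding a shrunken copy of it to the ensemble, can be sketched with depth-1 "stumps" on a single feature. This is a univariate toy, not the multivariate tree-boosting software used in the study, and the follow-up times and PF-ratio values below are invented.

```python
# Minimal gradient boosting with regression stumps (toy, single feature).

def fit_stump(x, residual):
    """Best single split minimizing squared error; returns a predictor function."""
    best = None
    for s in sorted(set(x)):
        left = [r for xi, r in zip(x, residual) if xi <= s]
        right = [r for xi, r in zip(x, residual) if xi > s]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, s, lm, rm)
    _, s, lm, rm = best
    return lambda xi: lm if xi <= s else rm

def boost(x, y, rounds=400, lr=0.1):
    """Fit stumps to residuals; each round adds lr * stump to the ensemble."""
    pred, stumps = [0.0] * len(y), []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        h = fit_stump(x, resid)
        stumps.append(h)
        pred = [pi + lr * h(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * h(xi) for h in stumps)

x = [0, 24, 48, 72, 96, 120]           # follow-up hours (invented)
y = [300, 280, 250, 200, 180, 170]     # invented PF-ratio trajectory
model = boost(x, y)
```

The shrinkage factor `lr` is the usual boosting trade-off: smaller values need more rounds but regularize the fit, which matters far more on real repeated-measurements data than on this six-point toy.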
Procedia PDF Downloads 203
17667 ‘Ethical Relativism’ in Offshore Business: A Critical Assessment
Authors: Biswanath Swain
Abstract:
Ethical relativism, as an ethical perspective, holds that the moral worth of a course of action depends on a particular space and time: the moral rightness or wrongness of a course of action varies from space to space and from time to time. In short, ethical relativism holds that morality is relative to context. If we reflect conscientiously on the scope of this perspective, we find that it is widespread amongst marketers involved in offshore business. The irony, however, is that most of the marketers who have gone along with ethical relativism in their offshore business have been found to be unsuccessful, suffering losses in market share and bankruptcy. The upshot is purely self-defeating for the marketers. GSK in China and Nestle Maggi in India are burning examples of this sort. The paper argues and recommends that a marketer should instead have recourse to the Kantian ethical perspective when deliberating courses of action sensitive to offshore business, as the Kantian perspective is logically and methodologically sound.
Keywords: business, course of action, Kant, morality, offshore, relativism
Procedia PDF Downloads 303
17666 Identification and Characterization of Novel Genes Involved in Quinone Synthesis in the Odoriferous Defensive Stink Glands of the Red Flour Beetle, Tribolium castaneum
Authors: B. Atika, S. Lehmann, E. Wimmer
Abstract:
Defense strategies are very common in the insect world. Defensive substances serve a wide variety of functions for beetles, acting as repellents, toxicants, insecticides, and antimicrobials. Beetles react to predators, invaders, and parasitic microbes with the release of toxic and repellent substances, which are directed against a large array of potential target organisms or may function in boiling bombardment or as surfactants. Usually, Coleoptera biosynthesize and store their defensive compounds in a complex secretory organ known as the odoriferous defensive stink gland. The red flour beetle, Tribolium castaneum (Coleoptera: Tenebrionidae), uses these glands to produce antimicrobial p-benzoquinones and 1-alkenes. The morphology of the stink gland has been studied in detail in tenebrionid beetles; however, very little is known about the genes involved in the production of the gland secretion. Here, we studied a subset of genes that are essential for benzoquinone production in the red flour beetle. In the first phase, we selected 74 potential candidate genes from a genome-wide RNA interference (RNAi) knockdown screen named 'iBeetle'. All 74 candidate genes were functionally characterized by RNAi-mediated gene knockdown and selected for subsequent gas chromatography-mass spectrometry (GC-MS) analysis of secretion volatiles in the respective knockdown glands. 33 of them were observed to alter the stink gland phenotype. In the GC-MS analysis, 7 candidate genes displayed strongly altered glands upon knockdown, in terms of secretion color and chemical composition, showing their key role in the biosynthesis of the gland secretion. Morphologically altered stink glands were found for an odorant receptor and a protein kinase superfamily member.
Subsequent GC-MS analysis of secretion volatiles revealed reduced benzoquinone levels in LIM domain, PDZ domain, and PBP/GOBP family knockdowns, and a complete lack of benzoquinones in the knockdowns of sulfatase-modifying factor enzyme 1 and a sulfate transporter family gene. Based on stink gland transcriptome data, we analyzed the function of sulfatase-modifying factor enzyme 1 and the sulfate transporter family gene via RNAi-mediated gene knockdown, GC-MS, in situ hybridization, and enzymatic activity assays. Morphologically altered stink glands were noted in knockdowns of both genes, and GC-MS analysis of secretion volatiles showed a complete lack of benzoquinones. In situ hybridization showed that both genes are expressed around the vesicles of a certain subgroup of secretory stink gland cells, and enzymatic activity assays on stink gland tissue showed that they are involved in p-benzoquinone biosynthesis. These results suggest that sulfatase-modifying factor enzyme 1 and the sulfate transporter family gene play a role specifically in benzoquinone biosynthesis in red flour beetles.
Keywords: red flour beetle, defensive stink gland, benzoquinones, sulfate transporter, sulfatase-modifying factor enzyme 1
Procedia PDF Downloads 155
17665 Foggy Image Restoration Using Neural Network
Authors: Khader S. Al-Aidmat, Venus W. Samawi
Abstract:
Blurred vision in a misty atmosphere is an essential problem that needs to be resolved. To solve this problem, we developed a technique to restore the original scene from its foggy degraded version using a back-propagation neural network (BP-NN). The suggested technique is based on a mapping between a foggy scene and its corresponding original scene. Seven different approaches are suggested, based on the type of features used in image restoration. Features are extracted from the spatial and spatial-frequency domains (using the DCT). Each of these approaches comes with its own BP-NN architecture, depending on the type and number of features used. The weight matrix resulting from training each BP-NN represents a fog filter. The performance of these filters is evaluated empirically (using PSNR) and perceptually. By comparing the performance of these filters, the effective features that suit the BP-NN technique for restoring foggy images are recognized. The system proved its effectiveness and success in restoring moderately foggy images.
Keywords: artificial neural network, discrete cosine transform, feed forward neural network, foggy image restoration
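The empirical score used above, PSNR, is simple enough to state in code. A minimal sketch for 8-bit grayscale images stored as lists of pixel rows follows; the two tiny 2x2 "images" are invented purely to exercise the formula.

```python
# PSNR for 8-bit grayscale images represented as lists of rows (illustrative).

import math

def psnr(original, restored, peak=255.0):
    """10 * log10(peak^2 / MSE); infinity when the images are identical."""
    flat_o = [p for row in original for p in row]
    flat_r = [p for row in restored for p in row]
    mse = sum((a - b) ** 2 for a, b in zip(flat_o, flat_r)) / len(flat_o)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(peak ** 2 / mse)

a = [[100, 110], [120, 130]]
b = [[101, 111], [121, 131]]   # every pixel off by 1, so MSE = 1
```

With MSE = 1 the score is 10 * log10(255^2), about 48.1 dB; higher PSNR means the restored image is closer to the reference.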
Procedia PDF Downloads 382
17664 Evaluating the Use of Digital Art Tools for Drawing to Enhance Artistic Ability and Improve Digital Skill among Junior School Students
Authors: Aber Salem Aboalgasm, Rupert Ward
Abstract:
This study investigated some results of the use of digital art tools by junior school children in order to discover if these tools could promote artistic ability and creativity. The study considers the ease of use and usefulness of the tools as well as how to assess artwork produced by digital means. As the use of these tools is a relatively new development in Art education, this study may help educators in their choice of which tools to use and when to use them. The study also aims to present a model for the assessment of students’ artistic development and creativity by studying their artistic activity. This model can help in determining differences in students’ creative ability and could be useful both for teachers, as a means of assessing digital artwork, and for students, by providing the motivation to use the tools to their fullest extent. Sixteen students aged nine to ten years old were observed and recorded while they used the digital drawing tools. The study found that, according to the students’ own statements, it was not the ease of use but the successful effects the tools provided which motivated the children to use them.
Keywords: artistic ability, creativity, drawing digital tool, TAM model, psychomotor domain
Procedia PDF Downloads 330
17663 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze, and propose alternate design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. This proposal is expected to serve as a failsafe real-time diagnostic tool for accident investigation and for locating debris in real time. In this paper, an attempt is made to improve existing flight data recording techniques and the design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern communications and information technologies with large bandwidth are now available, coupled with faster computer processing techniques, the failsafe recording technique developed in this paper is feasible. Further, data fusion and data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
17662 The Optimization Process of Aortic Heart Valve Stent Geometry
Authors: Arkadiusz Mezyk, Wojciech Klein, Mariusz Pawlak, Jacek Gnilka
Abstract:
Aortic heart valve stents should fulfill many criteria, and these criteria have a strong impact on the geometrical shape of the stent. Usually, the final construction of a stent is the result of many years of experience and knowledge. Depending on patent claims, different stent shapes are produced by different companies, which causes difficulties for biomechanics engineers by narrowing the domain of feasible solutions. The paper presents an optimization method for stent geometry defined by a specific analytical equation based on various mathematical functions. This formula was implemented as an APDL script in the ANSYS finite element environment. For the purpose of the simulation tests, a few parameters were separated from the developed equation. The application of genetic algorithms allows finding the best solution with respect to the selected objective function. The obtained solution takes into account parameters such as radial force, compression ratio, and the coefficient of expansion along the transverse axis.
Keywords: aortic stent, optimization process, geometry, finite element method
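The optimization loop described above can be caricatured with a tiny genetic algorithm searching a three-parameter shape vector. Everything here is a hedged stand-in: the quadratic `objective` replaces the ANSYS finite-element evaluation of radial force, compression ratio, and expansion coefficient, and the population sizes and mutation scale are invented.

```python
# Toy genetic algorithm over a 3-parameter design vector (invented objective).

import random

random.seed(0)

def objective(p):
    """Stand-in for FE-based fitness; minimum at p = (1.0, 2.0, 0.5)."""
    return (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2 + (p[2] - 0.5) ** 2

def ga(pop_size=30, gens=100, bounds=(0.0, 3.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]              # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 3)            # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(3)                 # mutate one gene, clamp to bounds
            child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=objective)

best = ga()
```

Keeping the parents in the next generation (elitism) guarantees the best design never regresses, which is a common safeguard when each fitness evaluation is an expensive finite-element run.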
Procedia PDF Downloads 281
17661 Exploring Non-Native English Language Teachers' Understandings and Attitudes towards the Integration of Intercultural Competence
Authors: Simin Sasani
Abstract:
This study will explore a group of English language teachers’ understanding of intercultural competence to find out whether they are aware of the concept and how important it is to them. It will investigate how concerned they are about the challenges that learners might face in their intercultural communications, and to what extent they can help learners overcome barriers and increase their insight into cultural differences. In addition, it will explore how a group of non-native English language teachers define culture in relation to their English language teaching practices. More specifically, the research considers the how and why of the inclusion of intercultural competence, and how non-native teachers think they can improve their learners’ knowledge and skills in this domain. The study will be conducted in the UK, and the participants are eight non-native English language teachers who are currently teaching general English language courses for foreigners. A pilot study has been conducted for this research; its results show that three non-native English teachers are aware of the notion, although they have not had any formal education about intercultural competence. Their challenges and limitations were also highlighted through interviews and observations.
Keywords: English, English language teachers, intercultural communications, intercultural competence, non-natives
Procedia PDF Downloads 466
17660 Analysis of Q-Learning on Artificial Neural Networks for Robot Control Using Live Video Feed
Authors: Nihal Murali, Kunal Gupta, Surekha Bhanot
Abstract:
Training of artificial neural networks (ANNs) using reinforcement learning (RL) techniques is widely discussed in the robot learning literature. The high model complexity of ANNs along with the model-free nature of RL algorithms provides a desirable combination for many robotics applications. There is a huge need for algorithms that generalize from raw sensory inputs, such as vision, without any hand-engineered features or domain heuristics. In this paper, the standard control problem of a line-following robot was used as a test-bed, and an ANN controller for the robot was trained on images from a live video feed using Q-learning. A virtual agent was first trained in a simulation environment and then deployed onto the robot’s hardware. The robot successfully learns to traverse a wide range of curves and displays excellent generalization ability. Qualitative analysis of the evolution of policies, performance, and weights of the network provides insights into the nature and convergence of the learning algorithm.
Keywords: artificial neural networks, q-learning, reinforcement learning, robot learning
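The Q-learning update at the heart of the training can be shown in tabular form. The paper trains a deep network on raw video frames; here a one-dimensional "line offset" state stands in for the camera input so that the update rule itself is easy to see. States, actions, rewards, and hyperparameters are all invented for this sketch.

```python
# Tabular Q-learning on a toy line-following task (invented MDP).

import random

random.seed(1)
STATES = range(-2, 3)        # line offset as seen by the robot (-2 .. 2)
ACTIONS = (-1, 0, 1)         # steer left / straight / right
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1

def step(s, a):
    """Deterministic transition; reward favors staying centred on the line."""
    s2 = max(-2, min(2, s + a))
    return s2, (1.0 if s2 == 0 else -abs(s2))

for episode in range(5000):
    s = random.choice(list(STATES))
    for _ in range(10):                            # short episode
        if random.random() < eps:
            a = random.choice(ACTIONS)             # epsilon-greedy exploration
        else:
            a = max(ACTIONS, key=lambda b: Q[(s, b)])
        s2, r = step(s, a)
        # Q-learning update: bootstrap from the best next-state action value
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

policy = {s: max(ACTIONS, key=lambda b: Q[(s, b)]) for s in STATES}
```

After training, the greedy policy steers back toward the center from either side, the tabular analogue of the curve-traversal behavior the ANN controller learns from video.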
Procedia PDF Downloads 372
17659 The Foundation Binary-Signals Mechanics and Actual-Information Model of Universe
Authors: Elsadig Naseraddeen Ahmed Mohamed
Abstract:
In contrast to the uncertainty and complementarity principles, the present paper shows that the probability of the simultaneous occupation of any definite values of coordinates by any definite values of momentum and energy, at any definite instant of time, can be described by a binary definite function. This function is equivalent to the difference between the numbers of occupation and evacuation epochs up to that time, and also to the number of exchanges between those occupation and evacuation epochs up to that time, modulo two. These binary definite quantities can be defined at every point on the real line of time, so they form a binary signal that represents a complete mechanical description of physical reality. The times of these exchanges mark the boundaries of the occupation and evacuation epochs, from which the binary signals can be calculated, using the fact that the universe's events actually extend along the positive and negative real line of time in one direction of extension as the number of exchanges increases. There therefore exists a noninvertible transformation matrix, definable as the matrix product of an invertible rotation matrix and a noninvertible scaling matrix, which changes the direction and magnitude of the exchange-event vector, respectively. These noninvertible transformations will be called actual transformations, in contrast to information transformations, by which we can navigate the universe's events (themselves produced by actual transformations) backward and forward along the real line of time; the information transformations will be derived as elements of a group that can be associated with their corresponding actual transformations.
The actual and information model of the universe will be derived by assuming the existence of a time instant zero, before and at which no coordinate is occupied by any definite values of momentum and energy; after that time, the universe begins expanding in spacetime. This assumption makes superfluous the need for Laplace's demon, who at one moment could measure the positions and momenta of all constituent particles of the universe and then use the laws of classical mechanics to predict all future and past events of the universe. We only need to establish analog-to-digital converters to sense the binary signals that determine the boundaries of the occupation and evacuation epochs of the definite values of coordinates, relative to their origin, by the definite values of momentum and energy, as present events of the universe; from these we can predict its past and future events approximately, with high precision.
Keywords: binary-signal mechanics, actual-information model of the universe, actual-transformation, information-transformation, uncertainty principle, Laplace's demon
Procedia PDF Downloads 175
17658 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a rather wide research activity, in collaboration with the Euro-Mediterranean Center on Climate Change, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
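The variational problem being parallelized can be illustrated in its simplest form. The sketch below assumes a diagonal background covariance, a diagonal observation covariance, and an identity observation operator, and minimizes the cost by plain gradient descent; the real DD-DA implementation uses domain decomposition and CUDA kernels, none of which is reproduced here.

```python
# Minimal 3DVar sketch: J(x) = (x - xb)^2 / (2*b_var) + (x - y)^2 / (2*r_var),
# minimized per component by gradient descent (diagonal B and R, H = identity).

def threedvar(xb, y, b_var, r_var, iters=500, lr=0.1):
    """Return the analysis state minimizing the simplified 3DVar cost."""
    x = list(xb)
    for _ in range(iters):
        for i in range(len(x)):
            grad = (x[i] - xb[i]) / b_var + (x[i] - y[i]) / r_var
            x[i] -= lr * grad
    return x

xb = [10.0, 12.0]   # background (model forecast), invented values
y = [14.0, 12.0]    # observations, invented values
xa = threedvar(xb, y, b_var=1.0, r_var=1.0)
```

In this diagonal case the analysis is just the precision-weighted mean of background and observation, so with equal variances the first component converges to (10 + 14) / 2 = 12; domain decomposition splits exactly this kind of minimization across subdomains.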
Procedia PDF Downloads 412
17657 Computational Approach to Identify Novel Chemotherapeutic Agents against Multiple Sclerosis
Authors: Syed Asif Hassan, Tabrej Khan
Abstract:
Multiple sclerosis (MS) is a chronic demyelinating autoimmune disorder of the central nervous system (CNS). In the present scenario, current therapies either do not halt the progression of the disease or have side effects that limit the use of current Disease Modifying Therapies (DMTs) over longer periods. Therefore, given this failure of current treatments, we focus on screening novel analogues of the available DMTs that specifically bind and inhibit the sphingosine-1-phosphate receptor 1 (S1PR1), thereby hindering lymphocyte propagation toward the CNS. The novel drug-like analog molecules should decrease the frequency of relapses (recurrence of the symptoms associated with MS) with higher efficacy and lower toxicity to the human system. In this study, an integrated approach was employed, involving a ligand-based virtual screening protocol (Ultrafast Shape Recognition with CREDO Atom Types, USRCAT) to identify non-toxic drug-like analogs of the approved DMTs. The potency of the drug-like analog molecules to cross the Blood Brain Barrier (BBB) was estimated. In addition, molecular docking and simulation using AutoDock Vina 1.1.2 and GOLD 3.01 were performed using the X-ray crystal structure of the Mtb LprG protein to calculate the affinity and specificity of the analogs for the given LprG protein. The docking results were further confirmed by DSX (DrugScore eXtended), a robust program for evaluating the binding energy of ligands bound to the ligand-binding domain of the Mtb LprG lipoprotein; a ligand with a higher hypothetical affinity has a greater negative value. Further, non-specific ligands were screened out using the structural filter proposed by Baell and Holloway. Based on the USRCAT, Lipinski values, toxicity, and BBB analyses, the drug-like analogs of fingolimod and BG-12 showed that RTL and CHEMBL1771640, respectively, are non-toxic and permeable to the BBB.
The successful docking and DSX analyses showed that RTL and CHEMBL1771640 could bind to the binding pocket of the human S1PR1 receptor protein with greater affinity than their parent compound (fingolimod). In this study, we also found that all the drug-like analogs of the standard MS drugs passed the Baell and Holloway filter.
Keywords: antagonist, binding affinity, chemotherapeutics, drug-like, multiple sclerosis, S1PR1 receptor protein
Procedia PDF Downloads 256
17656 Applying Lean Six Sigma in an Emergency Department of a Private Hospital
Authors: Sarah Al-Lumai, Fatima Al-Attar, Nour Jamal, Badria Al-Dabbous, Manal Abdulla
Abstract:
Today, many commonly used industrial engineering tools and techniques are being applied in hospitals around the world with the goal of producing a more efficient and effective healthcare system. A common quality improvement methodology known as Lean Six Sigma has been successful in manufacturing industries and, recently, in healthcare. The objective of our project is to use the Lean Six Sigma methodology to reduce waiting time in the Emergency Department (ED) of a local private hospital. Furthermore, a comprehensive literature review was conducted to evaluate the success of Lean Six Sigma in the ED. According to a study conducted at Ibn Sina Hospital in Morocco, the most common problem patients complain about is waiting time. To ensure patient satisfaction, many hospitals, such as North Shore University Hospital, were able to reduce waiting time by up to 37% using Lean Six Sigma. Other hospitals, such as the Johns Hopkins medical center, used Lean Six Sigma successfully to enhance overall patient flow, which ultimately decreased waiting time. Furthermore, it was found that capacity constraints, such as staff shortages and lack of beds, were among the main reasons behind long waiting times. With the use of Lean Six Sigma and bed management, hospitals like Memorial Hermann Southwest Hospital were able to reduce patient delays. Moreover, in order to successfully implement Lean Six Sigma in our project, two common methodologies were considered: DMAIC and DMADV. After assessing both methodologies, DMAIC was found to be the more suitable approach for our project because it is concerned with improving an already existing process. Despite its many successes, Lean Six Sigma has its limitations, especially in healthcare, but these limitations can be minimized if properly approached.
Keywords: lean six sigma, DMAIC, hospital, methodology
Procedia PDF Downloads 496
17655 Electronic Media and Physical Activity of Primary School Children
Authors: Srna Jenko Miholic, Marta Borovec, Josipa Persun
Abstract:
The constant expansion of technology has further accelerated the development of the media, and vice versa. Although the media are promoted for their many interesting and positive sides, their negative effects continue to be researched and documented. Young people, and children from the earliest age, turn to the media the most, so it is necessary to defend the role of adults, such as parents, teachers, and the wider environment, against virtual co-educators such as the media. The aim of this study was to determine how much time primary school children spend using electronic media, as well as their involvement in certain physical activities, and to determine what happens when parents restrict their children's access to electronic media and encourage them to participate in alternative content during their leisure time. The results reveal that a high percentage of parents restrict their children's access to electronic media and then encourage children to socialize with family and friends, spend time outdoors, engage in physical activity, read books, or learn something unrelated to school content, even though these are not the children's favorite activities. The results highlight the importance of parental control over children's use of electronic media and the positive effects that parental control has in encouraging children toward useful, socially desirable, physically active, and healthy activities.
Keywords: elementary school, digital media, leisure time, parents, physical engagement
Procedia PDF Downloads 147
17654 The Fast Diagnosis of Acanthamoeba Keratitis Using Real-Time PCR Assay
Authors: Fadime Eroglu
Abstract:
The genus Acanthamoeba belongs to the kingdom Protozoa, and its members are known as free-living amoebae. Acanthamoeba has been isolated from human bodies, swimming pools, bottled mineral water, contact lens solutions, dust, and soil. Members of the genus cause Acanthamoeba keratitis, a painful, sight-threatening disease of the eyes whose prevalence has been reported at a high rate in recent years. Eight different Acanthamoeba species are known to be involved in Acanthamoeba keratitis; these species include Acanthamoeba castellanii, Acanthamoeba polyphaga, Acanthamoeba griffini, Acanthamoeba hatchetti, Acanthamoeba culbertsoni, and Acanthamoeba rhysodes. The conventional diagnosis of Acanthamoeba keratitis has relied on cytological preparations and growth of Acanthamoeba in culture; however, molecular methods such as real-time PCR have been found to be more sensitive. In the past decade, real-time PCR has emerged as an effective method for more rapid testing in the diagnosis of infectious disease. Therefore, a real-time PCR assay for the detection of Acanthamoeba keratitis and Acanthamoeba species was developed in this study. The 18S rRNA sequences of Acanthamoeba species were obtained from the National Center for Biotechnology Information, and the sequences were aligned with the MEGA 6 program. Primers and a probe were designed using the Custom Primers-OligoPerfect™ Designer (Thermo Fisher Scientific, Waltham, MA, USA) and assayed for hairpin formation and the degree of primer-dimer formation with Multiple Primer Analyzer (Thermo Fisher Scientific, Waltham, MA, USA). Eight different ATCC Acanthamoeba species were obtained, and DNA was extracted using the Qiagen Mini DNA extraction kit (Qiagen, Hilden, Germany). The DNA of the Acanthamoeba species was analyzed using the newly designed primer and probe set in the real-time PCR assay.
Early definitive laboratory diagnosis of Acanthamoeba keratitis and rapid initiation of suitable therapy are necessary for a good clinical prognosis. The results of the study show that the new primers and probe can be used to detect and distinguish Acanthamoeba species. These newly developed methods are helpful for the diagnosis of Acanthamoeba keratitis.
Keywords: Acanthamoeba keratitis, Acanthamoeba species, fast diagnosis, real-time PCR
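Two of the routine checks that primer-design tools perform, GC content and a rough melting temperature, are simple enough to sketch. The Wallace rule below (Tm = 2(A+T) + 4(G+C)) is only a rough estimate valid for short oligos, and the 12-mer sequence is hypothetical, not one of the study's primers.

```python
# Basic primer sanity checks: GC content and Wallace-rule melting temperature.

def gc_content(seq):
    """Percentage of G and C bases in the sequence."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

def wallace_tm(seq):
    """Rough Tm in deg C for short oligos: 2*(A+T) + 4*(G+C)."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

primer = "ATGGCGTACGTT"   # hypothetical 12-mer, not from the study
```

Design tools such as the one used above combine checks like these with hairpin and primer-dimer screening before a primer/probe set is accepted.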
Procedia PDF Downloads 120
17653 Comparative Analysis of Different Land Use Land Cover (LULC) Maps in WRF Modelling Over Indian Region
Authors: Sen Tanmoy, Jain Sarika, Panda Jagabandhu
Abstract:
Studies regarding the impact of urbanization using the WRF-ARW model rely heavily on the static geographical information selected, including the domain configuration and land use land cover (LULC) data. Accurate representation of LULC data provides essential information for understanding urban growth and for simulating meteorological parameters such as temperature and precipitation. Researchers use different LULC datasets according to availability and their requirements. As far as India is concerned, resources and data availability are very limited, so it is important to understand how to optimize results using limited LULC data. In this review article, we explore how LULC maps are generated from different sources in the Indian context and their significance in WRF-ARW modeling for studying urbanization, climate change, or other meteorological parameters. Bibliometric analyses were also performed based on countries of study and indexed keywords. Finally, some key points are marked out for selecting the most suitable LULC map for any urbanization-related study.
Keywords: LULC, LULC mapping, LANDSAT, WRF-ARW, ISRO, bibliometric analysis
Procedia PDF Downloads 27
17652 Designing an App to Solve Surveying Challenges
Authors: Ali Mohammadi
Abstract:
Forming and equipping a team of surveyors for construction projects such as dams, roads, and tunnels is always one of the first challenges: hiring surveyors who are proficient in reading maps and controlling structures, purchasing appropriate surveying equipment within the employer's budget, and adopting methods that save time. The bigger the project, the more these challenges show themselves: finding a surveyor engineer who can lead the teams and train the other surveyors, and buying total stations according to the company's budget, the surveyors' ability to use them, and the time available to each team. In what follows, we introduce a surveying app and examine how to use it, showing how useful it can be for surveyors on projects.
Keywords: DTM CUTFILL, data transfer, section, tunnel, traverse
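One computation such an app typically automates, suggested by the cut/fill and section keywords above, is earthwork volume between cross sections by the average-end-area method: V = L * (A1 + A2) / 2 between each pair of adjacent sections. The section areas and station spacing below are made up for illustration, and this is only a guess at one feature, not a description of the actual app.

```python
# Earthwork volume between cross sections by the average-end-area method.

def end_area_volume(areas, spacing):
    """Sum L*(A1+A2)/2 over consecutive cross-section pairs."""
    return sum(spacing * (a1 + a2) / 2.0
               for a1, a2 in zip(areas, areas[1:]))

cut_areas = [12.0, 15.0, 9.0]            # cut areas in m^2 at stations 20 m apart
volume = end_area_volume(cut_areas, 20.0)  # total cut volume in m^3
```

For the invented sections this gives 20 * (12 + 15) / 2 + 20 * (15 + 9) / 2 = 510 m^3; automating exactly this kind of repetitive station-by-station arithmetic is where an app saves a field crew time.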
Procedia PDF Downloads 8217651 Knowledge Creation and Diffusion Dynamics under Stable and Turbulent Environment for Organizational Performance Optimization
Authors: Jessica Gu, Yu Chen
Abstract:
Knowledge Management (KM) is undoubtedly crucial to organizational value creation, learning, and adaptation. Although the rapidly growing KM domain has been fueled with full-fledged methodologies and technologies, studies on KM evolution that bridge organizational performance and adaptation to the organizational environment are still rarely attempted. In particular, the creation (or generation) and diffusion (or sharing/exchange) of knowledge are among an organization's primary concerns from a problem-solving perspective; however, the optimal distribution of knowledge creation and diffusion efforts is still unknown to knowledge workers. This research proposes an agent-based model of knowledge creation and diffusion in an organization, aiming to elucidate how intertwining knowledge flows at the microscopic level lead to optimized organizational performance at the macroscopic level through evolution, and to explore which exogenous interventions by the policy maker and endogenous adjustments by the knowledge workers better cope with different environmental conditions. With the developed model, a series of simulation experiments is conducted. Both long-term steady-state and time-dependent developmental results are obtained on organizational performance, network and structure, social interaction and learning among individuals, knowledge audit and stocktaking, and the likelihood of knowledge workers choosing creation versus diffusion. One of the interesting findings reveals a non-monotonic trend in organizational performance under a turbulent environment but a monotonic one under a stable environment. Hence, whether the environment is turbulent or stable, the most suitable exogenous KM policy and endogenous knowledge creation and diffusion adjustments can be identified for achieving optimized organizational performance.
Additional influential variables are further discussed, and future work directions are elaborated. The proposed agent-based model generates evidence on how knowledge workers strategically allocate effort between knowledge creation and diffusion, how bottom-up interactions among individuals lead to emergent structure and optimized performance, and how environmental conditions challenge the organizational system. Meanwhile, it serves as a roadmap and offers macro-level, long-term insights to policy makers without interrupting real organizational operations, incurring huge overhead costs, or introducing undesired panic among employees. Keywords: knowledge creation, knowledge diffusion, agent-based modeling, organizational performance, decision making evolution
Procedia PDF Downloads 24117650 Application of Simulation of Discrete Events in Resource Management of Massive Concreting
Authors: Mohammad Amin Hamedirad, Seyed Javad Vaziri Kang Olyaei
Abstract:
Project planning and control are among the most critical issues in the management of construction projects. Traditional methods of project planning and control, such as the critical path method or the Gantt chart, are not widely used for planning projects with discrete and repetitive activities, and one of the problems project managers face is planning the implementation process and optimally allocating its resources. Massive concreting is likewise a project with discrete and repetitive activities. This study uses discrete-event simulation to manage resources, which includes finding the optimal number of resources considering various limitations, such as those of machinery, equipment, and human resources, and even technical, time, and implementation constraints, by analyzing the resource consumption rate, the project completion time, and the critical points of the implementation process. For this purpose, discrete-event simulation has been used to model the different stages of implementation. After reviewing various scenarios, the optimal allocation for each resource is determined to reach the maximum utilization rate and to reduce the project completion time or its cost subject to the existing constraints. The results showed that with optimal resource allocation, the project completion time could be reduced by 90% and the resulting costs by up to 49%. Thus, allocating the optimal number of project resources using this method will reduce both its time and its cost. Keywords: simulation, massive concreting, discrete event simulation, resource management
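The crew-allocation trade-off the abstract describes can be sketched with a minimal event-queue loop. This is not the authors' model; it is a toy sketch assuming identical crews and fixed pour durations, and all names and parameters are hypothetical:

```python
import heapq

def simulate_pours(n_blocks, n_crews, pour_hours):
    """Minimal discrete-event sketch: n_blocks concrete pours compete for
    n_crews identical crews, each pour taking pour_hours.
    Returns the project completion time."""
    free_at = [0.0] * n_crews  # event list: times at which each crew frees up
    heapq.heapify(free_at)
    finish = 0.0
    for _ in range(n_blocks):
        start = heapq.heappop(free_at)   # earliest-available crew takes the pour
        end = start + pour_hours
        finish = max(finish, end)
        heapq.heappush(free_at, end)
    return finish
```

With 12 blocks and 8-hour pours, moving from 3 to 6 crews halves the completion time; scenario sweeps of this kind, traded off against resource cost, are what the scenario review in the study automates.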
Procedia PDF Downloads 14817649 A Virtual Electrode through Summation of Time Offset Pulses
Authors: Isaac Cassar, Trevor Davis, Yi-Kai Lo, Wentai Liu
Abstract:
Retinal prostheses have been successful in eliciting visual responses in implanted subjects. As these prostheses progress, one of their major limitations is the need for increased resolution. As an alternative to increasing the number of electrodes, virtual electrodes may be used to increase the effective resolution of current electrode arrays. This paper presents a virtual electrode technique based upon time offsets between stimuli. Two adjacent electrodes are stimulated with identical pulses whose widths are too short to activate a neuron, with one pulse offset in time by one pulse width. A virtual electrode of twice the pulse width was then shown to appear in the center, with a total width capable of activating a neuron. This can be used in retinal implants by stimulating electrodes with pulse widths short enough not to elicit responses in neurons individually, but whose combined width is adequate to activate a neuron between them. Keywords: electrical stimulation, neuroprosthesis, retinal implant, retinal prosthesis, virtual electrode
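The timing argument is simple enough to state in code. A hedged sketch, assuming two identical rectangular pulses and that the midpoint between the electrodes is driven by both; units and names are illustrative, not from the paper:

```python
def charge_window(width, offset):
    """Interval over which the midpoint between two electrodes is driven,
    given two identical rectangular pulses of duration `width`, the second
    delayed by `offset`. Pulse 1 is active on [0, width), pulse 2 on
    [offset, offset + width)."""
    if offset <= width:              # contiguous or overlapping pulses
        return (0.0, offset + width)
    return None                      # a gap opens; no single stimulation window
```

With the offset equal to one pulse width, the midpoint sees an effective pulse of twice the width, long enough to cross the activation threshold even though neither pulse alone does.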
Procedia PDF Downloads 30317648 Preserved Relative Differences between Regions of Different Thermal Scans
Authors: Tahir Majeed, Michael Handschuh, René Meier
Abstract:
Rheumatoid arthritis patients have swelling and pain at the joints of the hand. The regions where the patient feels pain also show increased body temperature. Thermal cameras can be used to detect the rise in temperature of the affected regions. To monitor disease progression, rheumatoid arthritis patients must visit the clinic regularly for scanning and examination; after scanning and evaluation, the dosage of the medicine is adjusted accordingly. To monitor disease progression over time, the correspondence between images from different visits must be established. It has been observed that with low-cost thermal cameras, the thermal measurements do not remain the same over time, even within a single scanning sequence; in some situations, temperatures can vary by as much as 2°C within the same sequence. In this paper, it is shown that although the absolute temperature varies over time, the relative differences between regions remain similar. Results computed over four scanning sequences are presented. Keywords: relative thermal difference, rheumatoid arthritis, thermal imaging, thermal sensors
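The claim that relative differences survive a global temperature drift can be illustrated directly. A minimal sketch, assuming each region of interest is summarized by its mean reading and differences are taken against a chosen reference region; the data structure and names are hypothetical:

```python
def region_means(scan):
    """Mean temperature per region; `scan` maps region name -> list of readings."""
    return {r: sum(v) / len(v) for r, v in scan.items()}

def relative_differences(scan, reference):
    """Each region's mean temperature relative to a chosen reference region."""
    means = region_means(scan)
    base = means[reference]
    return {r: m - base for r, m in means.items()}
```

Two scans of the same hand that differ only by a camera-induced offset yield identical relative differences, which is what makes the relative representation comparable across visits.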
Procedia PDF Downloads 19617647 Estimating Poverty Levels from Satellite Imagery: A Comparison of Human Readers and an Artificial Intelligence Model
Authors: Ola Hall, Ibrahim Wahab, Thorsteinn Rognvaldsson, Mattias Ohlsson
Abstract:
The subfield of poverty and welfare estimation that applies machine learning tools and methods to satellite imagery is a nascent but rapidly growing one. This is in part driven by the Sustainable Development Goals, whose overarching principle is that no region is left behind. Among other things, this requires that welfare levels can be accurately and rapidly estimated at different spatial scales and resolutions. Conventional tools of household surveys and interviews do not suffice in this regard. While they are useful for gaining a longitudinal understanding of the welfare levels of populations, they do not offer adequate spatial coverage for the accuracy that is needed, nor is their implementation sufficiently swift to gain an accurate insight into people and places. It is this void that satellite imagery fills. Previously, this was near-impossible to implement due to the sheer volume of data that needed processing. Recent advances in machine learning, especially the deep learning subtype, such as deep neural networks, have made this a rapidly growing area of scholarship. Despite their unprecedented levels of performance, such models lack transparency and explainability and have thus seen limited downstream application, as humans are generally apprehensive of techniques that are not inherently interpretable and trustworthy. While several studies have demonstrated the superhuman performance of AI models, none has directly compared the performance of such models with human readers in the domain of poverty studies. In the present study, we directly compare the performance of human readers and a deep learning model using different resolutions of satellite imagery to estimate the welfare levels of Demographic and Health Survey clusters in Tanzania, using the wealth quintile ratings from the same survey as the ground truth. The cluster-level imagery covers all 608 cluster locations, of which 428 were classified as rural.
The imagery for the human readers was sourced from the Google Maps Platform at an ultra-high resolution of 0.6 m per pixel at zoom level 18, while that for the machine learning model was sourced from the comparatively lower-resolution Sentinel-2 data at 10 m per pixel for the same cluster locations. Rank correlation coefficients between 0.31 and 0.32 achieved by the human readers were much lower than those attained by the machine learning model (0.69-0.79). This superhuman performance by the model is all the more significant given that it was trained on the relatively lower 10 m resolution satellite data, while the human readers estimated welfare levels from the higher 0.6 m resolution data, in which key markers of poverty and slums, namely roofing and road quality, are discernible. It is important to note, however, that the human readers did not receive any training before rating, and had this been done, their performance might have improved. The stellar performance of the model also comes with the inevitable shortfall of limited transparency and explainability. The findings have significant implications for attaining the objective of the current frontier of deep learning models in this domain of scholarship, eXplainable Artificial Intelligence, through a collaborative rather than a comparative framework. Keywords: poverty prediction, satellite imagery, human readers, machine learning, Tanzania
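The reported rank correlations can be reproduced conceptually with Spearman's coefficient, i.e., Pearson correlation computed on ranks. The study does not publish its exact correlation procedure, so the following is a standard tie-aware implementation, not the authors' code:

```python
def ranks(values):
    """Average ranks (1-based), assigning tied values their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend the run of tied values
        avg = (i + j) / 2 + 1           # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Applied to reader or model welfare ratings against the survey wealth quintiles, values near 1 indicate near-perfect monotonic agreement, which is the scale on which 0.31 versus 0.69-0.79 should be read.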
Procedia PDF Downloads 10617646 Implementation of Real-Time Multiple Sound Source Localization and Separation
Authors: Jeng-Shin Sheu, Qi-Xun Zheng
Abstract:
This paper discusses a method of separating speech using a microphone array without knowing the number or the directions of the sound sources. In recent years, there have been many studies on separating signals using masking, but most separation methods must operate with a known number of sound sources and therefore cannot be used in real-time applications. Our method uses the circular integrated cross-spectrum to estimate the statistical histogram distribution of the direction of arrival (DOA), obtaining the number of sound sources and their directions in the mixed signal. In computing the circular integrated cross-spectrum, the phase of the cross-power spectrum and the phase rotation factors calculated for each microphone pair are used. For separation, a DOA weighting and masking method computes the sound source direction for each time-frequency (T-F) unit. The weight corresponding to each T-F unit strengthens the contribution of the target source at that unit and reduces the influence of the remaining sources, thereby achieving voice separation. Keywords: real-time, spectrum analysis, sound source localization, sound source separation
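The link between a cross-power-spectrum phase difference and a DOA estimate for one microphone pair can be sketched under a far-field, single-frequency assumption. This is the textbook relation, not the paper's circular-integrated-cross-spectrum method; the speed of sound and parameter names are assumptions:

```python
import math

def doa_from_phase(delta_phi, freq_hz, mic_spacing_m, c=343.0):
    """Far-field direction of arrival (radians, relative to broadside) for one
    microphone pair, from the phase difference delta_phi of the cross-power
    spectrum at freq_hz: delta_phi = 2*pi*f*d*sin(theta)/c."""
    s = delta_phi * c / (2 * math.pi * freq_hz * mic_spacing_m)
    s = max(-1.0, min(1.0, s))  # guard against noise pushing |sin(theta)| > 1
    return math.asin(s)
```

Accumulating such per-T-F-unit estimates into a histogram over angle is what lets the number of sources and their directions be read off as histogram peaks.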
Procedia PDF Downloads 15517645 Integration of Fuzzy Logic in the Representation of Knowledge: Application in the Building Domain
Authors: Hafida Bouarfa, Mohamed Abed
Abstract:
The main objective of our work is the development and validation of a system called Fuzzy Vulnerability. Fuzzy Vulnerability uses a fuzzy representation in order to tolerate imprecision in the description of a construction. In the second phase, we evaluate the similarity between the vulnerability of a new construction and those of the historical cases. This similarity is evaluated on two levels: 1) individual similarity, based on fuzzy aggregation techniques; 2) global similarity, which uses regular increasing monotone (RIM) linguistic quantifiers to combine the various individual similarities between two constructions. The third phase of the Fuzzy Vulnerability process uses the vulnerabilities of historical constructions closely similar to the current construction to deduce its estimated vulnerability. We validated our system using 50 cases, evaluating the performance of Fuzzy Vulnerability on two basic criteria: the precision of the estimates and the tolerance of imprecision throughout the estimation process. The comparison was done against estimates made by laborious and time-consuming models. The results are satisfactory. Keywords: case based reasoning, fuzzy logic, fuzzy case based reasoning, seismic vulnerability
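The global-similarity step can be illustrated with a standard OWA aggregation whose weights come from a RIM quantifier Q(r) = r^a (Yager's construction). The abstract does not state which quantifier the system uses, so the exponent a below is an assumption for illustration:

```python
def rim_weights(n, a=0.5):
    """OWA weights derived from the RIM quantifier Q(r) = r**a.
    a < 1 emphasises the largest similarities; a = 1 gives a plain mean."""
    q = lambda r: r ** a
    return [q(i / n) - q((i - 1) / n) for i in range(1, n + 1)]

def owa(similarities, a=0.5):
    """Global similarity: OWA aggregation of the individual similarities."""
    w = rim_weights(len(similarities), a)
    ordered = sorted(similarities, reverse=True)  # weights apply by rank
    return sum(wi * si for wi, si in zip(w, ordered))
```

Because the weights are applied by rank rather than by position, the aggregate depends only on the multiset of individual similarities, which is the behaviour a linguistic quantifier such as "most" is meant to capture.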
Procedia PDF Downloads 29217644 Users’ Information Disclosure Determinants in Social Networking Sites: A Systematic Literature Review
Authors: Wajdan Al Malwi, Karen Renaud, Lewis Mackenzie
Abstract:
The privacy paradox describes a phenomenon whereby there is no connection between stated privacy concerns and privacy behaviours. We need to understand the underlying reasons for this paradox if we are to help users preserve their privacy more effectively. In particular, the Social Networking System (SNS) domain offers a rich area of investigation due to the risks of unwise information disclosure decisions. Our study thus aims to untangle the complicated nature and underlying mechanisms of online privacy-related decisions in SNSs. In this paper, we report on the findings of a Systematic Literature Review (SLR) that revealed a number of factors likely to influence online privacy decisions. Our deductive analysis approach was informed by Communication Privacy Management (CPM) theory. We uncovered a lack of clarity around privacy attitudes and their link to behaviours, which makes it challenging to design privacy-protecting SNS platforms and to craft legislation that ensures users' privacy is preserved. Keywords: privacy paradox, self-disclosure, privacy attitude, privacy behavior, social networking sites
Procedia PDF Downloads 15517643 Degradation of Chlorpyrifos Pesticide in Aqueous Solution and Chemical Oxygen Demand from Real Effluent with Hydrodynamic Cavitation Approach
Authors: Shrikant Randhavane, Anjali Khambete
Abstract:
The use of pesticides is vital for attaining food security and for protection from harmful pests and insects in the living environment. Chlorpyrifos, an organophosphate pesticide, is widely used worldwide for various purposes. Due to its wide use and applications, its residues are found in environmental matrices and persist in nature for long durations. This has adverse effects on humans and on aquatic life. The use of different methodologies to treat this type of recalcitrant compound is the need of the hour. The paper focuses on hydrodynamic cavitation (HC), a hybrid advanced oxidation process (AOP), to degrade chlorpyrifos in aqueous solution. The results show that an optimum inlet pressure of 5 bar gave maximum degradation of 99.25% for the lower-concentration and 87.14% for the higher-concentration chlorpyrifos solution in 1 hour of treatment. Likewise, for known initial concentrations at the optimum pressure of 5 bar, degradation efficiency increases with treatment time under hydrodynamic cavitation. The potential application of HC to the removal of Chemical Oxygen Demand (COD) from real effluent, with a venturi as the cavitating device, reveals around 40% COD removal within 1 hour of treatment. Keywords: advanced oxidation process, cavitation, chlorpyrifos, COD
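The reported percentages can be related to a rate constant if pseudo-first-order kinetics are assumed, as is common in hydrodynamic cavitation studies; the abstract itself reports only percentages, so the conversion below is illustrative, not the authors' analysis:

```python
import math

def degradation_percent(c0, c):
    """Degradation efficiency (%) from initial and final concentrations."""
    return 100.0 * (1 - c / c0)

def pseudo_first_order_k(c0, c, t_min):
    """Pseudo-first-order rate constant k (1/min), assuming C = C0*exp(-k*t)."""
    return math.log(c0 / c) / t_min
```

For the lower-concentration case (99.25% degradation in 60 minutes), the implied rate constant is roughly 0.082 per minute under this assumed kinetic model.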
Procedia PDF Downloads 21917642 Real-Time Image Encryption Using a 3D Discrete Dual Chaotic Cipher
Authors: M. F. Haroun, T. A. Gulliver
Abstract:
In this paper, an encryption algorithm is proposed for real-time image encryption. The scheme employs a dual chaotic generator based on a three-dimensional (3D) discrete Lorenz attractor. Encryption is achieved using non-autonomous modulation, where the data is injected into the dynamics of the master chaotic generator. The second generator is used to permute the dynamics of the master generator using the same approach. Since the data stream can be regarded as a random source, the resulting permutations of the generator dynamics greatly increase the security of the transmitted signal. In addition, a technique is proposed to mitigate error propagation due to the finite-precision arithmetic of digital hardware. In particular, truncation and rounding errors are eliminated by employing an integer representation of the data, which can easily be implemented. The simple hardware architecture of the algorithm makes it suitable for secure real-time applications. Keywords: chaotic systems, image encryption, non-autonomous modulation, FPGA
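The idea of deriving cipher material from a discretized Lorenz trajectory can be sketched in a few lines. This toy uses floating-point forward-Euler and a plain XOR keystream, whereas the paper's scheme uses integer arithmetic, non-autonomous data injection, and a second permuting generator, all omitted here; every name and constant below is an illustrative assumption:

```python
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system (a simple discretization)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def keystream_bytes(n, state=(1.0, 1.0, 1.0), warmup=1000):
    """Derive n keystream bytes from the x-coordinate of the trajectory."""
    for _ in range(warmup):          # discard the transient
        state = lorenz_step(state)
    out = []
    for _ in range(n):
        state = lorenz_step(state)
        out.append(int(abs(state[0]) * 1e6) % 256)
    return bytes(out)

def xor_cipher(data, key):
    """XOR stream cipher; applying it twice with the same key decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))
```

Because both sides regenerate the same trajectory from a shared initial state, decryption is simply re-applying the XOR, which is why sensitivity of the trajectory to that initial state is what carries the security.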
Procedia PDF Downloads 50617641 Paradigms of Assessment, Valuation and Quantification to Trade Ecosystem Services: A Review Focusing on Mangroves and Wetlands
Authors: Rama Seth, Luise Noring, Pratim Majumdar
Abstract:
Based on an extensive literature review, this paper presents distinct approaches to valuing, quantifying, and trading ecosystem services, with particular emphasis on services provided by mangroves and wetlands. Building on diverse monetary and market-based systems for the improved allocation of natural resources, such trading and exchange-based methods can help tackle the degradation of ecosystem services in a more targeted and structured manner than is achievable with stand-alone policy and administrative regulations. Drawing on various threads of the literature, the paper proposes a platform that serves as the skeletal foundation for developing an efficient global market for ecosystem services trading. The paper bridges a significant research and practice gap by recommending how to establish an equilibrium in the biosphere via trading mechanisms, while also identifying other research gaps and future research potential in the domain of ecosystem valuation. Keywords: environment, economics, mangroves, wetlands, markets, ESG, global capital, climate investments, valuation, ecosystem services
Procedia PDF Downloads 251