Search results for: key agreement protocol
2416 A Car Parking Monitoring System Using a Line-Topology Wireless Sensor Network
Authors: Dae Il Kim, Jungho Moon, Tae Yun Chung
Abstract:
This paper presents a car parking monitoring system using a wireless sensor network. The presented sensor network has a line-shaped topology and adopts a TDMA-based protocol to allow multi-hop communication. Sensor nodes are deployed in the ground of an outdoor parking lot in such a way that each sensor node monitors one parking space. Each sensor node detects the availability of its associated parking space and transmits the detection result to a sink node via the intermediate sensor nodes between the source node and the sink node. We evaluate the feasibility of the presented sensor network and the TDMA-based communication protocol through experiments using 11 sensor nodes deployed in a real parking lot. The results show that the presented car parking monitoring system is robust to changes in the communication environment and efficient for monitoring the parking spaces of outdoor parking lots.
Keywords: multi-hop communication, parking monitoring system, TDMA, wireless sensor network
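As a hedged illustration of the entry above (the node count, slot order and data format are assumptions, not taken from the paper), a TDMA frame for a line-topology network can be sketched as follows: each node owns one transmit slot, and scheduling the farthest node first lets every sensor reading reach the sink within a single frame.

```python
# Sketch of a TDMA schedule for a line-topology WSN: node i senses its
# parking space and forwards packets toward the sink (node 0) through
# its downstream neighbor, one hop per assigned slot.

def build_schedule(num_nodes):
    """Assign one transmit slot per node, farthest node first, so that
    data sensed at the line's far end reaches the sink within one frame."""
    return {node: slot for slot, node in enumerate(range(num_nodes - 1, 0, -1))}

def run_frame(num_nodes, occupancy):
    """Simulate one TDMA frame; occupancy maps node id -> space occupied."""
    schedule = build_schedule(num_nodes)
    inbox = {n: [] for n in range(num_nodes)}    # packets held at each node
    for n in range(1, num_nodes):
        inbox[n].append((n, occupancy[n]))       # each node's own reading
    for slot in range(len(schedule)):
        sender = next(n for n, s in schedule.items() if s == slot)
        inbox[sender - 1].extend(inbox[sender])  # forward everything one hop
        inbox[sender] = []
    return dict(inbox[0])                        # readings collected at sink
```

With four nodes, a single frame delivers all three parking-space readings to the sink.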
Procedia PDF Downloads 303
2415 Comparison of Computed Tomography Dose Index, Dose Length Product and Effective Dose Among Male and Female Patients From Contrast Enhanced Computed Tomography Pancreatitis Protocol
Authors: Babina Aryal
Abstract:
Background: The diagnosis of pancreatitis is generally based on clinical and laboratory findings; however, Computed Tomography (CT) is the imaging technique of choice. In particular, Contrast Enhanced Computed Tomography (CECT), performed with an appropriate contrast medium, shows the characteristic morphological findings that allow the diagnosis of pancreatitis to be established and the extent of disease severity to be determined. The purpose of this study was to compare the Computed Tomography Dose Index (CTDI), Dose Length Product (DLP) and Effective Dose (ED) between male and female patients undergoing the CECT pancreatitis protocol. Methods: This retrospective study collected data from patients with a clinical, laboratory or ultrasonography diagnosis of pancreatitis who had undergone the CECT abdomen pancreatitis protocol. Data collection included each patient's age and gender, clinical history, individual CTDI, DLP and effective dose. Results: We retrospectively collected dose data from 150 patients, of whom 127 were male and 23 female. The values obtained from the CT console display were measured, calculated and compared to determine whether the CTDI, DLP and ED values were similar. The CTDI differed between the genders; the values recorded for females and males were 32.2087 and 37.1609, respectively. The DLP values and effective doses for the two genders did not show significant differences. Conclusion: This study concluded that there were no significant differences in DLP and ED values between the genders; however, the CTDI values of female and male patients differed.
Keywords: computed tomography, contrast enhanced computed tomography, computed tomography dose index, dose length product, effective dose
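For context on the dose quantities compared in this entry, effective dose is commonly estimated from DLP with a region-specific conversion coefficient; the sketch below uses a literature value for the adult abdomen (k ≈ 0.015 mSv per mGy·cm), which is an assumption, not a figure from the abstract.

```python
# Illustrative only (not from the study): ED is commonly estimated as
# ED = k * DLP, with k a body-region-specific conversion coefficient.
K_ABDOMEN = 0.015  # mSv per mGy*cm, adult abdomen (assumed literature value)

def effective_dose(dlp_mgy_cm, k=K_ABDOMEN):
    """Estimate effective dose (mSv) from dose-length product (mGy*cm)."""
    return dlp_mgy_cm * k
```

For example, a DLP of 800 mGy·cm would correspond to roughly 12 mSv under this coefficient.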
Procedia PDF Downloads 118
2414 Nursing System Development in Patients Undergoing Operation in 3C Ward
Authors: Darawan Augsornwan, Artitaya Sabangbal, Maneewan Srijan, Kanokarn Kongpitee, Lalida Petphai, Palakorn Surakunprapha
Abstract:
Background: Srinagarind Hospital, Ward 3C, cares for patients with head and neck cancer, congenital urological anomalies such as hypospadias, cleft lip and cleft palate, and congenital megacolon who need surgery. Undergoing surgery is a difficult time for patients and their families; they feel fear and anxiety. Nurses work closely with patients and families 24 hours a day throughout the process of patient care, so they should have good nursing skills, innovation and an efficient nursing care system to promote patients' self-care ability, reduce suffering and prevent complications. From previous nursing outcomes, we found that patients did not receive appropriate information, could not take care of their wounds, did not ambulate early after the operation and were lost to follow-up. Objective: To develop a nursing system for patients undergoing an operation. Method: This was participatory action research. The sample population was 11 nurses and 60 patients. The study was divided into three phases. Phase 1, situation review: in this phase we reviewed the clinical outcomes and the process of care from documents such as nurses' notes, and interviewed nurses, patients and families about the process of nursing care. Phase 2, focus group: with the 11 nurses, we searched guidelines for specific care and for the nursing care system, then established the protocol. In this phase we produced a protocol for giving information, a teaching protocol and teaching record, leaflets for all of the top five diseases, video media to convey information, an ambulation package, a protocol for patients with head and neck cancer, patient zoning, primary nursing, improved job descriptions for each staff level, and a program to record the number of patients and the kinds of medical procedures performed, showing nursing activity each day. Phase 3: implementation and evaluation. Result: Patients and families received appropriate information about deep breathing exercises, coughing, early ambulation after the operation, and their stay in the hospital.
Patient and family satisfaction was 95.04 percent, with appropriate job descriptions for practical nurses, nurse aides and workers. Nurse satisfaction was 95 percent. Complications were prevented. Conclusion: The nursing system is a dynamic process that uses evidence to develop nursing care. The appropriate system depends on context and needs constant monitoring.
Keywords: development, nursing system, patients undergoing operation, 3C Ward
Procedia PDF Downloads 265
2413 A Wall Law for Two-Phase Turbulent Boundary Layers
Authors: Dhahri Maher, Aouinet Hana
Abstract:
The presence of bubbles in the boundary layer introduces corrections into the log law, which must be taken into account. In this work, a logarithmic wall law is presented for bubbly two-phase flows. The wall law is based on the postulation of an additional turbulent viscosity associated with bubble wakes in the boundary layer. It contains an empirical constant, deduced from experimental data, accounting both for shear-induced turbulence interaction and for bubble non-linearity. The wall friction prediction achieved with the wall law was compared to experimental data for a turbulent boundary layer developing on a vertical flat plate in the presence of millimetric bubbles. Very good agreement between the experimental data and the numerical wall friction prediction was observed. The agreement was especially noticeable at low void fraction, when bubble-induced turbulence plays a significant role.
Keywords: bubbly flows, log law, boundary layer, CFD
Procedia PDF Downloads 278
2412 An Implementation of a Configurable UART-to-Ethernet Converter
Authors: Jungho Moon, Myunggon Yoon
Abstract:
This paper presents an implementation of a configurable UART-to-Ethernet converter using an ARM-based 32-bit microcontroller, together with a dedicated configuration program, written in Python, that runs on a PC. Various parameters pertaining to the operation of the converter can be modified by the configuration program through the Ethernet interface of the converter. The converter supports three representative asynchronous serial communication standards, RS-232, RS-422 and RS-485, and three network modes: TCP/IP server, TCP/IP client and UDP client. The TCP/IP and UDP protocols were implemented on the microcontroller using lwIP (a lightweight TCP/IP stack), an open-source TCP/IP protocol stack, and FreeRTOS, a free real-time operating system for embedded systems. Thanks to the real-time operating system, the firmware of the converter was implemented as a multi-threaded application and as a result became more modular and easier to develop. The converter provides a seamless bridge between a serial port and an Ethernet port, thereby allowing existing legacy apparatuses with no Ethernet connectivity to communicate over Ethernet.
Keywords: converter, embedded systems, ethernet, lwIP, UART
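A minimal sketch of how a PC-side tool might push settings to such a converter over TCP (the port number, JSON payload format and parameter names below are hypothetical illustrations, not taken from the paper):

```python
# Hypothetical sketch: a PC-side configuration tool sends operating
# parameters to the converter over its Ethernet interface as a small
# JSON payload and waits for an acknowledgement.
import json
import socket

def configure_converter(host, port=50000, **params):
    """Send configuration such as baud_rate, serial_mode ('RS-232',
    'RS-422', 'RS-485') or network_mode ('tcp_server', 'tcp_client',
    'udp_client'); returns the converter's raw acknowledgement bytes."""
    payload = json.dumps(params).encode()
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)
        return sock.recv(1024)
```

A call such as `configure_converter("192.168.0.10", baud_rate=115200, serial_mode="RS-485")` would then reconfigure the device in one round trip, assuming firmware that accepts this payload.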
Procedia PDF Downloads 708
2411 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks
Authors: Ather Saeed, Arif Khan, Jeffrey Gosper
Abstract:
Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real-time, hazardous, large-scale harsh environments and in medical emergencies. The loss of data can therefore be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green's and Maxwell's equations used in electrostatics and electromagnetism, together with a node organization and clustering scheme for self-stabilizing sensor networks. Green's theorem is applied to regions where the curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green's Theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs indicating where urgent intervention is required for dynamically self-stabilizing the network.
Keywords: Green's Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering
Procedia PDF Downloads 77
2410 Comparison of Two Transcranial Magnetic Stimulation Protocols on Spasticity in Multiple Sclerosis - Pilot Study of a Randomized and Blind Cross-over Clinical Trial
Authors: Amanda Cristina da Silva Reis, Bruno Paulino Venâncio, Cristina Theada Ferreira, Andrea Fialho do Prado, Lucimara Guedes dos Santos, Aline de Souza Gravatá, Larissa Lima Gonçalves, Isabella Aparecida Ferreira Moretto, João Carlos Ferrari Corrêa, Fernanda Ishida Corrêa
Abstract:
Objective: To compare two protocols of Transcranial Magnetic Stimulation (TMS) for quadriceps muscle spasticity in individuals diagnosed with Multiple Sclerosis (MS). Method: Clinical crossover study in which six adult individuals diagnosed with MS and with spasticity in the lower limbs were randomized to receive one session each of high-frequency (≥5 Hz) and low-frequency (≤1 Hz) TMS over the motor cortex (M1) hotspot for the quadriceps muscle, with a one-week interval between sessions. Spasticity was assessed with the Ashworth scale, and the latency time (ms) of the motor evoked potential (MEP) and the central motor conduction time (CMCT) of the bilateral quadriceps muscle were analyzed. Assessments were performed before and after each intervention. The difference between groups was analyzed using the Friedman test, with a significance level of 0.05. Results: All statistical analyses were performed using SPSS Statistics version 26, with significance established at p<0.05 and normality checked with the Shapiro-Wilk test. Parametric data were presented as mean and standard deviation, non-parametric variables as median and interquartile range, and categorical variables as frequency and percentage. There was no clinical change in quadriceps spasticity assessed using the Ashworth scale for the 1 Hz (p=0.813) or 5 Hz (p=0.232) protocol for either limb. MEP latency time: in the 5 Hz protocol there was no significant change for the side contralateral to the stimulus from pre- to post-treatment (p>0.05), while on the ipsilateral side latency decreased by 0.07 seconds (p<0.05); in the 1 Hz protocol latency increased by 0.04 seconds (p<0.05) on the contralateral side and decreased by 0.04 seconds (p<0.05) on the ipsilateral side, with a significant difference between the contralateral (p=0.007) and ipsilateral (p=0.014) groups.
Central motor conduction time: in the 1 Hz protocol there was no change for either the contralateral (p>0.05) or the ipsilateral side (p>0.05). In the 5 Hz protocol there was a small decrease in conduction time on the contralateral side (p<0.05) and a decrease of 0.6 seconds on the ipsilateral side (p<0.05), with a significant difference between groups (p=0.019). Conclusion: A single high- or low-frequency session does not change spasticity, but it was observed that when the low-frequency protocol was performed, latency time increased on the stimulated side and decreased on the non-stimulated side, suggesting that inhibiting the motor cortex increases cortical excitability on the opposite side.
Keywords: multiple sclerosis, spasticity, motor evoked potential, transcranial magnetic stimulation
Procedia PDF Downloads 91
2409 Discourse Analysis of the Perception of ‘Safety’ in EU and Refugee Law
Authors: Klaudia Krogulec
Abstract:
The concept and meaning of safety are largely undermined in international and EU refugee law. While the Geneva Convention of 1951 concentrates mainly on the principle of non-refoulement (no return) and the idea of the physical safety of refugees, countries continue to implement harmful readmission agreements that presume 'safe countries' for the hosting and return of refugees. This research uses discourse analysis of the legal provisions, together with interviews with Syrian refugees, NGO workers and refugee lawyers in Turkey, to understand what 'safety' actually means and how law shapes the experiences of Syrians in Turkey (the country that hosts the largest population of Syrians and is a key partner in the EU-Turkey Agreement of 2016). The preliminary findings reveal competing meanings of safety (a rights-based versus a state-interests approach). As refugee policies continue to prioritize state interests and state safety over human safety and human rights, it is extremely important to provide recommendations on how 'safety' should be defined in refugee law in the future.
Keywords: human rights law, refugee law, human safety, EU-Turkey agreement
Procedia PDF Downloads 162
2408 The Correlation between Nasal Resistance and Obligatory Oronasal Switching Point in Non-Athletic Non-Smoking Healthy Men
Authors: Amir H. Bayat, Mohammad R. Alipour, Saeed Khamneh
Abstract:
As respiration via the nose is physiologically important, many studies have examined nasal breathing and its switch to oronasal breathing during exercise. The aim of this study was to assess the role of anterior nasal resistance as one of the factors affecting this switching. Twelve young, healthy, non-athletic and non-smoking male volunteers with normal BMI were selected after physical examination and participated in an exercise protocol that included measurement of ventilation, work load and the oronasal switching point (OSP) during exercise, and anterior rhinomanometry at rest. The protocol was an incremental exercise with a 25-watt increase in work load per minute until OSP occurrence. There was a significant negative correlation between resting total anterior nasal resistance and OSP, work load and ventilation (p<0.05, r=-0.709). Resting total anterior nasal resistance can therefore be considered an important factor in OSP occurrence, and reducing the resistance of the nasal passage may extend the tolerance of nasal respiration during exercise.
Keywords: anterior nasal resistance, exercise, OSP, ventilation, work load
Procedia PDF Downloads 403
2407 Validation of the Recovery of House Dust Mites from Fabrics by Means of Vacuum Sampling
Authors: A. Aljohani, D. Burke, D. Clarke, M. Gormally, M. Byrne, G. Fleming
Abstract:
Introduction: House dust mites (HDMs) are a source of allergen particles embedded in textiles and furnishings. Vacuum sampling is commonly used to recover HDMs and determine their abundance, but the efficiency of this method is less than standardized. Here, the efficiency of recovery of HDMs from home-associated textiles was evaluated using vacuum sampling protocols. Methods/Approach: Live mites (LMs) or dead mites (DMs) of the house dust mite Dermatophagoides pteronyssinus (FERA, UK) were separately seeded onto the surfaces of smooth cotton, denim and fleece (25 mites per 10x10 cm² square) and left for 10 minutes before vacuuming. Fabrics were vacuumed (SKC Flite 2 pump) at a flow rate of 14 L/min for 60, 90 or 120 seconds, and the number of mites retained by the filter unit (0.4 μm x 37 mm) was determined. Vacuuming was carried out in a linear direction (Protocol 1) or in a multidirectional pattern (Protocol 2). Additional fabrics with LMs were also frozen and then thawed, thereby euthanizing the live mites (now termed EMs). Results/Findings: While recovery was significantly greater (p=0.000; 76% greater) from fabrics seeded with DMs than with LMs irrespective of vacuuming protocol or fabric type, the efficiency of recovery of DMs (72%-76%) did not vary significantly between fabrics. For fabrics containing EMs, recovery was greatest for smooth cotton and denim (65%-73% recovered) and least for fleece (15% recovered). There was no significant difference (p=0.99) in the recovery of mites across all three mite categories from smooth cotton and denim, but significantly fewer (p=0.000) mites were recovered from fleece. Scanning electron microscopy images of HDM-seeded fabrics showed that live mites burrowed deeply into the fleece weave, which reduced their efficiency of recovery by vacuuming.
Research Implications: The results presented here have implications for the recovery of HDMs by vacuuming and for the choice of fabric to ameliorate HDM-dust sensitization.
Keywords: allergy, asthma, dead, fabric, fleece, live mites, sampling
Procedia PDF Downloads 141
2406 Prediction Study of the Structural, Elastic and Electronic Properties of the Parent and Martensitic Phases of Nonferrous Ti, Zr, and Hf Pure Metals
Authors: Tayeb Chihi, Messaoud Fatmi
Abstract:
We present calculations of the structural, elastic and electronic properties of the nonferrous pure metals Ti, Zr and Hf in both the parent and martensite phases, in the bcc and hcp structures respectively. The calculations are based on the generalized gradient approximation (GGA) within density functional theory (DFT). The shear modulus, Young's modulus and Poisson's ratio for Ti, Zr and Hf were calculated and compared with the corresponding experimental values. Using the elastic constants obtained from the GGA calculations, the bulk modulus along the crystallographic axes of the single crystals was calculated; this is in good agreement with experiment for Ti and Zr, whereas the result for the hcp structure of Hf is a prediction. At zero temperature and zero pressure, the bcc crystal structure is found to be mechanically unstable for Ti, Zr and Hf, while the hcp structure is correctly found to be stable at the equilibrium volume. In the electronic density of states (DOS), the smaller n(EF) is, the more stable the compound is, in agreement with the results obtained from the total energy minimum.
Keywords: Ti, Zr, Hf, pure metals, transformation, energy
Procedia PDF Downloads 355
2405 Performance Analysis of Bluetooth Low Energy Mesh Routing Algorithm in Case of Disaster Prediction
Authors: Asmir Gogic, Aljo Mujcic, Sandra Ibric, Nermin Suljanovic
Abstract:
The ubiquity of natural disasters during the last few decades has raised serious questions about the prediction of such events and human safety. Every disaster, regardless of its magnitude, has a precursor that is manifested as a disruption of some environmental parameter such as temperature, humidity, pressure or vibration. In order to anticipate and monitor those changes, in this paper we propose an overall system for disaster prediction and monitoring based on a wireless sensor network (WSN). Furthermore, we introduce a modified and simplified WSN routing protocol built on top of the trickle routing algorithm. The routing algorithm was deployed over the Bluetooth Low Energy protocol in order to achieve low power consumption. The performance of the WSN was analyzed using a real-life system implementation, and estimates of WSN parameters such as battery lifetime, network size and packet delay were determined. Based on this performance, the proposed system can be utilized for disaster monitoring and prediction due to its low power profile and mesh routing feature.
Keywords: bluetooth low energy, disaster prediction, mesh routing protocols, wireless sensor networks
Procedia PDF Downloads 388
2404 Determining the Constituents of the Sunnah of Prophet Muhammad (pbuh) in the Light of the Quran: A Clinical Approach
Authors: Aamir I. Yazdani, Dr. Muhammad Nasir J. Qureshi
Abstract:
The term Sunnah has been used in the Quran both for Allah Himself and for His messengers. The way Allah dealt with the people to whom the messengers (rasuls) were sent is called Sunnatullah by the Quran. Likewise, the same term is used in the Quran for Prophet Muhammad (pbuh), as in following the trodden path (Sunnah) of his forefather Prophet Abraham, Alaihissalam. This implies that the word Sunnah cannot be applied to things that relate to theoretical knowledge, such as faith; its ambit remains practices and actions linked to practical matters only. In the case of the Quran, there is complete agreement among all Muslims on what constitutes the book of Allah, based on ijma (unanimity, total agreement, consensus) and tawatur (uninterrupted continuity, without any gap). There seems to be no such unanimity on the question of what constitutes the Sunnah of Prophet Muhammad (pbuh), and there are therefore several approaches towards the Sunnah adopted by Muslims. This paper uses a qualitative methodology to determine a criterion for what constitutes the Sunnah of the Prophet Muhammad (pbuh) and which practices fall within its precincts.
Keywords: Al-hikmah, Hereafter, practices, Tazkiya
Procedia PDF Downloads 146
2403 Combating Fake News: A Qualitative Evidence Synthesis of Organizational Stakeholder Trust in Social Media Communication during Crisis
Authors: Todd R. Walton
Abstract:
Social media would seem to be an ideal mechanism for crisis communication, yet it has been met with varied results. Natural disasters, such as hurricanes, provide a slow-moving view of how social media can be leveraged to guide stakeholders and the public through a crisis, yet crisis communication managers have struggled to reach target audiences with credible messaging. This Qualitative Evidence Synthesis (QES) analyzed the findings of eight studies published in the last year to determine how organizations can effectively utilize social media for crisis communication. Additionally, the evidence was analyzed to identify strategies for establishing credibility in a medium fraught with misinformation. The studies indicated wide agreement on the use of multiple social media channels, in addition to frequent, accurate messaging, in order to establish credibility; they indicated mixed agreement on the use of text-based emergency notification systems. The findings of this QES will help crisis communication professionals plan for social media use in crisis communication.
Keywords: crisis communication, crisis management, emergency response, social media
Procedia PDF Downloads 210
2402 Theoretical and Computational Investigation of PCBM and PC71BM Derivatives using the DFT Method
Authors: Zair Mohammed El Amine, Chemouri Hafida, Derbal Habak Hassina
Abstract:
Organic photovoltaic cells are electronic devices that convert sunlight into electricity. The number of studies on organic photovoltaic cells (OVCs) is growing, and this trend is expected to continue; computational studies are still needed to verify and prove the capability of OVCs, specifically of the nanometre-scale molecule PCBM, building on successful experimental results. In this paper, we present a theoretical and computational investigation of PCBM and PC71BM derivatives using the DFT method, employing both time-independent and time-dependent density functional theories. The HOMO and LUMO energies, the HOMO-LUMO gap, the ionization potentials and the electron affinities are determined and found to be in agreement with experiment. Using DFT with the B3LYP and M06-2X functionals and the 6-31G(d,p) and 6-311G(d) basis sets, the calculations show that the most efficient acceptors belong to the group of PC71BM derivatives, in substantial agreement with experiment. The geometries of the structures were optimized with Gaussian 09.
Keywords: PCBM, P3HT, organic cell solar, DFT, TD-DFT
Procedia PDF Downloads 87
2401 A Comparison of Methods for Neural Network Aggregation
Authors: John Pomerat, Aviv Segev
Abstract:
Recently, deep learning has had many theoretical breakthroughs. For deep learning to be successful in industry, however, there need to be practical algorithms capable of handling the many real-world hiccups preventing the immediate application of a learning algorithm. Although AI promises to revolutionize the healthcare industry, getting access to patient data in order to train learning algorithms has not been easy. One proposed solution to this is data-sharing. In this paper, we propose an alternative protocol, based on multi-party computation, to train deep learning models while maintaining both the privacy and security of training data. We examine three methods of training neural networks in this way: transfer learning, average ensemble learning and series network learning. We compare these methods to the equivalent model obtained through data-sharing across two different experiments. Additionally, we address the security concerns of this protocol. While the motivating example is healthcare, our findings regarding multi-party computation of neural network training are purely theoretical and have use cases outside the domain of healthcare.
Keywords: neural network aggregation, multi-party computation, transfer learning, average ensemble learning
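Of the three methods listed, average ensemble learning is the simplest to sketch: each party trains a model on its own private data and only the models' outputs are combined, so raw training data never leaves its owner. A minimal illustration (not the authors' code):

```python
# Illustrative sketch of average ensemble learning: models trained
# privately by different parties are combined by averaging their
# per-class probability outputs at inference time.

def average_ensemble(models, x):
    """Average the probability vectors returned by independently trained
    models; each model is a callable mapping an input to a list of
    class probabilities."""
    outputs = [m(x) for m in models]
    n = len(outputs)
    return [sum(probs) / n for probs in zip(*outputs)]
```

For instance, three parties whose models output [0.8, 0.2], [0.6, 0.4] and [0.4, 0.6] for some input would yield the ensemble prediction [0.6, 0.4], without any party exposing its training set.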
Procedia PDF Downloads 163
2400 The Application of Gel Dosimeters and Comparison with Other Dosimeters in Radiotherapy: A Literature Review
Authors: Sujan Mahamud
Abstract:
Purpose: A major challenge in radiotherapy treatment is to deliver a precise dose of radiation to the tumor with a minimum dose to healthy normal tissues. Recently, gel dosimetry has emerged as a powerful tool to measure three-dimensional (3D) dose distributions for complex delivery verification and quality assurance. These dosimeters act both as a phantom and as a detector, confirming the versatility of the technique. The aim of the study is to review the application of gel dosimeters in radiotherapy and to compare them with 1D and 2D dosimeters. Methods and Materials: The study was carried out from the gel dosimetry literature. Secondary data and images were collected from different sources such as guidelines, books and the internet. Result: Analyzing, verifying and comparing data from the treatment planning system (TPS) showed that the gel dosimeter is an excellent tool for measuring 3D dose distributions. The TPS-calculated data were in very good agreement with the dose distribution measured by the ferrous gel. The overall uncertainty in the ferrous-gel dose determination was considerably reduced using an optimized MRI acquisition protocol and a new MRI scanner. The method developed for comparing measured gel data with calculated treatment plans, the gel dosimetry method, proved to be useful for radiation treatment planning verification. For the 1D and 2D film measurements, the RMSD values for the depth dose and lateral profiles were 1.8% and 2%, and the maximum (Di-Dj) values were 2.5% and 8%, respectively. For the 2D+ (3D) film-gel and plan-gel comparisons, RMSDstruct and RMSDstoch were 2.3% and 3.6%, and 1% and 1%, and the systematic deviations were -0.6% and 2.5%, respectively. The study thus found that the 2D+ (3D) film dosimeter performed better than the 1D and 2D dosimeters. Discussion: Gel dosimeters are quality control and quality assurance tools that will see future clinical application.
Keywords: gel dosimeters, phantom, RMSD, QC, detector
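As a hedged illustration of the RMSD figure of merit reported in this entry (normalizing to the prescribed dose is an assumption of the sketch, not a detail from the abstract), the deviation between a measured gel dose grid and the TPS-calculated one can be computed as:

```python
# Illustrative sketch: root-mean-square deviation between a measured
# dose distribution and a TPS-calculated one, expressed as a
# percentage of the prescribed dose.
import math

def rmsd_percent(measured, calculated, prescribed_dose):
    """RMSD of two flattened dose grids (same length, same units),
    in percent of the prescribed dose."""
    diffs = [(m - c) ** 2 for m, c in zip(measured, calculated)]
    return 100.0 * math.sqrt(sum(diffs) / len(diffs)) / prescribed_dose
```

For example, measured doses [2.0, 2.0] Gy against calculated [1.9, 2.1] Gy at a 2 Gy prescription give an RMSD of 5%.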
Procedia PDF Downloads 152
2399 The Debacle of the Social Pact: Finding a New Theoretical Framework for Egalitarian Justice
Authors: Abosede Priscilla Ipadeola
Abstract:
The quest for egalitarian justice requires a suitable theoretical foundation that can address the problems of marginalization and subjugation arising from various forms of oppression, such as sexism, racism, classism and others. Many thinkers and societies have appealed to contractarianism, a theory that has been widely regarded as a doctrine of egalitarianism by some political theorists for about five centuries. Despite its numerous criticisms, the social contract still enjoys a prominent status as a key theory of egalitarian justice. However, Pateman and Mills have contended that the contractarian approach legitimizes gender and racial inequalities by excluding and marginalizing women and people of color from the original agreement; the social contract is therefore incapable of generating or fostering equality. This study proposes postcontractarianism as a viable alternative to the social contract. Postcontractarianism argues that the basis for egalitarianism cannot be grounded on agreement but rather on understanding, drawing on Jorge Nef's idea of mutual vulnerability and on Obiri (an African theory of cosmology) to argue for the imperative of social equality.
Keywords: postcontractarianism, obiri, mutual vulnerability, egalitarianism, the social contract
Procedia PDF Downloads 60
2398 Curbing Cybercrime by Application of Internet Users’ Identification System (IUIS) in Nigeria
Authors: K. Alese Boniface, K. Adu Michael
Abstract:
Cybercrime is now becoming a big challenge in Nigeria, in addition to traditional crime, and the inability to identify perpetrators is one of the reasons for the growing menace. This paper proposes a design for monitoring internet users' activities in order to curb cybercrime. It requires redefining the operations of Internet Service Providers (ISPs), which would now mandate that users be authenticated before accessing the internet. In implementing this work, which can be adapted to a larger scale, a virtual router application was developed and configured to mimic a real router device, and a sign-up portal was developed to allow users to register with the ISP. The portal asks for identification information, including bio-data and government-issued identification data such as the National Identity Card number. A unique username and password are chosen by the user to enable access to the internet; these are used to link him to the Internet Protocol (IP) address of any system he uses on the internet, and thereby to associate him with any criminal act related to that IP address at that particular time. Questions such as "What happens when another user knows the password and uses it to commit a crime?" and other pertinent issues are addressed.
Keywords: cybercrime, sign-up portal, internet service provider (ISP), internet protocol address (IP address)
Procedia PDF Downloads 279
2397 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality
Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan
Abstract:
Currently, the content entertainment industry is dominated by mobile devices. As the trends slowly shift towards Augmented/Virtual Reality applications the computational demands on these devices are increasing exponentially and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing we can offload the work from mobile devices to dedicated rendering servers that are way more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve high-performance low latency Augmented/Virtual Reality experience. There are two parts to the protocol, 1) In-flight compression The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round trip time is directly proportional to the amount of data transmitted. This can therefore be reduced by compressing the frames before sending. Using some standard compression algorithms like JPEG can result in minor size reduction only. Since the images to be compressed are consecutive camera frames there won't be a lot of changes between two consecutive images. So inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL but the implementation of WebGL limits the precision of floating point numbers to 16bit in most devices. This can introduce noise to the image due to rounding errors, which will add up eventually. This can be solved using an improved interframe compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating point subtraction thereby cutting down on noise. The change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference. 
The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: conventional cloud computing architectures offload as much work as possible to the servers, but this approach increases bandwidth usage and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and on network conditions. The protocol is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to spread the load effectively and thereby scale horizontally, which is achieved by isolating client connections into different processes.
Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application
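The change-detection step described above can be sketched in Python/NumPy. This is only an illustrative sketch, not the authors' implementation: the kernel, threshold, and function names are assumptions. Pixels whose locally weighted difference exceeds a threshold are transmitted; all others are reused from the previous frame.

```python
import numpy as np

def weighted_diff(prev, curr, kernel):
    # per-pixel absolute difference, then a local weighted average
    # over the kernel neighbourhood (the "2D kernelling" step)
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).astype(np.float64)
    h, w = diff.shape
    kh, kw = kernel.shape
    padded = np.pad(diff, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.zeros_like(diff)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + h, j:j + w]
    return out

def compress(prev, curr, kernel, threshold=8.0):
    # transmit only pixel indices/values whose weighted difference
    # exceeds the threshold
    mask = weighted_diff(prev, curr, kernel) > threshold
    idx = np.flatnonzero(mask)
    return idx, curr.flat[idx]

def decompress(prev, idx, values):
    # unchanged pixels are reused from the previous frame
    frame = prev.copy()
    frame.flat[idx] = values
    return frame
```

Note that thresholding makes the scheme lossy for changes whose weighted difference stays below the threshold; the threshold trades bandwidth against fidelity.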
Procedia PDF Downloads 76
2396 Productivity of Construction Companies Using the Management of Threats and Opportunities in Construction Projects of Iran
Authors: Nima Amani, Ali Salehi Dastjerdi, Fatemeh Ahmadi, Ardalan Sabamehr
Abstract:
The cost overrun of construction projects has always been one of the main problems of construction companies, caused by the risky nature of construction projects; therefore, the application of risk management is today inevitable. Although in theory risk management is divided into opportunities management and threats management, in practice most projects focus on threats management. However, considering opportunities management and applying opportunity-response strategies can improve the profitability of companies' construction projects. In this paper, a new technique is developed to identify opportunities in construction projects using an improved protocol, and appropriate opportunity-response strategies are proposed to construction companies to provide them with higher profitability. To evaluate the effectiveness of the protocol for selecting the most appropriate strategies in response to opportunities and threats, two projects from a construction company in Iran were studied. Both projects were mid-range in size and similar in duration, run time, and costs. Finally, the results indicate that applying the proposed opportunity-response strategies can increase the company's profitability in similar future projects.
Keywords: opportunities management, risk-response strategy, opportunity-response strategy, productivity, risk management
Procedia PDF Downloads 230
2395 Intended Use of Genetically Modified Organisms, Advantages and Disadvantages
Authors: Pakize Ozlem Kurt Polat
Abstract:
A GMO (genetically modified organism) is the result of a laboratory process in which genes from the DNA of one species are extracted and artificially inserted into the genes of an unrelated plant or animal. This technology includes nucleic acid hybridization, recombinant DNA and RNA, PCR, cell culture, and gene cloning techniques. Studies are divided into three groups according to the trait transferred to the transgenic plant: about 59% concern herbicide resistance, 28% resistance to insects and viruses, and 13% quality characteristics. Not all transgenic crops are in commercial production; the main commercial crops are soybean, maize, canola, and cotton. The steadily growing interest in GMOs spans several areas: use in health (organ transplantation, gene therapy, vaccines, and drugs); use in industry (vitamins, monoclonal antibodies, vaccines, anti-cancer compounds, antioxidants, plastics, fibers, polyethers, human blood proteins, and carotenoids, as well as emulsifiers, sweeteners, enzymes, and food preservatives used as flavor enhancers or color changers); and use in agriculture (herbicide resistance; resistance to insects; resistance to viral, bacterial, and fungal diseases; extended shelf life; improved quality; tolerance of extreme conditions such as drought, salinity, and frost; and improved nutritional value and quality). We explain all these methods step by step in this research. GMOs have advantages and disadvantages, both of which we explain clearly in the full text; on this topic, researchers worldwide are divided into two camps. Some researchers hold that GMOs have many disadvantages and should not be used, while others hold the opposite view. When considering the law on GMOs, one should know the biosafety law of each country and union.
To minimize the biosafety problems caused by transgenic plants, 130 countries, including Turkey, signed the United Nations Biosafety Protocol on 24 May 2000. This protocol, the Cartagena Protocol on Biosafety, entered into force on September 11, 2003. It addresses the risks that GMOs in general use pose to human health, biodiversity, and sustainability, and covers the prevention and transit of the transboundary movement of all GMOs that may have such effects. Under this protocol, one must also know the US GMO regulations, the European Union GMO regulations, and the Turkey GMO regulations; these three regimes have different applications and rules. The world population is increasing day by day and agricultural fields are getting smaller, so to feed humans and animals we should improve agricultural product yield and quality. Scientists are trying to solve this problem, and one solution is molecular biotechnology, which includes GMO methods. Before deciding to support or oppose GMOs, one should know the GMO protocols and their effects.
Keywords: biotechnology, GMO (genetically modified organism), molecular marker
Procedia PDF Downloads 234
2394 Treadmill Negotiation: The Stagnation of the Israeli – Palestinian Peace Process
Authors: Itai Kohavi, Wojciech Nowiak
Abstract:
This article explores the stagnation of the Israeli-Palestinian peace negotiation process and the reasons behind the failure of more than 12 international initiatives to resolve the conflict. Twenty-seven top members of the Israeli national security elite (INSE) were interviewed, including heads of the negotiation teams, the National Security Council, the Mossad, and other intelligence and planning arms. The interviewees provided their insights on the Israeli challenges in reaching a sustainable and stable peace agreement and in dealing with the international pressure on Israel to negotiate a peace agreement while preventing anti-Israeli UN decisions and sanctions. The findings revealed a decision tree, with red herring deception strategies implemented to postpone the negotiation process and to delay major decisions during it. Beyond possible applications to the Israeli-Palestinian conflict, the findings shed more light on the phenomenon of rational deception of allies in a negotiation process, a subject less frequently researched than deception of rivals.
Keywords: deception, Israeli-Palestinian conflict, negotiation, red herring, terrorist state, treadmill negotiation
Procedia PDF Downloads 305
2393 Searchable Encryption in Cloud Storage
Authors: Ren Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, exactly retrieving the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest-neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs an incorrect keyword and the number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies security requirements for outsourced data storage.
Keywords: fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption
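The ranked-retrieval and fault-tolerance ideas can be illustrated, in plaintext and leaving out the encryption layer entirely, with a small Python sketch; the index layout, weights, and edit-distance threshold are assumptions for illustration, not the paper's construction.

```python
import heapq

def edit_distance(a, b):
    # Levenshtein distance, used here for fault-tolerant keyword matching
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

def ranked_search(index, query, top_k=3, max_edits=1):
    # index: {file_id: {keyword: relevance_score}}; query: {keyword: weight}
    scored = []
    for fid, kw_scores in index.items():
        rel = 0.0
        for q, w in query.items():
            for kw, s in kw_scores.items():
                # tolerate up to max_edits wrong letters in the query keyword
                if edit_distance(q, kw) <= max_edits:
                    rel += w * s
        if rel > 0:
            scored.append((rel, fid))
    # return file ids in descending order of relevance
    return [fid for rel, fid in heapq.nlargest(top_k, scored)]
```

In the actual scheme the scores and queries would be encrypted vectors compared via k-nearest-neighbor computation, so the server learns the ranking but not the keywords.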
Procedia PDF Downloads 383
2392 Effect of Radiotherapy/Chemotherapy Protocol on the Gut Microbiome in Pediatric Cancer Patients
Authors: Nourhan G. Sahly, Ahmed Moustafa, Mohamed S. Zaghloul, Tamer Z. Salem
Abstract:
The gut microbiome plays important roles in the human body that include, but are not limited to, digestion, immunity, homeostasis, and response to some drugs such as chemotherapy and immunotherapy. Its role has also been linked to radiotherapy and associated gastrointestinal injuries, where microbial dysbiosis could be the driving force for dose determination or the complete suspension of the treatment protocol. Linking gut microbiota alterations to different cancer treatment protocols is not easy, especially in humans; however, enormous effort has been exerted to understand this complex relationship. In the current study, we describe gut microbiota dysbiosis in pediatric sarcoma patients, in the pelvic region, with regard to radiotherapy and antibiotics. Fecal samples were collected as a source of microbial DNA, from which the V3-V5 regions of the 16S rRNA gene were sequenced. Two of the three patients under study experienced an increase in alpha diversity after exposure to 50.4 Gy. Although the overall relative abundance of the phylum Firmicutes generally decreased, six of its taxa increased in all patients. Our results may indicate radiosensitivity of, or enrichment of antibiotic resistance in, the elevated taxa. Further studies are needed to describe the extent of radiosensitivity with regard to antibiotic resistance.
Keywords: combined radiotherapy and chemotherapy, gut microbiome, pediatric cancer, radiosensitivity
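Alpha diversity, whose increase is reported above, is computed from per-taxon read counts. A minimal sketch using the Shannon index follows; note this is one of several common alpha-diversity metrics, and the abstract does not specify which one was used.

```python
import math

def shannon_alpha_diversity(counts):
    # Shannon index H = -sum(p_i * ln p_i) over taxon proportions p_i;
    # higher H means more taxa and/or a more even community
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)
```

An even community of four taxa scores ln 4 ≈ 1.386, while a sample dominated by a single taxon scores near zero, which is why post-radiotherapy shifts show up clearly in this metric.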
Procedia PDF Downloads 151
2391 Parametric Evaluation for the Optimization of Gastric Emptying Protocols Used in Health Care Institutions
Authors: Yakubu Adamu
Abstract:
The aim of this research was to assess the factors contributing to the need for optimisation of the gastric emptying protocols used in nuclear medicine and molecular imaging (SNMMI) procedures. The objective is to establish whether optimisation is possible and to provide supporting evidence for the current imaging protocols of the gastric emptying examination used in nuclear medicine. The research involved selected patients with 30 dynamic series, whose images were processed using ImageJ to obtain the calculated half-time and the retention fraction for the 60 × 1-minute, 5-minute, and 10-minute protocols and other sampling intervals. The gastric emptying clearance half-times of the study IDs were classified into normal, abnormal fast, and abnormal slow categories. In the normal category, representing 50% of the gastric emptying image IDs processed, the clearance half-time was within the range of 49.5 to 86.6 minutes of the mean counts. In the abnormal fast category, representing 30%, the clearance half-time fell between 21 and 43.3 minutes of the mean counts, and the abnormal slow category, representing 20%, had clearance half-times within the range of 138.6 to 138.6 minutes of the mean counts. The results indicated that the retention fraction values calculated from the 1-, 5-, and 10-minute sampling curves, like the measured gastric emptying retention fractions, showed a normal retention fraction of <60% that decreased exponentially with time, evident in low retention fraction ratios of <10% after 4 hours. Thus, the values from the sparser sampling curves did not change category, suggesting that they could feasibly be used instead of acquiring the full set of images. Findings from the study suggest that the current gastric emptying protocol can be optimized by acquiring fewer images.
The study recommends that gastric emptying studies be performed with imaging at a minimum of 0, 1, 2, and 4 hours after meal ingestion.
Keywords: gastric emptying, retention fraction, clearance halftime, optimisation, protocol
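The exponential decrease in retention fraction described above follows, under a simple monoexponential emptying model, from the clearance half-time. The model is an assumption for illustration; solid-meal studies often add a lag phase that this sketch omits.

```python
import math

def retention_fraction(t_minutes, half_time):
    # R(t) = exp(-ln 2 * t / T_half): fraction of the meal remaining at time t
    return math.exp(-math.log(2) * t_minutes / half_time)

def clearance_half_time(t_minutes, count_0, count_t):
    # invert the model from two (decay-corrected) gastric counts
    return math.log(2) * t_minutes / math.log(count_0 / count_t)
```

Under this model, a patient at the fast end of the normal range (T½ = 49.5 min) and one at the slow end (T½ = 86.6 min) can be distinguished from counts at just a few time points, which is consistent with the recommendation to image at 0, 1, 2, and 4 hours rather than acquiring a full dynamic series.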
Procedia PDF Downloads 10
2390 The Impact of Leadership Styles and Coordination on Employees Performance in the Nigerian Banking Sector
Authors: Temilola Akinbolade, Bukola Okunade, Karounwi Okunade
Abstract:
Leadership is a subject of direction. Direction entails ensuring that employees carry out the jobs assigned to them. In order to direct subordinates, a manager must lead, motivate, communicate, and ensure effective coordination of activities so that enterprise objectives are achieved. The purpose of the study was to find out the impact of leadership styles on employee performance, in a study of Wema Bank Plc. Leadership has been described as a tool for influencing people in order to get a particular task done willingly; the importance of leadership lies in followership, that is, the willingness of people to follow, which is what makes a person a leader. A sample size of 150 was systematically selected from the study population using the Statistical Package for the Social Sciences (SPSS) formula. On this basis, a questionnaire was designed and administered. Of the 150 copies of the questionnaire administered, 45 were discarded for improper filling and mutilation, while the remaining 105 were used for statistical analysis. Chi-square was employed in testing the hypothesis. The following findings were made in the course of the study: on whether leadership enhances employee performance, 85.7% of the respondents were in agreement, and on whether implementation of workers' social welfare packages enhances employee performance, 88.6% of the respondents were in agreement. Over the years, some leadership styles adopted by managers and administrators have affected the level of employee performance in the workplace, leading to inefficient and ineffective attainment of organizational goals and objectives. Because employees fail to perform to set standards, this research work also indicates ways through which high employee performance can be attained, especially with regard to the leadership style adopted by management, that is, managers and administrators.
It was also discovered that collective intelligence of employees leads to high employee performance, with 82.9% of the respondents in agreement.
Keywords: leadership, employees, performance, banking sector
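The chi-square hypothesis test mentioned above can be sketched with the reported figures. The split of the 105 usable responses into 90 in agreement (≈85.7%) and 15 not, and the uniform 50/50 expected frequencies under the null hypothesis, are illustrative assumptions, not the paper's exact contingency table.

```python
def chi_square_statistic(observed, expected):
    # Pearson's chi-square: sum over cells of (O - E)^2 / E
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# 90 of 105 respondents agree; null hypothesis: agreement is a 50/50 split
stat = chi_square_statistic([90, 15], [52.5, 52.5])
CRITICAL_05_DF1 = 3.841  # chi-square critical value for df=1, alpha=0.05
reject_null = stat > CRITICAL_05_DF1
```

Here the statistic (≈53.6) far exceeds the critical value, so the null hypothesis of no effect would be rejected, consistent with the study's conclusion that leadership style affects performance.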
Procedia PDF Downloads 244
2389 The Polarization on Twitter and COVID-19 Vaccination in Brazil
Authors: Giselda Cristina Ferreira, Carlos Alberto Kamienski, Ana Lígia Scott
Abstract:
The COVID-19 pandemic enhanced the anti-vaccination movement in Brazil, supported by unscientific theories and false news, and by the possibility of wide communication through social networks such as Twitter, Facebook, and YouTube. The World Health Organization (WHO) classified the large volume of information on the subject of COVID-19 as an infodemic. In this paper, we present a protocol to identify polarizing users (called polarizers) and study the profiles of Brazilian polarizers on Twitter (recently renamed X). We analyzed polarizing interactions on Twitter (in Portuguese) to identify the main polarizers and how the conflicts they caused influenced the COVID-19 vaccination rate throughout the pandemic. The protocol uses data from this social network, graph theory, and Java and R scripts to model and analyze the data. Information about the vaccination rate was obtained from OpenDataSus, a public government database. The results present the profiles of Twitter polarizers (political position, gender, professional activity, opinions on immunization). We observed that social and political events influenced the participation of these different profiles in conflicts and the vaccination rate.
Keywords: Twitter, polarization, vaccine, Brazil
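A first step of such a graph-based protocol, ranking candidate polarizers by how many conflict interactions they take part in, can be sketched in Python. The paper's actual graph-theoretic criteria are richer than plain degree centrality; the edge-list format and function names here are assumptions for illustration.

```python
from collections import defaultdict

def top_polarizers(interactions, k=2):
    # interactions: list of (source_user, target_user) reply/quote edges
    # in the conflict graph; rank users by total degree (ties broken by name)
    degree = defaultdict(int)
    for src, dst in interactions:
        degree[src] += 1
        degree[dst] += 1
    return sorted(degree, key=lambda u: (-degree[u], u))[:k]
```

In a fuller protocol, the candidates found this way would then be filtered by whether their neighborhoods split into opposing pro- and anti-vaccination camps.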
Procedia PDF Downloads 77
2388 The Impact of Study Abroad Experience on Interpreting Performance
Authors: Ruiyuan Wang, Jing Han, Bruno Di Biase, Mark Antoniou
Abstract:
The purpose of this study is to explore the relationship between working memory (WM) capacity and Chinese-English consecutive interpreting (CI) performance in interpreting learners with different study abroad experience (SAE), a relationship that is not well understood. The study also examines whether Chinese interpreting learners with SAE in English-speaking countries demonstrate better performance in inflectional morphology and agreement, notoriously unstable in Chinese speakers of L2 English, in their interpreting output than learners without SAE. Fifty Chinese university students majoring in Chinese-English interpreting were recruited in Australia (n=25) and China (n=25). The two groups were matched in age, language proficiency, and interpreting training period. The study abroad (SA) group had been studying in an English-speaking country (Australia) for over 12 months, and none of the students recruited in China (the no-study-abroad, NSA, group) had ever studied or lived in an English-speaking country. Data on language proficiency and training background were collected via a questionnaire; lexical retrieval performance and WM capacity data were collected experimentally; and finally, interpreting data were elicited via a direct CI task. The main results show that WM correlated significantly with participants' CI performance, independently of learning context. Moreover, SA learners outperformed NSA learners in subject-verb number agreement, and WM capacity also correlated significantly with their morphosyntactic accuracy. This paper sheds some light on the relationship between study abroad, WM capacity, and CI performance.
Exploring the effect of study abroad on interpreting trainees, and how various important factors correlate, may help interpreting educators bring forward more targeted teaching paradigms for participants with different learning experiences.
Keywords: study abroad experience, consecutive interpreting, working memory, inflectional agreement
Procedia PDF Downloads 100
2387 Microfluidic Plasmonic Bio-Sensing of Exosomes by Using a Gold Nano-Island Platform
Authors: Srinivas Bathini, Duraichelvan Raju, Simona Badilescu, Muthukumaran Packirisamy
Abstract:
A bio-sensing method, based on the plasmonic property of gold nano-islands, has been developed for the detection of exosomes in a clinical setting. The position of the gold plasmon band in the UV-Visible spectrum depends on the size and shape of the gold nanoparticles as well as on the surrounding environment. By adsorbing or binding various chemical entities, the gold plasmon band shifts toward longer wavelengths, and the shift is proportional to the concentration. Exosomes transport cargoes of molecules and genetic material to proximal and distal cells. Presently, the standard method for their isolation and quantification from body fluids is ultracentrifugation, which is not practical to implement in a clinical setting. Thus, a versatile and cutting-edge platform is required to selectively detect and isolate exosomes for further analysis at the clinical level. The new sensing protocol makes use of a specially synthesized polypeptide (Vn96) instead of antibodies to capture and quantify exosomes from different media by binding their heat shock proteins. The protocol was established and optimized on a glass substrate in order to facilitate the next stage, namely its transfer to a microfluidic environment. After each step of the protocol, the UV-Vis spectrum was recorded and the position of the gold Localized Surface Plasmon Resonance (LSPR) band was measured. The sensing process was modelled, taking into account the characteristics of the nano-island structure, prepared by thermal convection and annealing, and the optimal molar ratios of the most important chemical entities involved in the detection of exosomes were calculated as well. Indeed, it was found that the results of the sensing process depend on two major steps: the molar ratio of streptavidin to biotin-PEG-Vn96 and, in the final step, the capture of exosomes by the biotin-PEG-Vn96 complex.
The microfluidic device designed for sensing exosomes consists of a glass substrate sealed by a PDMS layer that contains the channel and a collecting chamber. In the device, solutions of linker, cross-linker, etc., are pumped over the gold nano-islands, and an Ocean Optics spectrometer is used to measure the position of the Au plasmon band at each step of the sensing. The experiments have shown that the shift of the Au LSPR band is proportional to the concentration of exosomes, and exosomes can thereby be accurately quantified. An important advantage of the method is the ability to discriminate between exosomes of different origins.
Keywords: exosomes, gold nano-islands, microfluidics, plasmonic biosensing
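Since the LSPR band shift is reported to be proportional to exosome concentration, quantification reduces to a linear calibration: fit shift against known concentrations, then invert the line for an unknown sample. A minimal sketch follows; the calibration points and function names are invented for illustration, not measured values from the study.

```python
def fit_calibration(concentrations, shifts_nm):
    # least-squares line: shift = slope * concentration + intercept
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(shifts_nm) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(concentrations, shifts_nm))
             / sum((x - mx) ** 2 for x in concentrations))
    return slope, my - slope * mx

def concentration_from_shift(shift_nm, slope, intercept):
    # invert the calibration to quantify an unknown sample
    return (shift_nm - intercept) / slope
```

The same fit also gives a sensitivity figure (nm of band shift per unit concentration), a standard way to compare plasmonic sensing platforms.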
Procedia PDF Downloads 174