Search results for: proxy server
164 Comparative Advantage of Mobile Agent Application in Procuring Software Products on the Internet
Authors: Michael K. Adu, Boniface K. Alese, Olumide S. Ogunnusi
Abstract:
This paper brings to the fore the inherent advantages of applying mobile agents to procure software products rather than downloading software content over the Internet. It proposes a system whereby the products come on compact disk with a mobile agent as a deliverable. The client/user purchases a software product but must connect to the remote server of the software developer before installation. The user provides an activation code that activates the mobile agent, which is part of the software product on the compact disk. The validity of the activation code is checked on connection at the developer's end to ascertain authenticity and prevent piracy. The system is evaluated by downloading two different software products and comparing this with installing the same products from compact disk using the mobile agent approach. Downloading software content from the developer's database, as in the traditional method, requires a continuously open connection between the client and the developer's end; a fixed network is not economically or technically feasible. A mobile agent, after being dispatched into the network, becomes independent of the creating process and can operate asynchronously and autonomously. It can reconnect later after completing its task and return to deliver results. Response time and network load are minimal with the mobile agent approach.
Keywords: software products, software developer, internet, activation code, mobile agent
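The activation-code check at the developer's server can be sketched as a keyed-hash verification; the HMAC construction, master key, and product IDs below are illustrative assumptions, not the authors' implementation.

```python
import hmac
import hashlib

SECRET = b"developer-master-key"  # held only at the developer's server (assumed)

def issue_activation_code(product_id: str) -> str:
    """Derive an activation code for a product shipped on compact disk."""
    return hmac.new(SECRET, product_id.encode(), hashlib.sha256).hexdigest()[:16]

def verify_activation_code(product_id: str, code: str) -> bool:
    """Server-side check performed when the mobile agent connects."""
    expected = issue_activation_code(product_id)
    return hmac.compare_digest(expected, code)

code = issue_activation_code("PROD-0001")
print(verify_activation_code("PROD-0001", code))   # genuine copy -> True
print(verify_activation_code("PROD-0002", code))   # mismatched copy -> False
```

`hmac.compare_digest` is used so the comparison runs in constant time, which avoids leaking how many leading characters of a guessed code are correct.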
Procedia PDF Downloads 312
163 Statistical Analysis with Prediction Models of User Satisfaction in Software Project Factors
Authors: Katawut Kaewbanjong
Abstract:
We analyzed a volume of data and found factors significantly associated with user satisfaction in software projects. A statistical significance analysis (logistic regression) and a collinearity analysis determined the significant factors from a group of 71 pre-defined factors across 191 software projects in ISBSG Release 12. Eight prediction models were used to test the predictive potential of these factors: neural network, k-NN, naïve Bayes, random forest, decision tree, gradient boosted tree, linear regression, and logistic regression. Fifteen pre-defined factors were truly significant in predicting user satisfaction, and they provided 82.71% prediction accuracy when used with a neural network prediction model. These factors were client-server, personnel changes, total defects delivered, project inactive time, industry sector, application type, development type, how methodology was acquired, development techniques, decision making process, intended market, size estimate approach, size estimate method, cost recording method, and effort estimate method. These findings may benefit software development managers considerably.
Keywords: prediction model, statistical analysis, software project, user satisfaction factor
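As a minimal illustration of the logistic-regression analysis the abstract mentions, the sketch below fits a one-factor logistic model with plain batch gradient descent; the toy data, learning rate, and epoch count are assumptions for illustration, not the ISBSG dataset or the authors' model.

```python
import math

def train_logistic(X, y, lr=0.5, epochs=500):
    """Plain batch gradient descent for logistic regression (no libraries)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted satisfaction probability
            err = p - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gwj / len(X) for wj, gwj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z >= 0 else 0

# Toy data: one standardized project factor that drives satisfaction (0/1)
X = [[-1.0], [-0.8], [-0.6], [0.6], [0.8], [1.0]]
y = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(X, y)
acc = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
print(acc)  # perfectly separable toy data -> 1.0
```

A positive fitted weight would indicate the factor raises the odds of satisfaction; in the paper's setting, significance testing of such coefficients is what filters the 71 candidate factors down to 15.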
162 Computer Assisted Learning Module (CALM) for Consumer Electronics Servicing
Authors: Edicio M. Faller
Abstract:
The use of technology in the delivery of teaching and learning is vital nowadays, especially in education. Computer Assisted Learning Module (CALM) software is the use of a computer in the delivery of instruction, with a program tailored to a specific lesson or set of topics. The CALM software developed in this study is intended to supplement traditional teaching methods in technical-vocational (TECH-VOC) instruction, specifically the Consumer Electronics Servicing course. There are three specific objectives of this study. The first is to create learning enhancement and review materials for the selected lessons. The second is to computerize the end-of-chapter quizzes. The third is to generate a computerized mock exam and summative assessment. In order to attain the objectives of the study, the researcher adopted the Agile model, in which development underwent the iterative and incremental process of the software development life cycle. The study conducted acceptance testing using a survey questionnaire to evaluate the CALM software. The results showed that the CALM software was generally rated very satisfactory. To further improve the CALM software, it is recommended that the program be updated, enhanced, and, lastly, converted from stand-alone to a client/server architecture.
Keywords: computer assisted learning module, software development life cycle, computerized mock exam, consumer electronics servicing
161 CanVis: Towards a Web Platform for Cancer Progression Tree Analysis
Authors: Michael Aupetit, Mahmoud Al-ismail, Khaled Mohamed
Abstract:
Cancer is a major public health problem all over the world. Breast cancer has the highest incidence rate of all cancers for women in Qatar, making its study a top priority of the country. Human cancer is a dynamic disease that develops over an extended period through the accumulation of a series of genetic alterations. A Darwinian process drives the tumor cells toward higher malignancy, growing the branches of a progression tree in gene-expression space. Although it is not possible to track these genetic alterations dynamically for one patient, it is possible to reconstruct the progression tree from the aggregation of thousands of tumor cells' genetic profiles from thousands of different patients at different stages of the disease. Analyzing the progression tree is a way to detect pivotal molecular events that drive the malignant evolution and to provide a guide for the development of cancer diagnostics, prognostics, and targeted therapeutics. In this work, we present the development of a visual analytic web platform, CanVis, enabling users to upload gene-expression data and analyze their progression tree. The server computes the progression tree based on state-of-the-art techniques and allows an interactive visual exploration of this tree and the gene-expression data along its branching structure, helping to discover potential driver genes.
Keywords: breast cancer, progression tree, visual analytics, web platform
160 Post Apartheid Language Positionality and Policy: Student Teachers' Narratives from Teaching Practicum
Authors: Thelma Mort
Abstract:
This empirical, qualitative study uses interviews with four intermediate-phase English language student teachers at one university in South Africa and explores student teacher learning during the teaching practicum in the penultimate year of the initial teacher education course. The country's post-apartheid language-in-education policy provides one context for this study: children move from mother-tongue instruction in the foundation phase to English as the language of instruction in the intermediate phase. Another layer of context is the school setting; the student teachers' reflections come from teaching practicums in resource-constrained schools, which make up more than 75% of schools in South Africa. The findings were that in these schools deep biases existed against local languages, that language was being used as a proxy for social class, and that conditions necessary for language acquisition were absent. The student teachers' attitudes contrasted with those found in the schools: they took various pragmatic approaches to overcoming obstacles and saw language as enabling interdisciplinary work. This study describes language issues and tensions created by policy in South African schools and also supplies a regional account of learning to teach in resource-constrained schools in Cape Town, where such language tensions are more inflated. The central findings illuminate attitudes to language and language education in these teaching practicum schools and the complexity of learning to be a language teacher in these contexts. This study is one of the few local empirical studies on language teaching in the classroom and language teacher education; as such, it offers some background to the country's poor performance in both international and national literacy assessments.
Keywords: language teaching, narrative, post apartheid, South Africa, student teacher
159 Study on Resource Allocation of Cloud Operating System Based on Multi-Tenant Data Resource Sharing Technology
Authors: Lin Yunuo, Seow Xing Quan, Burra Venkata Durga Kumar
Abstract:
In this modern era, the cloud operating system is a worldwide trend applied in various industries such as business, healthcare, etc. In order to deal with the large volume of requirements in cloud computing, researchers came up with multi-tenant cloud computing to maximize the benefits for server providers and clients. However, there are still issues in multi-tenant cloud computing, especially regarding resource allocation. Issues such as inefficient resource utilization, large latency, lack of scalability and elasticity, and poor data isolation have caused inefficient resource allocation in multi-tenant cloud computing. Without a doubt, these issues prevent multitenancy from reaching its best condition. In fact, multiple studies have been conducted to determine the optimal resource allocation to solve these problems. This article briefly introduces the cloud operating system, multi-tenant cloud computing, and resource allocation in cloud computing. It then discusses resource allocation in multi-tenant cloud computing and the current challenges it faces. Regarding the issue of inefficient resource utilization, we discuss an efficient dynamic scheduling technique for multitenancy, namely the Multi-tenant Dynamic Resource Scheduling Model (MTDRSM). Moreover, the paper's final section offers some recommendations to improve the shortcomings of this model.
Keywords: cloud computing, cloud operating system, multitenancy, resource allocation, utilization of cloud resources
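The abstract does not spell out MTDRSM's mechanics, so the sketch below illustrates the general idea of dynamic multi-tenant scheduling only: each tenant request is assigned to the currently least-loaded server to keep utilization balanced. The server count and demand figures are invented for illustration.

```python
import heapq

def schedule(requests, n_servers):
    """Greedy dynamic scheduling: each tenant request goes to the currently
    least-loaded server, keeping utilization balanced across the pool."""
    heap = [(0.0, sid) for sid in range(n_servers)]  # (load, server id)
    heapq.heapify(heap)
    placement = {}
    for tenant, demand in requests:
        load, sid = heapq.heappop(heap)          # least-loaded server
        placement.setdefault(sid, []).append(tenant)
        heapq.heappush(heap, (load + demand, sid))
    return placement, sorted(load for load, _ in heap)

requests = [("t1", 4), ("t2", 2), ("t3", 3), ("t4", 1), ("t5", 2)]
placement, loads = schedule(requests, n_servers=2)
print(placement)  # which tenants landed on which server
print(loads)      # final per-server load, sorted
```

A heap keeps the "pick the least-loaded server" step at O(log n) per request, which matters when the tenant stream is large; a production scheduler would additionally model release of resources and isolation constraints.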
158 Using Surface Entropy Reduction to Improve the Crystallization Properties of a Recombinant Antibody Fragment RNA Crystallization Chaperone
Authors: Christina Roman, Deepak Koirala, Joseph A. Piccirilli
Abstract:
Phage-displayed synthetic Fab libraries have been used to obtain Fabs that bind specific RNA targets with high affinity and specificity. These Fabs have been demonstrated to facilitate RNA crystallization. However, the antibody framework used in the construction of these phage display libraries contains numerous bulky, flexible, and charged residues, which promote solubility and hinder aggregation. These residues can interfere with crystallization because of the entropic cost of burying them within crystal contacts. To systematically reduce the surface entropy of the Fabs and improve their crystallization properties, a protein engineering strategy termed surface entropy reduction (SER) is being applied to the Fab framework. In this approach, high-entropy residues are mutated to smaller ones such as alanine or serine. Focusing initially on Fab BL3-6, which binds an RNA AAACA pentaloop with 20 nM affinity, the SERp server (http://services.mbi.ucla.edu/SER/) was used and analysis was performed on existing RNA-Fab BL3-6 co-crystal structures. From this analysis, twelve surface entropy reduced mutants were designed. These SER mutants were expressed and are now being assessed for their crystallization and diffraction performance with various RNA targets. So far, one mutant has yielded 3.02 angstrom diffraction with the yjdF riboswitch RNA. Ultimately, the most productive mutations will be combined into a new Fab framework to be used in an optimized phage-displayed Fab library.
Keywords: antibody fragment, crystallography, RNA, surface entropy reduction
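The core SER operation, mutating high-entropy surface residues to alanine, can be sketched as simple sequence editing; the sequence, positions, and residue set below are illustrative placeholders, not Fab BL3-6 or the SERp server's actual output.

```python
def apply_ser(sequence, surface_positions, high_entropy="KREQ", replacement="A"):
    """Mutate high-entropy surface residues (Lys/Arg/Glu/Gln here) to alanine.
    Positions are 0-based indices predicted to be surface-exposed; mutation
    labels use the conventional 1-based numbering (e.g. K2A)."""
    seq = list(sequence)
    mutations = []
    for pos in surface_positions:
        if seq[pos] in high_entropy:
            mutations.append(f"{seq[pos]}{pos + 1}{replacement}")
            seq[pos] = replacement
    return "".join(seq), mutations

# Placeholder 10-residue stretch with four exposed high-entropy residues
mutant, muts = apply_ser("MKTAYEAKQG", surface_positions=[1, 5, 7, 8])
print(mutant)  # MATAYAAAAG
print(muts)    # ['K2A', 'E6A', 'K8A', 'Q9A']
```

In practice the SERp server clusters candidate positions and scores them, so only a subset of exposed lysines/glutamates would actually be mutated; the sketch only shows the mechanical residue replacement.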
157 An Assessment into the Drift in Direction of International Migration of Labor: Changing Aspirations for Religiosity and Cultural Assimilation
Authors: Syed Toqueer Akhter, Rabia Zulfiqar
Abstract:
This paper attempts to trace the determining factors, as far as individual preferences and expectations are concerned, that cause the direction of international migration to drift in certain ways owing to factors such as religiosity and cultural assimilation. The narrative on migration has graduated from the age-old 'push/pull' debate to one of complex factors that may vary across individuals. We explore the longstanding factor of religiosity, widely acknowledged in the literature as a key variable in the assessment of migration, wherein the impact of religiosity in the form of a drift in the intent of migration has been analyzed. A more conventional factor, cultural assimilation, is used in a contemporary way to estimate how it plays a role in affecting the drift in direction. In particular, our research aims to isolate the effect our key variables, cultural assimilation and religiosity, have on the direction of migration, to explore how they interplay as a composite unit, and to justify the change in behavior displayed by these key variables. In order to establish a true sense of what drives individual choices, we employ survey research and use a questionnaire to conduct primary research. The questionnaire was divided into six sections covering subjects including household characteristics and the perceptions and inclinations of the respondents relevant to our study. Religiosity was quantified using a proxy of migration network that utilized secondary data to estimate religious hubs in recipient countries. To estimate the relationship between the intent of migration and its variants, three competing econometric models were employed: the ordered probit model, the ordered logit model, and the Tobit model. For every model that included our key variables, a highly significant relationship with the intent of migration was estimated.
Keywords: international migration, drift in direction, cultural assimilation, religiosity, ordered probit model
156 On the Design of a Secure Two-Party Authentication Scheme for Internet of Things Using Cancelable Biometrics and Physically Unclonable Functions
Authors: Behnam Zahednejad, Saeed Kosari
Abstract:
Widespread deployment of the Internet of Things (IoT) has raised security and privacy issues in this environment. Designing a secure two-factor authentication scheme between the user and the server is still a challenging task. In this paper, we focus on Cancelable Biometrics (CB) as an authentication factor in IoT. We show that previous CB-based schemes fail to provide real two-factor security and Perfect Forward Secrecy (PFS) and suffer from database attacks and traceability of the user. We then propose our improved scheme based on CB and Physically Unclonable Functions (PUF), which can provide real two-factor security, PFS, user unlinkability, and resistance to database attack. In addition, Key Compromise Impersonation (KCI) resilience is achieved in our scheme. We also prove the security of our proposed scheme formally using both the Real-Or-Random (RoR) model and the ProVerif analysis tool. Regarding the usability of our scheme, we conducted a performance analysis and showed that our scheme has the lowest communication cost compared to previous CB-based schemes. The computational cost of our scheme is also acceptable for the IoT environment.
Keywords: IoT, two-factor security, cancelable biometric, key compromise impersonation resilience, perfect forward secrecy, database attack, real-or-random model, ProVerif
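A PUF-based challenge-response round can be sketched as below, with the PUF simulated as a device-unique keyed function; a real PUF derives its responses from physical manufacturing variation and stores no key at all. This is a generic illustration of the enrollment/authentication pattern, not the paper's protocol.

```python
import hmac
import hashlib
import os

class SimulatedPUF:
    """Stand-in for a hardware PUF: a device-unique mapping from challenges
    to responses. Here it is faked with a random per-device secret."""
    def __init__(self):
        self._device_secret = os.urandom(32)

    def response(self, challenge: bytes) -> bytes:
        return hmac.new(self._device_secret, challenge, hashlib.sha256).digest()

# Enrollment: the server records challenge-response pairs (CRPs) for the device.
device = SimulatedPUF()
challenge = os.urandom(16)
server_crp_table = {challenge: device.response(challenge)}

# Authentication: the server issues an enrolled challenge; only the genuine
# device, with its physical PUF, can reproduce the enrolled response.
ok = hmac.compare_digest(server_crp_table[challenge], device.response(challenge))
print(ok)  # True

clone = SimulatedPUF()  # different "silicon", so different responses
print(hmac.compare_digest(server_crp_table[challenge],
                          clone.response(challenge)))  # False
```

Because each challenge is ideally used once, a stolen CRP table entry cannot be replayed against a fresh challenge, which is part of what gives PUF schemes their resistance to database attacks.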
155 Synthetic Method of Contextual Knowledge Extraction
Authors: Olga Kononova, Sergey Lyapin
Abstract:
The requirements of the global information society are transparency and reliability of data, as well as the ability to manage information resources independently, particularly to search for, analyze, and evaluate information, thereby obtaining new expertise. Moreover, satisfying society's information needs increases the efficiency of enterprise management and public administration. The study of structurally organized thematic and semantic contexts of different types, automatically extracted from unstructured data, is one of the important tasks for the application of information technologies in education, science, culture, governance, and business. The objectives of this study are the typologization of contextual knowledge and the selection or creation of effective tools for extracting and analyzing contextual knowledge. Explication of various kinds and forms of contextual knowledge involves the development and use of full-text search information systems. For implementation purposes, the authors use the services of the e-library 'Humanitariana', such as contextual search, different types of queries (paragraph-oriented query, frequency-ranked query), and automatic extraction of knowledge from scientific texts. The multifunctional e-library 'Humanitariana' is realized in an Internet architecture in a WWS configuration (Web-browser / Web-server / SQL-server). An advantage of using 'Humanitariana' is the possibility of combining the resources of several organizations. Scholars and research groups may work in a local network mode and in distributed IT environments, with the ability to draw on the server resources of any participating organization. The paper discusses some specific cases of contextual knowledge explication with the use of the e-library services and focuses on the possibilities of new types of contextual knowledge. The experimental research base is science texts about 'e-government' and 'computer games'. An analysis of trends in the subject-themed texts allowed the authors to propose a content analysis methodology that combines full-text search with automatic construction of a 'terminogramma' and expert analysis of the selected contexts. A 'terminogramma' is laid out as a table that contains a column with a frequency-ranked list of words (nouns), as well as columns indicating the absolute frequency (count) and the relative frequency of occurrence of the word (in % or ppm). The analysis of the 'e-government' materials showed that the state takes a dominant position in the processes of electronic interaction between the authorities and society in modern Russia. The media credited the main role in these processes to the government, which provided public services through specialized portals. Factor analysis revealed two factors statistically describing the terms used: human interaction (the user) and the state (government, process organizer); interaction management (public officer, process performer) and technology (infrastructure). Isolation of these factors will lead to changes in the model of electronic interaction between government and society. In this study, the dominant social problems and the prevalence of different categories of subjects of computer gaming in science papers from 2005 to 2015 were also identified. Several types of contextual knowledge are thereby identified: micro context; macro context; dynamic context; thematic collection of queries (interactive contextual knowledge expanding the composition of e-library information resources); and multimodal context (functional integration of iconographic and full-text resources through a hybrid quasi-semantic search algorithm). Further studies can be pursued both in terms of expanding the resource base on which they are held and in terms of the development of appropriate tools.
Keywords: contextual knowledge, contextual search, e-library services, frequency-ranked query, paragraph-oriented query, technologies of the contextual knowledge extraction
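The 'terminogramma' table, a frequency-ranked word list with absolute and relative frequencies, can be sketched as below; the sample text is invented, and the sketch ranks all words rather than restricting to nouns as the authors' methodology does.

```python
from collections import Counter
import re

def terminogramma(text, top=5):
    """Frequency-ranked word list with absolute count and relative frequency
    (in percent), in the spirit of the 'terminogramma' table described above."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return [(word, n, round(100.0 * n / total, 1))
            for word, n in counts.most_common(top)]

sample = ("government portal provides services services government "
          "portal government users")
for word, absolute, relative in terminogramma(sample, top=3):
    print(word, absolute, relative)
```

On the sample text this prints `government 3 33.3`, then `portal 2 22.2` and `services 2 22.2`; in the authors' workflow, such a ranked table would then be handed to an expert for analysis of the selected contexts.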
154 Nonlinear Multivariable Analysis of CO2 Emissions in China
Authors: Hsiao-Tien Pao, Yi-Ying Li, Hsin-Chia Fu
Abstract:
This paper addressed the impacts of energy consumption, economic growth, financial development, and population size on environmental degradation using grey relational analysis (GRA) for China, where foreign direct investment (FDI) inflow is the proxy variable for financial development. More recent historical data, covering the period 2004-2011, are used because the use of very old data may not be suitable for rapidly developing countries. The results of the GRA indicate that the linkage effects of energy consumption-emissions and GDP-emissions are ranked first and second, respectively. These reveal that energy consumption and economic growth are strongly correlated with emissions. Higher economic growth requires more energy consumption and increases environmental pollution. Likewise, more efficient energy use needs a higher level of economic development. Therefore, policies to improve energy efficiency and create a low-carbon economy can reduce emissions without hurting economic growth. The FDI-emissions linkage is ranked third. This indicates that China does not apply weak environmental regulations to attract inward FDI. Furthermore, China's government should strengthen environmental policy when attracting inward FDI. The population-emissions linkage effect is ranked fourth, implying that population size does not directly affect CO2 emissions, even though China has the world's largest population and Chinese people are very economical in their use of energy-related products. Overall, energy conservation, improving efficiency, managing demand, and financial development, which aim at curtailing the waste of energy and reducing both energy consumption and emissions without loss of the country's competitiveness, can be adopted by developing economies. The GRA is one of the best ways to build a dynamic analysis model from limited data.
Keywords: China, CO₂ emissions, foreign direct investment, grey relational analysis
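Deng's grey relational grade, which underlies the GRA ranking described above, can be computed as in the sketch below: normalize each series, take deviation sequences against the reference, convert them to grey relational coefficients, and average. The series here are invented toy data, not the paper's 2004-2011 observations.

```python
def grey_relational_grades(reference, series, rho=0.5):
    """Deng's grey relational analysis: min-max normalize, compute deviation
    sequences, then grey relational coefficients and their mean (the grade)."""
    def normalize(seq):
        lo, hi = min(seq), max(seq)
        return [(x - lo) / (hi - lo) for x in seq]

    ref = normalize(reference)
    norm = [normalize(s) for s in series]
    deltas = [[abs(r - x) for r, x in zip(ref, s)] for s in norm]
    dmin = min(min(d) for d in deltas)
    dmax = max(max(d) for d in deltas)
    grades = []
    for d in deltas:
        coeffs = [(dmin + rho * dmax) / (dk + rho * dmax) for dk in d]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Illustrative series: emissions as reference, compared against an energy
# series that is an exact scaled copy (perfect association) and a near-linear
# population series (weaker association).
emissions  = [5.0, 5.6, 6.2, 7.0, 7.5]
energy     = [2.0, 2.24, 2.48, 2.80, 3.00]
population = [1.30, 1.31, 1.32, 1.33, 1.34]
g_energy, g_pop = grey_relational_grades(emissions, [energy, population])
print(round(g_energy, 2), round(g_pop, 2))
```

The series whose normalized shape tracks the reference most closely gets the highest grade, which is exactly how the paper ranks the energy, GDP, FDI, and population linkages against emissions.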
153 Entry Inhibitors Are Less Effective at Preventing Cell-Associated HIV-2 Infection than HIV-1
Authors: A. R. Diniz, P. Borrego, I. Bártolo, N. Taveira
Abstract:
Cell-to-cell transmission plays a critical role in the spread of HIV-1 infection in vitro and in vivo. Inhibition of HIV-1 cell-associated infection by antiretroviral drugs and neutralizing antibodies (NAbs) is more difficult compared to cell-free infection. Limited data exist on cell-associated infection by HIV-2 and its inhibition. In this work, we determined the ability of entry inhibitors to inhibit HIV-1 and HIV-2 cell-to-cell fusion as a proxy for cell-associated infection. We developed a method in which HeLa-CD4 cells are first transfected with a Tat-expressing plasmid (pcDNA3.1+/Tat101) and infected with recombinant vaccinia viruses expressing either the HIV-1 (vPE16: from isolate HTLV-IIIB, clone BH8, X4 tropism) or HIV-2 (vSC50: from HIV-2SBL/ISY, R5 and X4 tropism) envelope glycoproteins (M.O.I. = 1 PFU/cell). These cells are added to TZM-bl cells. When cell-to-cell fusion (syncytia) occurs, the Tat protein diffuses to the TZM-bl cells, activating the expression of a reporter gene (luciferase). We tested several entry inhibitors, including the fusion inhibitors T1249, T20, and P3, the CCR5 antagonists MVC and TAK-779, the CXCR4 antagonist AMD3100, and several HIV-2 neutralizing antibodies (NAbs). All compounds inhibited HIV-1 and HIV-2 cell fusion, albeit to different degrees. The maximum percentage of HIV-2 inhibition (MPI) was higher for fusion inhibitors (T1249: 99.8%; P3: 95%; T20: 90%), followed by co-receptor antagonists (MVC: 63%; TAK-779: 55%; AMD3100: 45%). NAbs from HIV-2-infected patients did not prevent cell fusion up to the tested concentration of 4 μg/ml. As for HIV-1, MPI reached 100% with TAK-779 and T1249. For the other antivirals, MPIs were: P3, 79%; T20, 75%; AMD3100, 61%; MVC, 65%. These results are consistent with published data. Maraviroc had the lowest IC50 for both HIV-2 and HIV-1 (IC50: HIV-2 = 0.06 μM; HIV-1 = 0.0076 μM). The highest IC50 values were observed with T20 for HIV-2 (3.86 μM) and with TAK-779 for HIV-1 (12.64 μM). Overall, our results show that entry inhibitors in clinical use are less effective at preventing Env-mediated cell-to-cell fusion in HIV-2 than in HIV-1, which suggests that cell-associated HIV-2 infection will be more difficult to inhibit compared to HIV-1. The method described here will be useful to screen for new HIV entry inhibitors.
Keywords: cell-to-cell fusion, entry inhibitors, HIV, NAbs, vaccinia virus
152 The Impact of Financial Risk on Banks' Financial Performance: A Comparative Study of Islamic Banks and Conventional Banks in Pakistan
Authors: Mohammad Yousaf Safi Mohibullah Afghan
Abstract:
This study of Islamic and conventional banks scrutinizes the effects of the risks connected with credit and liquidity on the productivity performance of Islamic and conventional banks operating in Pakistan. Among the banks, 4 Islamic and 18 conventional banks have been selected to enrich the result of our study on Islamic banks' performance in comparison to conventional banks. The selection of the banks for the panel is based on collecting quarterly unbalanced data ranging from the first quarter of 2007 to the last quarter of 2017. The data are collected from the banks' websites and the State Bank of Pakistan. The empirical results are obtained using the Delta-method test. In the study, return on assets and return on equity have been the major factors used as significant proxies in determining the profitability of the banks. Moreover, two other major proxies are used in measuring credit and liquidity risks: the ratio of loan loss provision to total loans and the ratio of liquid assets to total liabilities. Meanwhile, in line with the previous literature, some other variables such as bank size, bank capital, bank branches, and bank employees have been used to tentatively control for the impact of those factors whose direct and indirect effects on profitability are understood. In conclusion, the study emphasizes that credit risk affects return on assets and return on equity positively, and there is no significant difference in terms of credit risk between Islamic and conventional banks. Similarly, liquidity risk has a significant impact on a bank's profitability, though the marginal effect of liquidity risk is higher for Islamic banks than for conventional banks.
Keywords: Islamic and conventional banks, performance, return on equity, return on assets, Pakistan banking sector, profitability
151 Mining the Proteome of Fusobacterium nucleatum for Potential Therapeutics Discovery
Authors: Abdul Musaweer Habib, Habibul Hasan Mazumder, Saiful Islam, Sohel Sikder, Omar Faruk Sikder
Abstract:
The plethora of genome sequence information on bacteria in recent times has ushered in many novel strategies for antibacterial drug discovery and has enabled medical science to take up the challenge of the increasing resistance of pathogenic bacteria to current antibiotics. In this study, we adopted a subtractive genomics approach to analyze the whole genome sequence of Fusobacterium nucleatum, a human oral pathogen associated with colorectal cancer. Our study identified 1499 proteins of Fusobacterium nucleatum that have no homolog in the human genome. These proteins were screened further using the Database of Essential Genes (DEG), which resulted in the identification of 32 vitally important proteins for the bacterium. Subsequent analysis of the identified pivotal proteins, using the KEGG Automatic Annotation Server (KAAS), resulted in sorting out 3 key enzymes of F. nucleatum that may be good candidates as potential drug targets, since they are unique to the bacterium and absent in humans. In addition, we have demonstrated the 3-D structures of these three proteins. Finally, determination of the ligand-binding sites of the key proteins, as well as screening for functional inhibitors that best fitted the ligand sites, was conducted to discover effective novel therapeutic compounds against Fusobacterium nucleatum.
Keywords: colorectal cancer, drug target, Fusobacterium nucleatum, homology modeling, ligands
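The subtractive filtering steps, removing proteins with human homologs, then requiring essentiality, then dropping those with host-shared roles, reduce to simple set operations; the protein names below are placeholders, not actual F. nucleatum loci or the study's real counts.

```python
def shortlist_targets(pathogen_proteins, human_homologs,
                      essential_proteins, host_shared_enzymes):
    """Subtractive genomics sketch: keep pathogen proteins with no human
    homolog, intersect with an essential-genes list (DEG-style), then drop
    enzymes whose role is shared with the host (KAAS-style pathway check)."""
    non_homologous = pathogen_proteins - human_homologs
    essential = non_homologous & essential_proteins
    return essential - host_shared_enzymes

# Placeholder identifiers for illustration only
pathogen = {"prot1", "prot2", "prot3", "prot4", "prot5", "prot6"}
human_homologs = {"prot2", "prot6"}          # BLAST hits against human (assumed)
essential_proteins = {"prot1", "prot3", "prot4"}
host_shared = {"prot4"}                      # pathway shared with the host

targets = shortlist_targets(pathogen, human_homologs,
                            essential_proteins, host_shared)
print(sorted(targets))
```

Each subtraction mirrors one stage of the pipeline described above: 1499 non-homologous proteins, narrowed to 32 essential ones, narrowed to 3 pathogen-unique enzymes.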
150 Is Audit Quality Implied by Accruals Quality Associated with Audit Fees and Auditor Tenure? Evidence from China
Authors: Hassan Y. Kikhia, Jin P. Zhang, Khaldoon G. Albiatr
Abstract:
The Enron and Arthur Andersen scandal raised concerns internationally about auditor independence and audit quality. Furthermore, the debate continues about the relationship between audit fees, auditor tenure, and audit quality, in spite of extensive empirical evidence examining audit failures and earnings management. The purpose of the current research, therefore, is to determine the effect of audit fees and audit tenure, both partially and simultaneously, on audit quality. We use a sample of Chinese firms, an environment that we believe provides an opportunity to test whether the development of market and legal institutions affects the impact of audit fees and auditor tenure on audit quality. We employ the standard deviation of residuals from regressions relating current accruals to cash flows as a proxy for audit quality. The paper documents a statistically significant negative association between audit fees and audit quality. These findings are consistent with economic bonding, rather than auditor reputational concerns, being a determinant of auditor behavior. Further, the current paper shows a positive association between auditor tenure and audit quality in the earlier years of audit tenure. These results support the proposition that when the learning effect dominates the bonding effect in the earlier years of tenure, audit quality is likely to be higher. Taking audit fees and audit tenure together, the results suggest that there is a positive association between audit fees and audit quality in the earlier years of auditor tenure. Interestingly, the findings of our study have important implications for auditors, policymakers, multinational firms, and users of financial reports. As the rapid growth of China's economy gains global recognition, the Chinese stock market is capturing the attention of international investors. To a lesser extent, our paper also differs from prior studies in the methodology and findings of the investigation of audit quality.
Keywords: audit quality, accruals quality, audit fees, auditor tenure
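The audit-quality proxy, the standard deviation of residuals from a regression of current accruals on cash flows, can be computed as in this sketch; the figures are invented, and the single-regressor OLS is a simplification of the multi-regressor accruals-quality models used in the literature.

```python
import math

def ols_residual_std(x, y):
    """Regress current accruals (y) on cash flows (x); the standard deviation
    of the residuals is the (inverse) accruals-quality proxy: a larger spread
    means accruals map into cash flows more poorly, i.e. lower audit quality."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    alpha = my - beta * mx
    residuals = [b - (alpha + beta * a) for a, b in zip(x, y)]
    return math.sqrt(sum(r * r for r in residuals) / (n - 2))  # df = n - 2

# Invented firm-quarter observations (scaled by assets)
cash_flows     = [0.10, 0.12, 0.08, 0.15, 0.11, 0.09]
accruals_clean = [0.05, 0.06, 0.04, 0.075, 0.055, 0.045]  # tight mapping
accruals_noisy = [0.05, 0.09, 0.01, 0.10, 0.02, 0.08]     # noisy mapping
print(ols_residual_std(cash_flows, accruals_clean)
      < ols_residual_std(cash_flows, accruals_noisy))  # True
```

The firm with the tighter accruals-to-cash-flows mapping gets the smaller residual spread, so in the paper's design it would be coded as having the higher audit quality.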
149 Design and Realization of Computer Network Security Perception Control System
Authors: El Miloudi Djelloul
Abstract:
Based on an analysis of applications of perception control technology to computer network security status and security protection measures, and from the angles of the network physical environment and the network software system's environmental security, this paper provides a network security perception control solution using the Internet of Things (IoT), telecom, and other perception technologies. The Security Perception Control System operates in the computer network environment, utilizing Radio Frequency Identification (RFID) from the IoT together with telecom integration technology in the system's integrated design. For physical network security, RFID temperature, humidity, gas, and other perception technologies are used to monitor environmental data; dynamic perception technology is used for the network system security environment; and user-defined security parameters and security logs are used for quick data analysis. By extending control over the I/O interface and developing an API and AT commands, computer network security perception control based on the Internet and GSM/GPRS is achieved, which enables users to carry out interactive perception and control of the network security environment via the web and e-mail, as well as by PDA, mobile phone short message, and the Internet. In system testing through a middleware server, real-time security information data perception with a deviation of 3-5% was achieved, which demonstrates the feasibility of the Computer Network Security Perception Control System.
Keywords: computer network, perception control system, security strategy, Radio Frequency Identification (RFID)
Procedia PDF Downloads 447
148 Comparison of Authentication Methods in Internet of Things Technology
Authors: Hafizah Che Hasan, Fateen Nazwa Yusof, Maslina Daud
Abstract:
The Internet of Things (IoT) is a powerful industrial paradigm in which end devices are interconnected and automated, allowing them to analyze data and execute actions based on the analysis. IoT leverages Radio-Frequency Identification (RFID) and Wireless Sensor Network (WSN) technologies, including mobile devices and sensors, which have contributed to its evolution. However, because more and more devices are connected to each other over the Internet, and data from various sources are exchanged between things, the confidentiality of that data becomes a major concern. This paper focuses on one of the major challenges in IoT, authentication, which is needed to preserve data integrity and confidentiality. A few solutions from papers published in the last few years are reviewed. One proposed solution secures the communication between IoT devices and cloud servers with an Elliptic Curve Cryptography (ECC) based mutual authentication protocol, using Hyper Text Transfer Protocol (HTTP) cookies as the security parameter. The next proposed solution uses a keyed-hash scheme protocol to enable IoT devices to authenticate each other without the presence of a central control server. Another proposed solution uses a Physical Unclonable Function (PUF) based mutual authentication protocol; it emphasizes tamper-resistant and resource-efficient technology and amounts to a three-way-handshake security protocol.
Keywords: Internet of Things (IoT), authentication, PUF, ECC, keyed-hash scheme protocol
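The keyed-hash scheme reviewed above can be sketched as an HMAC challenge-response exchange between two devices sharing a key. This is a generic illustration with an assumed pre-shared key and no key provisioning, not the specific protocol from the cited paper.

```python
import hashlib
import hmac
import os

SHARED_KEY = b"pre-shared-device-key"  # provisioned out of band (assumption)

def respond(challenge, key=SHARED_KEY):
    """Prove knowledge of the shared key by MACing the peer's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge, response, key=SHARED_KEY):
    """Constant-time check of the peer's response."""
    return hmac.compare_digest(respond(challenge, key), response)

# Device A challenges device B; B proves key possession. In a mutual
# protocol the roles would then be swapped in the opposite direction.
challenge_a = os.urandom(16)
ok = verify(challenge_a, respond(challenge_a))
```

Because only devices holding the key can compute a valid response, no central control server is needed for the check itself.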
Procedia PDF Downloads 265
147 Conceptual Model of a Residential Waste Collection System Using ARENA Software
Authors: Bruce G. Wilson
Abstract:
The collection of municipal solid waste at the curbside is a complex operation that is repeated daily under varying circumstances around the world. There have been several attempts to develop Monte Carlo simulation models of the waste collection process dating back almost 50 years. Despite this long history, the use of simulation modeling as a planning or optimization tool for waste collection is still extremely limited in practice. Historically, simulation modeling of waste collection systems has been hampered by the limitations of computer hardware and software and by the availability of representative input data. This paper outlines the development of a Monte Carlo simulation model that overcomes many of the limitations of previous models. The model uses a general-purpose simulation software program that is easily capable of modeling an entire waste collection network. It treats the stops on a waste collection route as a queue of work to be processed by a collection vehicle (the server). Input data can be collected from a variety of sources, including municipal geographic information systems, global positioning system recorders on collection vehicles, and weigh scales at transfer stations or treatment facilities. The result is a flexible model that is robust enough to model the collection activities of a large municipality while adapting to changing conditions on the collection route.
Keywords: modeling, queues, residential waste collection, Monte Carlo simulation
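The stop-as-queue idea can be sketched as a small Monte Carlo model: the vehicle (server) processes one stop at a time, with random travel and service durations. The exponential distributions and all parameter values are assumptions for illustration, not calibrated inputs from the study.

```python
import random

def simulate_route(n_stops, mean_service_s=20.0, mean_travel_s=30.0, seed=1):
    """Monte Carlo estimate of total route time: each stop is a queued job
    processed by one vehicle (the server), with random travel between stops."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_stops):
        total += rng.expovariate(1.0 / mean_travel_s)   # drive to the stop
        total += rng.expovariate(1.0 / mean_service_s)  # collect at the curb
    return total

# Average over many replications, as a Monte Carlo model would
runs = [simulate_route(500, seed=s) for s in range(100)]
mean_route_time = sum(runs) / len(runs)
```

Replacing the assumed distributions with ones fitted to GPS and weigh-scale data would give the representative inputs the abstract calls for.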
Procedia PDF Downloads 401
146 To Ensure Maximum Voter Privacy in E-Voting Using Blockchain, Convolutional Neural Network, and Quantum Key Distribution
Authors: Bhaumik Tyagi, Mandeep Kaur, Kanika Singla
Abstract:
The advancement of blockchain has allowed scholars to remodel e-voting systems for future generations. Server-side attacks such as SQL injection and DoS attacks are among the most common attacks today, in which malicious code is injected into the system through user input fields by illicit users, leading in the worst scenarios to data leakage. Quantum attacks, which can manipulate transactional data, are a further threat. To deal with all of the above-mentioned attacks, this research integrates blockchain, a convolutional neural network (CNN), and Quantum Key Distribution (QKD). Utilizing blockchain technology in e-voting applications is not a novel concept, but privacy and security issues remain in both public and private blockchains; to address them, this research uses a hybrid blockchain. Cryptographic signatures and blockchain algorithms are proposed to validate the origin and integrity of the votes. The CNN, a normalized version of the multilayer perceptron, is also applied to analyze visual descriptions upon registration, in order to enhance the privacy of voters and of the e-voting system. QKD is implemented to secure the blockchain-based e-voting system from quantum attacks using quantum algorithms. An e-voting blockchain D-app is implemented, providing a proposed solution for voter privacy in e-voting using blockchain, CNN, and QKD.
Keywords: hybrid blockchain, secure e-voting system, convolutional neural networks, quantum key distribution, one-time pad
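The vote-integrity idea, hashing each vote record into a chain so that tampering is detectable, can be sketched as follows. This simplified hash chain omits signatures, consensus, the CNN, and QKD, and all field names are assumptions for illustration.

```python
import hashlib
import json

def make_block(prev_hash, vote, timestamp):
    """Link a vote record to the previous block by hashing both together."""
    payload = json.dumps({"prev": prev_hash, "vote": vote, "ts": timestamp},
                         sort_keys=True).encode()
    return {"prev": prev_hash, "vote": vote, "ts": timestamp,
            "hash": hashlib.sha256(payload).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash; any tampered vote breaks the chain."""
    for i, block in enumerate(chain):
        expect = make_block(block["prev"], block["vote"], block["ts"])["hash"]
        if block["hash"] != expect:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("0" * 64, "candidate-A", 1)]
chain.append(make_block(chain[-1]["hash"], "candidate-B", 2))
valid_before = chain_is_valid(chain)
chain[0]["vote"] = "candidate-B"          # simulate tampering
valid_after = chain_is_valid(chain)
```

A production system would additionally sign each block, which is where the cryptographic signatures proposed in the abstract come in.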
Procedia PDF Downloads 95
145 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service
Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
In telemedicine, the image repository service is important for increasing the accuracy of diagnostic support for medical personnel. This study compares two routing algorithms with regard to quality of service (QoS), in order to analyze their performance when uploading and/or downloading medical images. The study focused on comparing the performance of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. The Tabu Search and Simulated Annealing algorithms were chosen for their high usability in this type of application; QoS is measured using the following metrics: delay, throughput, jitter, and latency. Routing tests were carried out on ten images of 40 MB each in Digital Imaging and Communications in Medicine (DICOM) format. Each test ran for ten minutes under different traffic conditions, for a total of 25 tests, from a server at Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia, to a remote user at Universidad de Santiago de Chile (USACH) in Chile. The results show that Tabu Search delivers better QoS performance than Simulated Annealing, optimizing the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
Keywords: medical image, QoS, simulated annealing, Tabu search, telemedicine
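A minimal Tabu Search skeleton illustrates the metaheuristic: always move to the best non-tabu neighbour while keeping recently visited solutions on a short tabu list. The toy routing objective and fabricated link delays below are assumptions for illustration, not the study's DICOM routing setup.

```python
import random

def tabu_search(cost, start, neighbours, iters=200, tenure=10, seed=0):
    """Minimal tabu search over discrete solutions (here, hop orderings)."""
    rng = random.Random(seed)
    current = best = start
    tabu = [start]
    for _ in range(iters):
        cands = [n for n in neighbours(current, rng) if n not in tabu]
        if not cands:
            continue
        current = min(cands, key=cost)   # best admissible move, even if uphill
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                  # oldest entry leaves the tabu list
        if cost(current) < cost(best):
            best = current
    return best

# Toy objective: order four hops to minimise a weighted delay (assumed data)
delays = {0: 5, 1: 2, 2: 9, 3: 1}
def route_cost(route):                   # positional weighting, illustrative only
    return sum((i + 1) * delays[hop] for i, hop in enumerate(route))
def swap_neighbours(route, rng):
    out = []
    for _ in range(8):                   # sample a few random pairwise swaps
        i, j = rng.sample(range(len(route)), 2)
        r = list(route); r[i], r[j] = r[j], r[i]
        out.append(tuple(r))
    return out

best = tabu_search(route_cost, (0, 1, 2, 3), swap_neighbours)
```

The tabu list is what distinguishes this from plain hill climbing: it lets the search accept worse moves without immediately cycling back.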
Procedia PDF Downloads 219
144 Improved Regression Relations Between Different Magnitude Types and the Moment Magnitude in the Western Balkan Earthquake Catalogue
Authors: Anila Xhahysa, Migena Ceyhan, Neki Kuka, Klajdi Qoshi, Damiano Koxhaj
Abstract:
The seismic event catalogue has been updated in the framework of a bilateral project supported by the Central European Investment Fund, with the extensive support of the Global Earthquake Model Foundation, to update Albania's national seismic hazard model. The earthquake catalogue prepared within this project covers the Western Balkan area bounded by 38.0°-48.0°N, 12.5°-24.5°E and includes 41,806 earthquakes that occurred in the region between 510 BC and 2022. Since the moment magnitude characterizes earthquake size accurately, and the ground motion prediction equations selected for the seismic hazard assessment employ this scale, it was chosen as the uniform magnitude scale for the catalogue. Proxy moment magnitude values therefore had to be obtained using new conversion equations from the local and other magnitude types to this unified scale. The Global Centroid Moment Tensor Catalogue was considered the most authoritative source of moment magnitude reports for moderate to large earthquakes, and hence it was used as a reference for calibrating other sources. The best fit was observed against some regional agencies, whereas differences were observed in all magnitude ranges against the moment magnitudes reported from Italy, Greece, and Turkey. For teleseismic magnitudes, an exponential model was used to derive the regression equations, in order to account for the non-linearity of the relationships. The regressions obtained for the surface-wave magnitude and short-period body-wave magnitude show considerable differences from the Global Earthquake Model regression curves, especially at low magnitude ranges. Moreover, a conversion relation was obtained between the local magnitude of Albania and the corresponding moment magnitude as reported by the global and regional agencies. Because errors were present in both variables, Deming regression was used.
Keywords: regression, seismic catalogue, local magnitude, teleseismic magnitude, moment magnitude
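Deming regression, used above because errors are present in both magnitude variables, can be sketched as follows. The magnitude data are fabricated for illustration, and `delta`, the assumed ratio of the two error variances, is set to 1 here.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Deming regression: slope and intercept when both variables carry
    measurement error. delta = var(error in y) / var(error in x)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
             ) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Fabricated local-magnitude vs moment-magnitude pairs, noise on both axes
rng = np.random.default_rng(3)
ml_true = rng.uniform(3, 6, 200)
ml = ml_true + rng.normal(scale=0.1, size=200)
mw = 0.9 * ml_true + 0.6 + rng.normal(scale=0.1, size=200)
slope, intercept = deming_fit(ml, mw)
```

Unlike ordinary least squares, which attenuates the slope when the predictor is noisy, this estimator recovers the underlying relation when `delta` is specified correctly.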
Procedia PDF Downloads 71
143 Prioritized Processor-Sharing with a Maximum Permissible Sojourn Time
Authors: Yoshiaki Shikata
Abstract:
A prioritized processor-sharing (PS) system with a maximum permissible sojourn time (MPST) is proposed. In this PS system, a higher-priority request is allocated a larger service ratio than a lower-priority request. Moreover, each request receiving service is guaranteed the maximum permissible sojourn time determined by its priority class, regardless of its service time. Arriving requests that cannot receive service under this guarantee are rejected. We further propose a method for implementing such a guarantee and discuss performance evaluation procedures for the resulting system. Practical performance measures, such as the relationships between the loss probability or mean sojourn time of each class of request and the maximum permissible sojourn time, are evaluated via simulation. At the arrival of each request, its acceptance or rejection is judged using the extended sojourn times of all requests receiving service in the server. As the MPST increases, the mean sojourn time increases almost linearly, while the logarithm of the loss probability decreases almost linearly. Moreover, with an MPST, the difference in mean sojourn time between different MPSTs increases with the traffic rate, whereas the difference in loss probability between different MPSTs decreases as the traffic rate increases.
Keywords: prioritized processor sharing, priority ratio, permissible sojourn time, loss probability, mean sojourn time, simulation
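The priority-ratio idea can be sketched as discriminatory processor sharing, in which each in-service request receives capacity in proportion to its class weight. The class weights below are assumptions for illustration, not the paper's parameters.

```python
def dps_service_rates(in_service, weights, capacity=1.0):
    """Instantaneous service rate of each in-service request under
    discriminatory processor sharing: proportional to its class weight.

    in_service: class labels of the requests currently in the server.
    weights:    service ratio assigned to each priority class.
    """
    total = sum(weights[c] for c in in_service)
    return [capacity * weights[c] / total for c in in_service]

# Two high-priority (weight 2) requests and one low-priority (weight 1)
rates = dps_service_rates(["hi", "hi", "lo"], {"hi": 2.0, "lo": 1.0})
```

Each high-priority request here receives twice the rate of the low-priority one, and the rates always sum to the server capacity.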
Procedia PDF Downloads 193
142 Optimising Participation in Physical Activity Research for Adults with Intellectual Disabilities
Authors: Yetunde M. Dairo, Johnny Collett, Helen Dawes
Abstract:
Background and Aim: Engagement with physical activity (PA) research is poor among adults with intellectual disabilities (ID), particularly those living in residential homes. This study explored why by asking managers of residential homes, adults with ID, and their carers. Methods: Participants were a convenience sample of 23 individuals from two UK local authorities, comprising ID residential home managers, adults with ID, and their support staff. Procedures: A) Residential home managers (n=6) were asked about their willingness to allow their residents to participate in PA research; B) eleven adults with ID and their support workers (n=6) were asked about their willingness to accept 7-day accelerometer monitoring and/or the International Physical Activity Questionnaire-short version (IPAQ-s) as PA measures. The IPAQ-s was administered by the researcher, and participants were each given sample accelerometers to try on. Results: A) Five out of six managers said that the burden of wearing the accelerometer for seven days would be too high for the people they support, the majority of whom might be unable to express their wishes. They also said they would be unwilling to act as proxy respondents for the same reason, and cited time pressure, understaffing, and reluctance to spend time on the research paperwork as further reasons for non-participation. B) All 11 individuals with ID completed the IPAQ-s, while only three accepted the accelerometer, one of whom was deemed inappropriate to wear it. Reasons for rejecting the accelerometer included statements that it was 'too expensive', 'too heavy', and 'uncomfortable', and two people said they would not want to wear it for more than one day. All adults with ID (11) and their support workers (6) provided information about their physical activity levels through the IPAQ-s. Conclusions: Care home managers are a barrier to research participation. However, adults with ID readily accepted the IPAQ-s as a PA measure, but were less willing to accept 7-day accelerometer monitoring. To improve participation in this population, the choice of PA measure is important. Moreover, there is a need for studies exploring how best to engage ID residential home managers in PA research.
Keywords: intellectual disability, physical activity measurement, research engagement, research participation
Procedia PDF Downloads 310
141 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors that influence crash injury risk (CIR); however, the uncertainties inherent in the selected variables have been neglected. A review of the existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications of comparing studies without a systematic view of variable taxonomy. The aim of this literature review is therefore to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of the broad variation in variable selection in CIR analysis. The objective is to demonstrate the variance in variable selection and classification when modeling injury risk for occupants of light vehicles, by presenting an analytical review of the literature. Based on data collected from 64 journal publications over the past 21 years, the review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, vehicle age/model year was the most frequently selected explanatory variable, used by 41% of the studies. Of the studies that included a speed risk factor in their analyses, the majority (64%) used the legal speed limit as a proxy for vehicle speed at the moment of the crash, imposing limitations on CIR analysis and modeling. Despite the proven efficiency of airbags in minimizing injury impact following a crash, only 22% of studies included airbag deployment data. A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries.
Keywords: crash, exploratory, injury, risk, variables, vehicle
Procedia PDF Downloads 137
140 Experimental Study and Evaluation of Farm Environmental Monitoring System Based on the Internet of Things, Sudan
Authors: Farid Eltom A. E., Mustafa Abdul-Halim, Abdalla Markaz, Sami Atta, Mohamed Azhari, Ahmed Rashed
Abstract:
Smart environment sensors integrated with Internet of Things (IoT) technology provide a new concept for tracking, sensing, and monitoring objects in the environment. The aim of the study is to evaluate a farm environmental monitoring system based on the IoT and to realize automated agricultural management and the implementation of precision production. Until now, irrigation monitoring operations in Sudan have been carried out using traditional methods, which are costly and unreliable. By utilizing soil moisture sensors, however, irrigation can be conducted only when needed, without fear of plant water stress. The results showed that the software application allows farmers to display current and historical data on soil moisture and nutrients in the form of line charts. The system measures the soil factors moisture, electrical conductivity, humidity, temperature, pH, phosphorus, and potassium; these factors, together with a timestamp, are sent to the data server over a LoRaWAN interface. It is widely agreed that artificial intelligence can arrange the procedures necessary to manage the terrain, predict the quality and quantity of production through deep analysis of the various operations in agricultural fields, and support monitoring of weather conditions.
Keywords: smart environment, monitoring systems, IoT, LoRa gateway, center pivot
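A sketch of how a node might package the soil readings with a timestamp for the data server. The field names, units, and device id are assumptions for illustration, not the deployed system's schema; the actual uplink would go over the LoRaWAN interface rather than plain JSON.

```python
import json
import time

def build_uplink(readings, device_id="soil-node-01"):
    """Package soil sensor readings with a timestamp for the data server.

    readings: dict of sensor values (moisture, EC, pH, temperature, ...).
    """
    payload = {"device": device_id, "ts": int(time.time()), **readings}
    return json.dumps(payload, sort_keys=True)

# Illustrative readings for one measurement cycle
msg = build_uplink({"moisture_pct": 31.5, "ec_ds_m": 1.2, "ph": 6.8,
                    "temp_c": 27.4, "p_ppm": 14, "k_ppm": 120})
decoded = json.loads(msg)
```

The server can accumulate these timestamped records to drive the line charts of current and historical soil data mentioned above.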
Procedia PDF Downloads 49
139 A Bi-Objective Model to Optimize the Total Time and Idle Probability for Facility Location Problem Behaving as M/M/1/K Queues
Authors: Amirhossein Chambari
Abstract:
This article proposes a bi-objective model for the facility location problem subject to congestion (overcrowding), motivated by applications such as locating servers for Internet mirror sites, communication networks, and other one-server systems. The model considers situations in which immobile (fixed) service facilities are congested by stochastic demand and behave as M/M/1/K queues. We consider two simultaneous perspectives on this problem: (1) customers, who desire to limit the time spent accessing and waiting for service, and (2) the service provider, who desires to limit average facility idle time. A bi-objective model is set up with two objective functions: (1) minimizing the sum of expected total traveling and waiting time (customers) and (2) minimizing the average facility idle-time percentage (service provider). The proposed model belongs to the class of mixed-integer nonlinear programming models and to the class of NP-hard problems. To solve the model, controlled elitist non-dominated sorting genetic algorithms (controlled NSGA-II) and controlled elitist non-dominated ranking genetic algorithms (NRGA-I) are proposed. The two proposed metaheuristic algorithms are evaluated using standard multiobjective metrics. Finally, the results are analyzed and conclusions are given.
Keywords: bi-objective, facility location, queueing, controlled NSGA-II, NRGA-I
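The M/M/1/K behavior underlying both objectives can be sketched with the standard closed-form queue measures: the idle probability drives the provider's objective, and the expected sojourn time drives the customers'. The parameter values below are illustrative.

```python
def mm1k_measures(lam, mu, K):
    """Idle probability and expected sojourn time for an M/M/1/K queue.

    lam: arrival rate, mu: service rate, K: system capacity (incl. in service).
    """
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        probs = [1.0 / (K + 1)] * (K + 1)          # uniform when rho == 1
    else:
        norm = (1 - rho ** (K + 1)) / (1 - rho)
        probs = [rho ** n / norm for n in range(K + 1)]
    p_idle = probs[0]
    L = sum(n * p for n, p in enumerate(probs))    # mean number in system
    lam_eff = lam * (1 - probs[K])                 # arrivals not blocked
    W = L / lam_eff                                # Little's law (accepted only)
    return p_idle, W

p_idle, W = mm1k_measures(lam=0.8, mu=1.0, K=5)
```

A location model would evaluate these measures at every candidate facility and trade the summed `W` (plus travel time) against the average `p_idle`.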
Procedia PDF Downloads 584
138 Reconfigurable Device for 3D Visualization of Three Dimensional Surfaces
Authors: Robson da C. Santos, Carlos Henrique de A. S. P. Coutinho, Lucas Moreira Dias, Gerson Gomes Cunha
Abstract:
The article describes the development of an augmented reality 3D display that controls servo motors and projects images onto a physical model with a video projector. Augmented reality is a field that explores multiple approaches to enriching the real-world view with additional information overlaid on the real scene. The article presents the broad use of electrical, electronic, mechanical, and industrial automation techniques for geospatial visualization, with applications to mathematical models, the visualization of functions and 3D surface graphics, and volumetric rendering that is currently seen only as 2D layers. One application is a 3D display for representing and visualizing Digital Terrain Models (DTM) and Digital Surface Models (DSM), for example in the identification of canyons in the marine area of the Campos Basin, Rio de Janeiro, Brazil. The device can also visualize regions subject to landslides, such as Serra do Mar in Angra dos Reis and other mountain cities in the State of Rio de Janeiro; loss of human life and leakage from oil pipelines buried in these regions may thus be anticipated in advance. The physical design consists of a table with a 9 x 16 matrix of servo motors, 144 servos in total, covered by a mesh onto which the models are projected by a video projector. Each model undergoes image pre-processing and is then sent to a server to be converted and viewed through software developed in the C# programming language.
Keywords: visualization, 3D models, servo motors, C# programming language
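A sketch of the core mapping step: scaling a 9 x 16 terrain height grid to servo angles by linear interpolation. The height and angle ranges are assumptions for illustration (the actual system is written in C# and drives physical servos).

```python
def heights_to_angles(heights, h_min=0.0, h_max=100.0,
                      angle_min=0, angle_max=180):
    """Map a grid of terrain heights to servo angles by linear scaling,
    clamping out-of-range heights to the permitted band."""
    span = h_max - h_min
    return [[angle_min + (angle_max - angle_min)
             * (min(max(h, h_min), h_max) - h_min) / span
             for h in row] for row in heights]

# A flat 9 x 16 grid with a single peak at the centre of the DTM
grid = [[0.0] * 16 for _ in range(9)]
grid[4][8] = 50.0
angles = heights_to_angles(grid)
```

Each angle would then be sent to the corresponding servo in the 9 x 16 matrix, raising the mesh under the projected image.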
Procedia PDF Downloads 342
137 Association between Levels of Volatile Organic Compound Metabolites and Cigarette Smoking-Related Urothelial Carcinoma
Authors: Chi-Jung Chung, Chao-Hsiang Chang, Chiu-Shong Liu, Sheng-Wei Li, Mu-Chi Chung, Ting-Jie Wen, Hui-Ling Lee
Abstract:
Cigarette smoke contains volatile organic compounds (VOCs) such as acrylamide, 1,3-butadiene, and benzene. This study aimed to explore the associations between urinary levels of cotinine and VOC metabolites and the risk of urothelial carcinoma (UC). A hospital-based case-control study was designed with two groups matched on age (±3 years) and gender. UC was clinically diagnosed through urological examinations and pathologically verified. Smoking-related information was collected through questionnaires and face-to-face interviews with all study participants. Urine samples were collected for analysis of the urinary levels of VOC metabolites, cotinine, and 8-hydroxydeoxyguanosine (8-OHdG), which was selected as a proxy for oxidative stress. Multiple logistic regressions were applied to estimate the risk of UC. The urinary cotinine and 8-OHdG levels of the UC group were higher than those of the control group. The urinary levels of VOC metabolites, including N-acetyl-S-(2-carbamoylethyl)-L-cysteine (AAMA), N-acetyl-S-(2-carbamoyl-2-hydroxyethyl)-L-cysteine, N-acetyl-S-(4-hydroxy-2-buten-1-yl)-L-cysteine, trans,trans-muconic acid (t,t-MA), and S-phenylmercapturic acid (SPMA), increased as the urinary levels of cotinine increased. Relevant dose-response relationships between the risk of UC and the urinary levels of AAMA, t,t-MA, SPMA, and 8-OHdG were found after adjusting for potential risk factors. The UC risk of participants with high urinary levels of cotinine, AAMA, t,t-MA, SPMA, and 8-OHdG was 3.5- to 6-fold higher than that of other participants. Increased urinary levels of VOC metabolites were associated with smoking-related UC risk. The development of UC should be explored in large-scale in vitro or in vivo studies with repeated measurement of VOC metabolites.
Keywords: volatile organic compound, urothelial carcinoma, cotinine, 8-hydroxydeoxyguanosine
Procedia PDF Downloads 142
136 House Price Index Predicts a Larger Impact of Habitat Loss than Primary Productivity on the Biodiversity of North American Avian Communities
Authors: Marlen Acosta Alamo, Lisa Manne, Richard Veit
Abstract:
Habitat loss due to land-use change is one of the leading causes of biodiversity loss worldwide. This form of habitat loss is a non-random phenomenon, since the environmental factors that make an area suitable for supporting high local biodiversity overlap with those that make it attractive for urban development. We aimed to compare the effects of two non-random habitat loss predictors on the richness, abundance, and rarity of nature-affiliated and human-affiliated North American breeding birds. For each group of birds, we simulated non-random habitat loss using two predictors: the House Price Index, as a measure of the attractiveness of an area to humans, and the Normalized Difference Vegetation Index, as a proxy for primary productivity. We compared the results of the two non-random simulation sets and one set of random habitat loss simulations using an analysis of variance, followed by a Tukey-Kramer test where appropriate. The attractiveness of an area to humans predicted greater richness loss and greater increases in rarity than primary productivity or random habitat loss for both nature-affiliated and human-affiliated birds. For example, at 50% habitat loss, the attractiveness of an area to humans produced richness estimates at least 5% lower, and rarity estimates at least 40% higher, than primary productivity and random habitat loss for both groups of birds. Only for the species abundance of nature-affiliated birds did the attractiveness of an area to humans fail to outperform primary productivity as a predictor of biodiversity following habitat loss. We demonstrated the value of the House Price Index, which can be used in conservation assessments as an index of the risk of habitat loss for natural communities.
Thus, our results have relevant implications for sustainable urban land-use planning practices and can guide stakeholders and developers in their efforts to conserve local biodiversity.
Keywords: biodiversity loss, bird biodiversity, house price index, non-random habitat loss
Procedia PDF Downloads 88
135 Analysis of a Discrete-time Geo/G/1 Queue Integrated with (s, Q) Inventory Policy at a Service Facility
Authors: Akash Verma, Sujit Kumar Samanta
Abstract:
This study examines a discrete-time Geo/G/1 queueing-inventory system operating under an (s, Q) inventory policy. Customers arrive according to a Bernoulli process, and each customer demands a single item with an arbitrarily distributed service time. The inventory is replenished by an outside supplier, with a replenishment lead time governed by a geometric distribution. The facility has a single server and infinite waiting space. Demands must wait in the waiting area during a stock-out period, and customers are served on a first-come, first-served basis. With the help of the embedded Markov chain technique, we determine the joint probability distribution of the number of customers in the system and the number of items in stock at the post-departure epoch using the matrix-analytic approach. We relate the system-length distributions at the post-departure and outside observer's epochs to determine the joint probability distribution at the outside observer's epoch, and use the probability distributions at random epochs to determine the waiting-time distribution. We obtain performance measures to construct a cost function, and the optimum values of the order quantity and reorder point are found numerically for a variety of model parameters.
Keywords: discrete-time queueing inventory model, matrix analytic method, waiting-time analysis, cost optimization
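A small discrete-time simulation conveys the queueing-inventory dynamics described above. All parameter values, the two-point service distribution, and the event ordering within a slot are assumptions for illustration, not the paper's analytical model.

```python
import random

def simulate(p_arrival=0.3, service_probs=None, s=2, Q=5,
             p_lead=0.4, slots=200_000, seed=7):
    """Discrete-time sketch of a Geo/G/1 queue coupled with an (s, Q)
    inventory: Bernoulli arrivals, each departure consumes one stocked item,
    and when stock falls to s an order of Q arrives after a geometric lead
    time. Demands wait (no losses) during stock-outs. Returns throughput."""
    if service_probs is None:
        service_probs = {1: 0.5, 2: 0.5}       # two-point service time (slots)
    rng = random.Random(seed)
    queue, stock, remaining, on_order, served = 0, s + Q, 0, False, 0
    for _ in range(slots):
        if rng.random() < p_arrival:           # Bernoulli (geometric) arrival
            queue += 1
        if on_order and rng.random() < p_lead: # geometric lead time elapses
            stock += Q
            on_order = False
        if remaining == 0 and queue > 0 and stock > 0:
            remaining = rng.choices(list(service_probs),
                                    weights=list(service_probs.values()))[0]
        if remaining > 0:
            remaining -= 1
            if remaining == 0:                 # departure consumes one item
                queue -= 1
                stock -= 1
                served += 1
                if stock <= s and not on_order:  # reorder point reached
                    on_order = True
    return served / slots

throughput = simulate()
```

Since the system is stable under these parameters, the long-run throughput approaches the arrival rate; the analytical model in the paper derives the corresponding joint distributions exactly rather than by simulation.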
Procedia PDF Downloads 45