Search results for: centralized server
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 476

86 A Lightweight Blockchain: Enhancing Internet of Things Driven Smart Buildings Scalability and Access Control Using Intelligent Direct Acyclic Graph Architecture and Smart Contracts

Authors: Syed Irfan Raza Naqvi, Zheng Jiangbin, Ahmad Moshin, Pervez Akhter

Abstract:

Currently, IoT systems depend on a centralized client-server architecture that causes various scalability and privacy vulnerabilities. Distributed ledger technology (DLT) introduces a set of opportunities for the IoT, which leads to practical ideas for existing components at all levels of existing architectures. Blockchain Technology (BCT), exemplified by platforms such as Bitcoin (BTC) and Ethereum, appears to be one approach to solving several IoT problems and offers multiple possibilities. However, IoT devices are resource-constrained, with insufficient capacity and computational headroom to process blockchain consensus mechanisms; the existing challenges of traditional BCT for the IoT are poor scalability, low energy efficiency, and transaction fees. IOTA is a distributed ledger based on a Directed Acyclic Graph (DAG) that ensures M2M micro-transactions are free of charge. IOTA has the potential to address existing IoT-related difficulties such as infrastructure scalability, privacy, and access control mechanisms. We propose an architecture, SLDBI: A Scalable, Lightweight DAG-based Blockchain Design for Intelligent IoT Systems, which adapts the DAG-based Tangle and implements a lightweight message data model to address the IoT limitations. It enables the smooth integration of new IoT devices into a variety of applications. SLDBI enables comprehensive access control, energy efficiency, and scalability in IoT ecosystems by utilizing the Masked Authenticated Messaging (MAM) protocol and the IOTA Smart Contract Protocol (ISCP). Furthermore, we suggest performing proof-of-work (PoW) computation on the full node in an energy-efficient way. Experiments have been carried out to show the capability of the Tangle to achieve better scalability while maintaining energy efficiency. The findings show fine-grained user access control management and scaling up to massive networks with thousands of IoT nodes, such as Smart Connected Buildings (SCBDs).
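
For illustration, the following minimal Python sketch shows the core idea of a DAG-based ledger (a tangle), in which each new message approves earlier unapproved messages ("tips") instead of being grouped into mined blocks. It is a toy model under an assumed uniform random tip selection, not the SLDBI or IOTA implementation; all identifiers are hypothetical.

```python
import random

# Minimal illustrative sketch (not the SLDBI implementation): a DAG ledger where
# each new message approves up to two earlier "tips", as in an IOTA-style tangle.
class Tangle:
    def __init__(self):
        self.approvals = {"genesis": []}   # message id -> ids it approves
        self.tips = {"genesis"}            # messages not yet approved by anyone

    def attach(self, msg_id):
        # pick up to two tips to approve (uniform random tip selection, an assumption)
        chosen = random.sample(sorted(self.tips), k=min(2, len(self.tips)))
        self.approvals[msg_id] = chosen
        self.tips -= set(chosen)
        self.tips.add(msg_id)

tangle = Tangle()
for i in range(5):
    tangle.attach(f"sensor-reading-{i}")   # lightweight device messages
print(tangle.approvals)
```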

Keywords: blockchain, IoT, directed acyclic graph, scalability, access control, architecture, smart contract, smart connected buildings

Procedia PDF Downloads 122
85 Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control

Authors: Vinko Lešić, Mario Vašak, Anita Martinčević, Marko Gulin, Antonio Starčić, Hrvoje Novak

Abstract:

Accounting for 40% of total world energy consumption, building systems are developing into technically complex, large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make buildings active energy market participants. A centralized control system for building heating and cooling, managed by economically optimal model predictive control, shows promising results, with an estimated 30% increase in energy efficiency. The research is focused on the implementation of such a method in a case study performed on two floors of our faculty building, with wireless data acquisition from the corresponding sensors, remotely controlled heating/cooling units, and a central climate controller. Building walls are mathematically modeled with the corresponding material types, surface shapes, and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and the weather forecast, occupant behavior, and comfort demands are all taken into account when deriving price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries, and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control, where the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter is designed as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
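
As a rough illustration of the price-optimal climate control idea, the sketch below formulates a one-zone linear thermal model and minimizes heating cost over a horizon under comfort constraints with a linear program. All parameter values (thermal coefficients, tariff, comfort band) are assumptions for the example, not the building model identified in the study.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative sketch only (hypothetical parameters, not the paper's building model):
# price-optimal heating over a horizon with a one-state RC zone model
#   T[k+1] = a*T[k] + b*u[k] + (1-a)*T_out
N = 24                          # horizon (hours)
a, b = 0.9, 0.05                # thermal inertia and heater gain (assumed)
T_out, T0 = 5.0, 20.0           # outdoor and initial indoor temperature (deg C)
price = 0.10 + 0.05 * np.sin(np.linspace(0, 2 * np.pi, N))   # assumed hourly tariff
T_min, T_max, u_max = 19.0, 23.0, 80.0

# Express T[k] as an affine function of the inputs: T = base + G @ u
base = np.array([a**k * T0 + (1 - a**k) * T_out for k in range(1, N + 1)])
G = np.array([[b * a**(k - 1 - j) if j < k else 0.0 for j in range(N)]
              for k in range(1, N + 1)])

# Minimise energy cost subject to the comfort band and heater limits
res = linprog(c=price,
              A_ub=np.vstack([G, -G]),
              b_ub=np.concatenate([T_max - base, base - T_min]),
              bounds=[(0, u_max)] * N)
print("optimal heating profile (kW):", np.round(res.x, 1))
```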

Keywords: price-optimal building climate control, Microgrid power flow optimisation, hierarchical model predictive control, energy efficient buildings, energy market participation

Procedia PDF Downloads 465
84 Subsidying Local Health Policy Programs as a Public Management Tool in the Polish Health Care System

Authors: T. Holecki, J. Wozniak-Holecka, P. Romaniuk

Abstract:

Due to the highly centralized model of financing health care in Poland, local self-governments rarely undertook their own initiatives in the field of public health, particularly health promotion. However, since 2017 it has been possible to apply for a subsidy for health policy programs, with the additional resources retrieved from the National Health Fund, which is the dominant payer in the health system. The amount of the subsidy depends on the number of inhabitants in a given unit and amounts to about 40% of the total cost of the program. The aim of this paper is to assess the impact of the newly implemented solutions in financing health policy on the management of public finances, as well as on the activity of local self-governments in health promotion. An effort is made to estimate the amount that both local governments and the National Health Fund spent on local health policy programs while implementing the new solutions. The research method is the analysis of financial data obtained from the National Health Fund and from local government units, as well as reports published by the Agency for Health Technology Assessment and Pricing, which holds substantive control over the health policy programs and releases permission for their implementation. The study was based on a comparative analysis of expenditures on the implementation of health programs in Poland in the years 2010-2018. The presentation of the results includes the average annual expenditures of local government units per inhabitant, the total number of positively evaluated applications, and the percentage share in the total expenditures of local governments (16 voivodeship areas). The most essential purpose is to determine whether the assumptions of the subsidy program work correctly in practice, and what the real effects of introducing the legislative changes at the local government level are in the context of public health tasks. The assumption of the study was that the use of a new motivation tool in the field of public management would result in a multiplication of the resources invested in the provision of health policy programs. Preliminary conclusions show that financial expenditures changed significantly after the introduction of public funding at the level of 40%, with an increase in funding from local governments' own funds at the level of 80 to 90%.

Keywords: health care system, health policy programs, local self-governments, public health management

Procedia PDF Downloads 156
83 “Environmental-Friendly” and “People-Friendly” Project for a New North-East Italian Hospital

Authors: Emanuela Zilli, Antonella Ruffatto, Davide Bonaldo, Stefano Bevilacqua, Tommaso Caputo, Luisa Fontana, Carmelina Saraceno, Antonio Sturaroo, Teodoro Sava, Antonio Madia

Abstract:

The new Hospital in Cittadella - ULSS 6 Euganea Health Trust, in the North-East of Italy (400 beds, project completion date in 2026), will partially take the place of the existing building. Interesting features have been suggested in order to design a modern, “environmental-friendly” and “people-friendly” building. Specific multidisciplinary meetings (involving stakeholders and professionals with different backgrounds) have been organized on a periodic basis in order to guarantee the appropriate implementation of logistic and organizational solutions related to eco-sustainability, integration with the context, and the concepts of “design for all” and “humanization of care.” The resulting building will be composed of organic shapes determined by the external environment (sun movement, climate, landscape, pre-existing buildings, roads) and the needs of the internal environment (areas of care and diagnostic-treatment paths reorganized with the experience gained during the pandemic), with extensive use of renewable energy, solar panels, a 4th-generation heating system, and sanitised and maintainable surfaces. Particular attention is paid to the quality of the staff areas, which include areas dedicated to psycho-physical well-being (relaxation points, a yoga gym), study rooms, and a centralized conference room. Outdoor recreational spaces and gardens for music and watercolour therapy will be included; a tai-chi gym is dedicated to oncology patients. Integration in the urban and social context is emphasized through window placement toward the gardens (maternal-infant, mental health, and rehabilitation wards). Service areas such as dialysis, radiology, and the labs have views of the medieval walls, the symbol of the city’s history. The new building has been designed to pursue the maximum level of eco-sustainability, harmony with the environment, and integration with the historical, urban, and social context; the concept of humanization of care has been considered in all phases of the project management.

Keywords: environmental-friendly, humanization, eco-sustainability, new hospital

Procedia PDF Downloads 118
82 Multi Data Management Systems in a Cluster Randomized Trial in Poor Resource Setting: The Pneumococcal Vaccine Schedules Trial

Authors: Abdoullah Nyassi, Golam Sarwar, Sarra Baldeh, Mamadou S. K. Jallow, Bai Lamin Dondeh, Isaac Osei, Grant A. Mackenzie

Abstract:

A randomized controlled trial is the "gold standard" for evaluating the efficacy of an intervention. However, large-scale cluster-randomized trials are expensive and difficult to conduct. To guarantee the validity and generalizability of findings, high-quality, dependable, and accurate data management systems are necessary. Robust data management systems are crucial for optimizing and validating the quality, accuracy, and dependability of trial data. There is a scarcity of literature on the difficulties of data gathering in clinical trials in low-resource settings, which may raise concerns. Effective data management systems and implementation goals should be part of trial procedures. Publicizing the creative clinical data management techniques used in clinical trials should boost public confidence in the studies' conclusions and encourage further replication. This report details the development and deployment of multiple data management systems and methodologies in the ongoing pneumococcal vaccine schedules trial in rural Gambia. We implemented six different data management, synchronization, and reporting systems using Microsoft Access, REDCap, SQL, Visual Basic, Ruby, and ASP.NET. Additionally, data synchronization tools were developed to integrate data from these systems into the central server for the reporting systems. Clinician, lab, and field data validation systems and methodologies are the main topics of this report. Our process development efforts across all domains were driven by the complexity of research project data collected in real time, online reporting, data synchronization, and procedures for cleaning and verifying data. Consequently, we effectively used multiple data management systems, demonstrating the value of creative approaches in enhancing the consistency, accuracy, and reporting of trial data in a poor-resource setting.
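
The sketch below illustrates, in simplified form, the kind of synchronization logic described above: records from several capture systems are merged into one central table, keeping the newest version of each record. The schema and field names are assumptions for illustration, not the trial's actual databases.

```python
import sqlite3

# Hypothetical sketch of one idea behind the multi-system setup: upserting records
# from several capture systems into a central reporting table and keeping the newest
# version of each record. Table and field names are assumptions, not the trial's schema.
central = sqlite3.connect(":memory:")
central.execute("""CREATE TABLE participants (
    pid TEXT PRIMARY KEY, source TEXT, payload TEXT, updated_at TEXT)""")

def sync(source, records):
    """Merge (pid, payload, updated_at) rows coming from one source system."""
    for pid, payload, updated_at in records:
        central.execute("""
            INSERT INTO participants (pid, source, payload, updated_at)
            VALUES (?, ?, ?, ?)
            ON CONFLICT(pid) DO UPDATE SET
                source = excluded.source,
                payload = excluded.payload,
                updated_at = excluded.updated_at
            WHERE excluded.updated_at > participants.updated_at""",
            (pid, source, payload, updated_at))
    central.commit()

sync("clinic", [("P001", '{"visit": "baseline"}', "2023-01-10T09:00:00")])
sync("lab",    [("P001", '{"visit": "baseline", "pcv": 34}', "2023-01-12T14:30:00")])
print(central.execute("SELECT * FROM participants").fetchall())
```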

Keywords: data management, data collection, data cleaning, cluster-randomized trial

Procedia PDF Downloads 27
81 Anaerobic Digestion Batch Study of Taxonomic Variations in Microbial Communities during Adaptation of Consortium to Different Lignocellulosic Substrates Using Targeted Sequencing

Authors: Priyanka Dargode, Suhas Gore, Manju Sharma, Arvind Lali

Abstract:

Anaerobic digestion has been widely used for the production of methane from different biowastes. However, the complexity of the microbial communities involved in the process is poorly understood. The performance of the biogas production process, in terms of productivity, is closely coupled to its microbial community structure and the syntrophic interactions amongst the community members. The present study aims at understanding the taxonomic variations occurring in a starter inoculum when acclimatised to different lignocellulosic biomass (LBM) feedstocks over the time of digestion. The work underlines the use of high-throughput Next Generation Sequencing (NGS) for validating the changes in the taxonomic patterns of microbial communities. Biomethane Potential (BMP) batches were set up with different pretreated and non-pretreated LBM residues using the same microbial consortium, and samples were withdrawn for studying the changes in the microbial community in terms of its structure and predominance with respect to changes in the metabolic profile of the process. DNA was extracted from samples withdrawn at different time intervals, chosen with reference to performance changes of the digestion process, followed by 16S rRNA amplicon sequencing analysis on the Illumina platform. Biomethane potential and substrate consumption were monitored using gas chromatography (GC) and the reduction in COD (chemical oxygen demand), respectively. Taxonomic analysis of the QIIME server data revealed that the microbial community structure changes with different substrates as well as at different time intervals. It was observed that the biomethane potential of each substrate was relatively similar, but the time required for substrate utilization and its conversion to biomethane differed between substrates. This could be attributed to the nature of the substrate and, consequently, to the differences in the dominance of microbial communities with regard to different substrates and different phases of the anaerobic digestion process. Knowledge of the microbial communities involved would allow a rational, substrate-specific consortium design, which will help to reduce the consortium adaptation period and enhance substrate utilisation, resulting in improved efficacy of the biogas process.
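
As a small illustration of the community-profiling step, the following sketch computes relative abundances and Shannon diversity from a genus-level count table across sampling time points; the counts are invented placeholders, not the study's sequencing data.

```python
import numpy as np
import pandas as pd

# Illustrative sketch (made-up counts, not the study's data): tracking how community
# composition shifts across sampling time points from a genus-level count table.
counts = pd.DataFrame(
    {"day_0": [120, 300, 40, 15], "day_15": [80, 150, 260, 30], "day_30": [30, 90, 420, 70]},
    index=["Bacteroides", "Clostridium", "Methanosarcina", "Syntrophomonas"])

rel_abundance = counts / counts.sum(axis=0)                        # fraction of reads per sample
shannon = -(rel_abundance * np.log(rel_abundance)).sum(axis=0)     # diversity per time point

print(rel_abundance.round(2))
print("Shannon diversity:", shannon.round(2).to_dict())
```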

Keywords: amplicon sequencing, biomethane potential, community predominance, taxonomic analysis

Procedia PDF Downloads 532
80 Gilgel Gibe III: Dam-Induced Displacement in Ethiopia and Kenya

Authors: Jonny Beirne

Abstract:

Hydropower developments have come to assume an important role within the Ethiopian government's overall development strategy for the country during the last ten years. The Gilgel Gibe III on the Omo river, due to become operational in September 2014, represents the most ambitious, and controversial, of these projects to date. Further aspects of the government's national development strategy include leasing vast areas of designated 'unused' land for large-scale commercial agricultural projects and 'voluntarily' villagizing scattered, semi-nomadic agro-pastoralist groups into centralized settlements so as to use land and water more efficiently and to better provide essential social services such as education and healthcare. The Lower Omo Valley, along the Omo River, is one of the sites of this villagization programme as well as of these large-scale commercial agricultural projects, which are made possible owing to the regulation of the river's flow by Gibe III. Though the Ethiopian government cites many positive aspects of these agricultural and hydropower developments, there are still expected to be serious regional and transnational effects, including on migration flows, in an area already characterized by increasing climatic vulnerability with attendant population movements and conflicts over scarce resources. The following paper is an attempt to track actual and anticipated migration flows resulting from the construction of Gibe III in the immediate vicinity of the dam, downstream in the Lower Omo Valley, and across the border in Kenya around Lake Turkana. In the case of those displaced in the Lower Omo Valley, this will be considered in view of the distinction between voluntary villagization and forced resettlement. The research presented is not primary-source material. Instead, it is drawn from the reports and assessments of the Ethiopian government, rights-based groups, and academic researchers, as well as media articles. It is hoped that this will serve to draw greater attention to the issue and encourage further methodological research on the effects of dam constructions (and associated large-scale irrigation schemes) on migration flows and on the ultimate experience of displacement and resettlement for environmental migrants in the region.

Keywords: forced displacement, voluntary resettlement, migration, human rights, human security, land grabs, dams, commercial agriculture, pastoralism, ecosystem modification, natural resource conflict, livelihoods, development

Procedia PDF Downloads 381
79 Ethiopian Textile and Apparel Industry: Study of the Information Technology Effects in the Sector to Improve Their Integrity Performance

Authors: Merertu Wakuma Rundassa

Abstract:

Global competition and rapidly changing customer requirements are forcing major changes in the production styles and configuration of manufacturing organizations. Increasingly, traditional centralized and sequential manufacturing planning, scheduling, and control mechanisms are found to be insufficiently flexible to respond to changing production styles and highly dynamic variations in product requirements. The traditional approaches limit the expandability and reconfiguration capabilities of manufacturing systems. Thus, many business houses face increasing pressure to lower production costs, improve production quality, and increase responsiveness to customers. In textile and apparel manufacturing, globalization has led to increased competition and quality awareness, and these industries have changed tremendously in the last few years. So, to sustain competitive advantage, companies must re-examine and fine-tune their business processes to deliver high-quality goods at very low costs, and it has become very important for the textile and apparel industries to integrate themselves with information technology to survive. IT can create competitive advantages for companies by improving coordination and communication among trading partners, increasing the availability of information for intermediaries and customers, and providing added value at various stages along the entire chain. Ethiopia is in the process of realizing its potential as a future sourcing location for the global textile and garments industry. With a population of over 90 million people and the fastest growing non-oil economy in Africa, Ethiopia today represents limitless opportunities for international investors. For the textile and garments industry, Ethiopia promises a low-cost production location with natural resources such as cotton to enable the setup of vertically integrated textile and garment operations. However, due to the lack of integration of its business activities, the textile and apparel industry of Ethiopia faces the problem that it cannot be competitive in the global market. On the other hand, the textile and apparel industries of other countries have changed tremendously in the last few years, and globalization has led to increased competition and quality awareness. The aim of this paper is therefore to study the trend of the Ethiopian textile and apparel industry in applying different IT systems to integrate it into the global market.

Keywords: information technology, business integrity, textile and apparel industries, Ethiopia

Procedia PDF Downloads 363
78 Use and Effects of Kanban Board from the Aspects of Brothers Furniture Limited

Authors: Kazi Rizvan, Yamin Rekhu

Abstract:

Due to high competitiveness in industries throughout the world, every industry is trying hard to utilize all of its resources to keep its productivity as high as possible. Many tools have been used to ensure a smoother flow of operations, to balance tasks, to maintain proper schedules and sequences for tasks, and to reduce unproductive time. All of these tools are used to augment productivity within an industry. The Kanban board is one of them and one of the many important tools of the lean production system. A Kanban board is a visual depiction of the actual status of tasks; it conveys the progress and issues of tasks as well. Using a Kanban board, tasks can be distributed among workers, and operation targets can be represented to them visually. In this paper, an example of a Kanban board from Brothers Furniture Limited is examined: how the Kanban board system was implemented, how the board was designed, and how it was made easily perceivable for less literate or illiterate workers. The Kanban board was designed for the packing section of Brothers Furniture Limited. It was implemented for the purpose of presenting the task flow to the workers and mitigating the time wasted while workers wondered which task to start after finishing one. The Kanban board comprised seven columns, including a column for comments to record any problem that occurred while working on the tasks. The Kanban board was helpful for the workers as it showed the urgency of the tasks. It was also helpful for the store section, as they could understand which products, and how many of them, could be delivered to the store at any given time. The Kanban board had all the information centralized, which is why the workflow was paced up and idle time was minimized. Even though many workers were illiterate or less literate, the Kanban board was still intelligible to them because the Kanban cards were colored. Since the significance of colors is conveniently interpretable, the colored cards helped a great deal in that matter, and the illiterate or less literate workers did not have to spend time wondering about the meaning of the cards. Even when the workers were not told the significance of the colored cards, they could develop a sense of their meaning, as colors readily prompt anyone to perceive the situation. As a result, the board made clear to the workers what they were required to do, when to do it, and what to do next. The Kanban board alleviated excessive time between tasks by setting a day plan for the targeted tasks, and it also reduced time during tasks, as the workers were aware of the forthcoming tasks for the day. Being very specific to the tasks, the Kanban board helped the workers become more focused on their tasks and do their jobs with more precision. As a result, the Kanban board helped achieve an 8.75% increase in productivity compared to the productivity before it was implemented.
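
A minimal sketch of the board concept described above is given below: a seven-column board whose cards carry urgency colours that workers can read without text. Column names and colour codes are assumptions, not the exact layout used at Brothers Furniture Limited.

```python
from dataclasses import dataclass, field

# Minimal sketch of the idea described above: a seven-column board where each task
# card carries a colour that signals urgency. Column names and colours are
# assumptions, not the exact layout used at Brothers Furniture Limited.
COLUMNS = ["Backlog", "Planned", "Cutting", "Assembly", "Packing", "Done", "Comments"]
URGENCY_COLOURS = {"high": "red", "normal": "yellow", "low": "green"}

@dataclass
class Card:
    task: str
    urgency: str = "normal"
    comment: str = ""

@dataclass
class KanbanBoard:
    columns: dict = field(default_factory=lambda: {name: [] for name in COLUMNS})

    def add(self, column, card):
        self.columns[column].append(card)

    def move(self, task, src, dst):
        card = next(c for c in self.columns[src] if c.task == task)
        self.columns[src].remove(card)
        self.columns[dst].append(card)

board = KanbanBoard()
board.add("Planned", Card("Pack order #1042", urgency="high"))
board.move("Pack order #1042", "Planned", "Packing")
print({col: [(c.task, URGENCY_COLOURS[c.urgency]) for c in cards]
       for col, cards in board.columns.items() if cards})
```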

Keywords: color, Kanban Board, Lean Tool, literacy, packing, productivity

Procedia PDF Downloads 233
77 Adopting the Community Health Workers Master List Registry for Community Health Workforce in Kenya

Authors: Gikunda Aloise, Mjema Saida, Barasa Herbert, Wanyungu John, Kimani Maureen

Abstract:

Background: The Community Health Workforce (CHW) consists of health care providers at the community level (Level 1) who serve as a bridge between the community and the formal healthcare system. This human resource has enormous potential to extend healthcare services and ensure that vulnerable, marginalized, and hard-to-reach populations have access to quality healthcare services at the community and primary health facility levels. However, these cadres are neither recognized, remunerated, nor, in most instances, registered in a master list. Management and supervision of CHWs is not easy if their individual demographics, training, and incentives are not well documented through a centralized registry. Description: In February 2022, Amref supported the Kenya Ministry of Health in developing a community health workforce database called the Community Health Workers Master List Registry (CHWML), which is hosted in the Kenya Health Information System (KHIS) tracker. The CHW registration exercise was carried out through a sensitization meeting conducted by the County Community Health Focal Person for the Sub-County Community Health Focal Persons and Community Health Assistants, who uploaded information on individual demographics, training undertaken, and incentives received by CHVs. Care was taken to ensure compliance with Kenyan laws on the availability and use of personal data as prescribed by the Data Protection Act, 2019 (DPA). Results and lessons learnt: By June 2022, 80,825 CHWs had been registered in the system: 78,174 (96%) CHVs and 2,636 (4%) CHAs. 25,235 (31%) are male, 55,505 (68%) are female, and 85 (1%) are transgender. 39,979 (49%) had secondary education, and 2,500 (3%) had no formal education. Only 27,641 (34%) received a monthly stipend. 68,436 CHVs (85%) had undergone basic training. However, there is a need to validate the data to align with the current situation in the counties. Conclusions/Next steps: The use of the CHWML will unlock opportunities for building more resilient and sustainable health systems and inform financial planning, resource allocation, capacity development, and quality service delivery. The MOH will update the CHWML guidelines in adherence to the Data Protection Act, which will inform standard procedures for maintaining and updating the registry and for integrating the Community Health Workforce registry with the HRH system.

Keywords: community health registry, community health volunteers (CHVs), community health workers master list (CHWML), data protection act

Procedia PDF Downloads 140
76 Smart Automated Furrow Irrigation: A Preliminary Evaluation

Authors: Jasim Uddin, Rod Smith, Malcolm Gillies

Abstract:

Surface irrigation is the most popular irrigation method all over the world. However, two issues concern irrigators, particularly given the increasing scarcity of resources in recent years: low efficiency and heavy labour involvement. To address these issues, a smart automated furrow irrigation system that can be operated using digital devices such as a smartphone, iPad, or computer was conceptualised, and a preliminary evaluation was conducted in this study. The smart automated system is an integration of commercially available software and hardware. It includes real-time surface irrigation optimisation software (SISCO) and Rubicon Water’s surface irrigation automation hardware and software. The automated system consists of an automatic water delivery system with 300 mm flexible pipes attached to both sides of a remotely controlled valve to operate the irrigation; a water level sensor to obtain the real-time inflow rate from the measured head in the channel; advance sensors to measure the advance time to particular points of the irrigated field; and a solar-powered telemetry system, including a base station, to connect all the field sensors to the main server. On the basis of the field data, the software (SISCO) optimises the ongoing irrigation, determines the optimum cut-off for the particular irrigation, and sends this information to the control valve to stop the irrigation at the particular (cut-off) time. The preliminary evaluation shows that the automated surface irrigation worked reasonably well without manual intervention. The evaluation of farmer-managed irrigation events shows the potential to save a significant amount of water and labour. Substantial economic and social benefits are expected in rural industries from adopting this system. The future outcome of this work would be a fully tested commercial adaptive real-time furrow irrigation system able to compete with the pressurised alternatives of centre pivot or lateral move machines on capital cost and water and labour savings, but without the massive energy costs.
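
For illustration, the sketch below shows one simplified way a cut-off decision could be derived from advance-sensor data: fitting a power-law advance curve and extrapolating the time for the water front to reach the end of the furrow. This is not the SISCO algorithm; all distances, times, and the furrow length are invented.

```python
import numpy as np

# Illustrative simplification only (not the SISCO algorithm): fit a power-law
# advance curve x = p * t^r to the advance-sensor readings and extrapolate the
# time at which the water front will reach the end of the furrow, which could
# then inform the cut-off decision. All numbers are hypothetical.
sensor_distance = np.array([50.0, 100.0, 150.0, 200.0])   # m from the inlet
arrival_time    = np.array([18.0, 42.0, 70.0, 103.0])     # min after start

# Linear regression in log space: log x = log p + r * log t
r, log_p = np.polyfit(np.log(arrival_time), np.log(sensor_distance), 1)
p = np.exp(log_p)

furrow_length = 300.0                                      # m (assumed)
t_end = (furrow_length / p) ** (1.0 / r)                   # invert x = p * t^r
print(f"advance exponent r = {r:.2f}, predicted time to reach field end = {t_end:.0f} min")
```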

Keywords: furrow irrigation, smart automation, infiltration, SISCO, real-time irrigation, adaptive control

Procedia PDF Downloads 452
75 Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we rely heavily on the machine learning and natural language processing capabilities of AI and on energy-efficient hardware and software devices in almost every industry sector. In these industry sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and for slowing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, which are also informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is more focus on, and growing demand for, renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability (3 P's). Through this paper, we would like to bring an overall perspective on enterprise strategies, with a primary focus on bringing about the cultural shifts needed to adopt energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.

Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure

Procedia PDF Downloads 111
74 Secure Data Sharing of Electronic Health Records With Blockchain

Authors: Kenneth Harper

Abstract:

The secure sharing of Electronic Health Records (EHRs) is a critical challenge in modern healthcare, demanding solutions to enhance interoperability, privacy, and data integrity. Traditional standards like Health Information Exchange (HIE) and HL7 have made significant strides in facilitating data exchange between healthcare entities. However, these approaches rely on centralized architectures that are often vulnerable to data breaches, lack sufficient privacy measures, and have scalability issues. This paper proposes a framework for secure, decentralized sharing of EHRs using blockchain technology, cryptographic tokens, and Non-Fungible Tokens (NFTs). The blockchain's immutable ledger, decentralized control, and inherent security mechanisms are leveraged to improve transparency, accountability, and auditability in healthcare data exchanges. Furthermore, we introduce the concept of tokenizing patient data through NFTs, creating unique digital identifiers for each record, which allows for granular data access controls and proof of data ownership. These NFTs can also be employed to grant access to authorized parties, establishing a secure and transparent data sharing model that empowers both healthcare providers and patients. The proposed approach addresses common privacy concerns by employing privacy-preserving techniques such as zero-knowledge proofs (ZKPs) and homomorphic encryption to ensure that sensitive patient information can be shared without exposing the actual content of the data. This ensures compliance with regulations like HIPAA and GDPR. Additionally, the integration of Fast Healthcare Interoperability Resources (FHIR) with blockchain technology allows for enhanced interoperability, enabling healthcare organizations to exchange data seamlessly and securely across various systems while maintaining data governance and regulatory compliance. Through real-world case studies and simulations, this paper demonstrates how blockchain-based EHR sharing can reduce operational costs, improve patient outcomes, and enhance the security and privacy of healthcare data. This decentralized framework holds great potential for revolutionizing healthcare information exchange, providing a transparent, scalable, and secure method for managing patient data in a highly regulated environment.
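
As a conceptual illustration of tokenizing records and recording access grants, the toy sketch below derives a unique content-based token for each record and appends ownership and grant events to a simple in-memory ledger. It stands in for the blockchain/NFT machinery discussed in the abstract and is not the proposed framework itself; all identifiers are hypothetical.

```python
import hashlib
import json
import time

# Conceptual sketch only: representing each EHR record by a unique content-derived
# token and tracking ownership and access grants on an append-only ledger. This is
# a toy in-memory illustration of the idea, not an actual blockchain or NFT standard.
ledger = []   # append-only list of events (a stand-in for blockchain transactions)

def tokenize_record(record: dict, owner: str) -> str:
    """Derive a unique token id from the record content and register ownership."""
    token_id = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    ledger.append({"event": "mint", "token": token_id, "owner": owner, "ts": time.time()})
    return token_id

def grant_access(token_id: str, owner: str, grantee: str) -> None:
    ledger.append({"event": "grant", "token": token_id,
                   "by": owner, "to": grantee, "ts": time.time()})

def has_access(token_id: str, party: str) -> bool:
    return any(e for e in ledger
               if e["token"] == token_id and
               (e.get("owner") == party or e.get("to") == party))

token = tokenize_record({"patient": "P-001", "obs": "HbA1c 6.1%"}, owner="patient-P-001")
grant_access(token, owner="patient-P-001", grantee="clinic-A")
print(has_access(token, "clinic-A"), has_access(token, "insurer-B"))   # True False
```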

Keywords: blockchain, electronic health records (EHRs), fast healthcare interoperability resources (FHIR), health information exchange (HIE), HL7, interoperability, non-fungible tokens (NFTs), privacy-preserving techniques, tokens, secure data sharing

Procedia PDF Downloads 22
73 Work-Life Balance: A Landscape Mapping of Two Decades of Scholarly Research

Authors: Gertrude I Hewapathirana, Mohamed M. Moustafa, Michel G. Zaitouni

Abstract:

The purposes of this research are: (a) to provide an epistemological and ontological understanding of work-life balance (WLB) theory, practice, and research, illuminating how WLB evolved between 2000 and 2020, and (b) to analyze peer-reviewed research to identify the gaps, hotspots, underlying dynamics, theoretical and thematic trends, influential authors, research collaborations, geographic networks, and the multidisciplinary nature of WLB theory to guide future researchers. The research used a four-step bibliometric network analysis to explore five research questions. Using keywords such as WLB and associated variants, 1,190 peer-reviewed articles were extracted from the Scopus database and transformed to a plain text format for filtering. The analysis was conducted using R version 4.1 (R Development Core Team, 2021) and several libraries such as bibliometrix, wordcloud, and ggplot2. We used the VOSviewer software (van Eck & Waltman, 2019) for network visualization. WLB theory has grown into a multifaceted, multidisciplinary field of research. There is a paucity of research between 2000 and 2005 and exponential growth from 2006 to 2015. The rapid increase of WLB research in the USA, UK, and Australia reflects the increasing workplace stresses due to hyper-competitive workplaces, inflexible work systems, and increasing diversity, as well as the emergence of WLB support mechanisms and legal and constitutional mandates to enhance employee and family wellbeing at multiple levels of social systems. A severe knowledge gap exists due to inadequate publications disseminating the "core" WLB research. "Locally-centralized-globally-discrete" collaboration among researchers indicates a "North-South" divide between developed and developing nations. A shortage of WLB research in developing nations and a lack of research collaboration hinder a global understanding of WLB as a universal phenomenon. Policymakers and practitioners can use the findings to initiate supporting policies and innovative work systems. The boundary expansion of WLB concepts, categories, relations, and properties would enable researchers and theoreticians to test a variety of new dimensions. This is the most comprehensive WLB landscape analysis to date; it reveals emerging trends, concepts, networks, underlying dynamics, gaps, and growing theoretical and disciplinary boundaries, and it portrays WLB as a universal theory.
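
A small sketch of one step of such a bibliometric workflow is shown below: counting keyword co-occurrences across articles, from which a co-occurrence network can be built. The example records are invented placeholders, not the Scopus dataset used in the study (which was analysed in R with bibliometrix and VOSviewer).

```python
from collections import Counter
from itertools import combinations

# Minimal sketch of one bibliometric step: building keyword co-occurrence counts
# from article keyword lists. The records below are invented placeholders.
articles = [
    ["work-life balance", "wellbeing", "flexible work"],
    ["work-life balance", "gender", "wellbeing"],
    ["work-life balance", "flexible work", "telework"],
]

cooccurrence = Counter()
for keywords in articles:
    for a, b in combinations(sorted(set(keywords)), 2):
        cooccurrence[(a, b)] += 1

# Edges with the highest weights form the densest part of the keyword network
print(cooccurrence.most_common(3))
```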

Keywords: work-life balance, co-citation networks, keyword co-occurrence network, bibliometric analysis

Procedia PDF Downloads 196
72 Thinking about the Loss of Social Networking Sites May Expand the Distress of Social Exclusion

Authors: Wen-Bin Chiou, Hsiao-Chiao Weng

Abstract:

Social networking sites (SNS) such as Facebook and Twitter are low-cost tools that can promote the creation of social connections by providing a convenient platform that can be accessed at any time. In the current research, a laboratory experiment was conducted to test the hypothesis that reminders of losing SNS would alter the impact of social events, especially those involving social exclusion. Specifically, this study explored whether losing SNS would intensify the perceived social distress induced by exclusionary bogus feedback. Eighty-eight Facebook users (46 females, 42 males; mean age = 22.6 years, SD = 3.1 years) were recruited via campus posters and flyers at a national university in southern Taiwan. After participants provided consent, they were randomly assigned to one of two conditions (SNS non-use vs. neutral) in a between-subjects experiment. Participants completed an ostensible survey about online social networking in which we included an item about the time spent on SNS per day. The last question was used to manipulate thoughts about losing SNS access. Participants in the non-use condition were asked to record three conditions that would render them unable to use SNS (e.g., a network adaptor problem, a malfunctioning cable modem, or problems with Internet service providers); participants in the neutral condition recorded three conditions that would render them unable to log onto the college website (e.g., server maintenance, local network or firewall problems). Later, the experiment employed a bogus-feedback paradigm to induce social exclusion. Participants then rated their social distress on a four-item scale, identical to that of Experiment 1 (α = .84). The results showed that thoughts of losing SNS intensified the distress caused by social exclusion, suggesting that the loss of SNS has an effect similar to the loss of a primary source of social reconnection. Moreover, the priming effects of SNS on perceived distress were more prominent for heavy users. The demonstrated link between the idea of losing SNS use and the increased pain of social exclusion underscores the importance of SNS as a crucial gateway for acquiring and rebuilding social connections. The use of online social networking appears to be a two-edged sword for coping with social exclusion in the e-society.

Keywords: online social networking, perceived distress, social exclusion, SNS

Procedia PDF Downloads 420
71 The Psychometric Properties of an Instrument to Estimate Performance in Ball Tasks Objectively

Authors: Kougioumtzis Konstantin, Rylander Pär, Karlsteen Magnus

Abstract:

Ball skills, as a subset of fundamental motor skills, are predictors of performance in sports. Currently, most tools evaluate ball skills utilizing subjective ratings. The aim of this study was to examine the psychometric properties of a newly developed instrument to objectively measure ball-handling skills (the BHS-test) utilizing digital instrumentation. Participants were a convenience sample of 213 adolescents (age M = 17.1 years, SD = 3.6; 55% females, 45% males) recruited from upper secondary schools and invited to a sports hall for the assessment. The 8-item instrument incorporated both accuracy-based ball skill tests and repetitive-performance tests with a ball. Testers counted performance manually in four of the tests (one throwing and three juggling tasks). Furthermore, the assessment was technologically enhanced in the other four tests (one balancing and three rolling tasks) utilizing a ball machine, a Kinect camera, and balls with motion sensors. 3D printing technology was used to construct equipment, while all results were administered digitally with smartphones/tablets, computers, and a specially constructed application that sends data to a server. The instrument was deemed reliable (α = .77), and principal component analysis was used in a random subset (53 of the participants). Furthermore, latent variable modeling was employed to confirm the structure with the remaining subset (160 of the participants). The analysis showed good factorial validity, with one factor explaining 57.90% of the total variance. Four loadings were larger than .80, two more exceeded .76, and the other two were .65 and .49. The one-factor solution was confirmed by a first-order model with one general factor and an excellent fit between model and data (χ² = 16.12, DF = 20; RMSEA = .00, CI90 .00–.05; CFI = 1.00; SRMR = .02). The loadings on the general factor ranged between .65 and .83. Our findings indicate good reliability and construct validity for the BHS-test. To develop the instrument further, more studies are needed with various age groups, e.g., children. We suggest using the BHS-test for diagnostic or assessment purposes in talent development and sports participation interventions that focus on ball games.
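
For illustration, the sketch below reproduces the two reported analyses on simulated item scores: Cronbach's alpha for the eight items and the share of variance explained by the first principal component. The simulated data and effect sizes are assumptions, not the actual BHS-test scores.

```python
import numpy as np

# Illustrative sketch of the two reported analyses on simulated data (not the real
# BHS-test scores): Cronbach's alpha for the 8 items and the variance explained by
# the first principal component.
rng = np.random.default_rng(0)
ability = rng.normal(size=(213, 1))                      # latent ball-handling ability
scores = ability + 0.8 * rng.normal(size=(213, 8))       # 8 noisy item scores

def cronbach_alpha(items):
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

corr = np.corrcoef(scores, rowvar=False)                 # PCA on the correlation matrix
eigvals = np.linalg.eigvalsh(corr)[::-1]
print(f"alpha = {cronbach_alpha(scores):.2f}, "
      f"first component explains {100 * eigvals[0] / eigvals.sum():.1f}% of variance")
```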

Keywords: ball-handling skills, ball-handling ability, technologically-enhanced measurements, assessment

Procedia PDF Downloads 94
70 Pareto Optimal Material Allocation Mechanism

Authors: Peter Egri, Tamas Kis

Abstract:

Scheduling problems have been studied by algorithmic mechanism design research from the beginning. This paper focuses on a practically important, but theoretically rather neglected, field: the project scheduling problem where jobs connected by precedence constraints compete for various nonrenewable resources, such as materials. Although the centralized problem can be solved in polynomial time by applying the algorithm of Carlier and Rinnooy Kan from the Eighties, obtaining materials in a decentralized environment is usually far from optimal. It can be observed in practical production scheduling situations that project managers tend to cache the required materials as soon as possible in order to avoid later delays due to material shortages. This greedy practice usually leads both to excess stocks for some projects and materials and, simultaneously, to shortages for others. The aim of this study is to develop a model for the material allocation problem of a production plant, where a central decision maker—the inventory—should assign the resources arriving at different points in time to the jobs. Since the actual due dates are not known by the inventory, the mechanism design approach is applied, with the projects as the self-interested agents. The goal of the mechanism is to elicit the required information and allocate the available materials such that the maximal tardiness among the projects is minimized. It is assumed that, except for the due dates, the inventory is familiar with every other parameter of the problem. A further requirement is that, due to practical considerations, monetary transfer is not allowed. Therefore, a mechanism without money is sought, which excludes some widely applied solutions such as the Vickrey–Clarke–Groves scheme. In this work, a type of Serial Dictatorship Mechanism (SDM) is presented for the studied problem, including a polynomial-time algorithm for computing the material allocation. The resulting mechanism is both truthful and Pareto optimal. Thus, randomization over the possible priority orderings of the projects results in a universally truthful and Pareto optimal randomized mechanism. However, it is shown that, in contrast to problems like the many-to-many matching market, not every Pareto optimal solution can be generated with an SDM. In addition, no performance guarantee can be given compared to the optimal solution; therefore, this approximation characteristic is investigated with an experimental study. All in all, the current work studies a practically relevant scheduling problem and presents a novel truthful material allocation mechanism which eliminates the potential benefit of the greedy behavior that negatively influences the outcome. The resulting allocation is also shown to be Pareto optimal, which is the most widely used criterion describing a necessary condition for a reasonable solution.
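
The following toy sketch illustrates the serial dictatorship idea in this setting: projects are ordered by a random priority, and each project in turn takes the earliest-arriving material units still unallocated. Quantities, arrival times, and demands are invented, and the sketch omits the precedence and tardiness machinery of the actual mechanism.

```python
import random

# Toy sketch of the serial dictatorship idea (not the paper's exact algorithm):
# projects are ordered by a random priority, and each project in turn is assigned
# the earliest-arriving units of material that are still unallocated.
material_arrivals = [3, 5, 8, 10, 14]            # arrival days of 5 identical units (assumed)
demands = {"project_A": 2, "project_B": 1, "project_C": 2}

def serial_dictatorship(demands, arrivals, seed=42):
    order = list(demands)
    random.Random(seed).shuffle(order)           # random priority ordering
    remaining = sorted(arrivals)
    allocation = {}
    for project in order:                        # each "dictator" takes the best units left
        take = demands[project]
        allocation[project] = remaining[:take]
        remaining = remaining[take:]
    return order, allocation

order, allocation = serial_dictatorship(demands, material_arrivals)
print("priority order:", order)
print("allocated arrival days:", allocation)
```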

Keywords: material allocation, mechanism without money, polynomial-time mechanism, project scheduling

Procedia PDF Downloads 333
69 Real-Time Radiological Monitoring of the Atmosphere Using an Autonomous Aerosol Sampler

Authors: Miroslav Hyza, Petr Rulik, Vojtech Bednar, Jan Sury

Abstract:

An early and reliable detection of an increased radioactivity level in the atmosphere is one of the key aspects of atmospheric radiological monitoring. Although the standard laboratory procedures provide detection limits as low as a few µBq/m³, their major drawback is the delayed result reporting: typically a few days. Addressing this issue is the main objective of the HAMRAD project, which gave rise to a prototype of an autonomous monitoring device. It is based on the idea of sequential aerosol sampling using a carrousel sample changer combined with a gamma-ray spectrometer. In our hardware configuration, the air is drawn through a filter positioned on the carrousel so that it can be rotated into the measuring position after a preset sampling interval. Filter analysis is performed via a 50% HPGe detector inside 8.5 cm lead shielding. The spectrometer output signal is then analyzed using DSP electronics and the Gamwin software with preset nuclide libraries and other analysis parameters. After the counting, the filter is placed into a storage bin with a capacity of 250 filters so that the device can run autonomously for several months, depending on the preset sampling frequency. The device is connected to a central server via GPRS/GSM, where the user can view monitoring data, including raw spectra and technological data describing the state of the device. All operating parameters can be remotely adjusted through a simple GUI. The flow rate is continuously adjustable up to 10 m³/h. The main challenge in spectrum analysis is the natural background subtraction. As detection limits are heavily influenced by the deposited activity of radon decay products and the measurement time is fixed, there must exist an optimal sample decay time (delayed spectrum acquisition). To solve this problem, we adopted a simple procedure based on sequential spectrum acquisition and an optimal partial spectral sum with respect to the detection limits for a particular radionuclide. The prototyped device proved able to detect atmospheric contamination at the level of mBq/m³ for an 8-hour sampling period.
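
The sketch below illustrates, in simplified form, the idea of an optimal partial spectral sum: given counts in a region of interest from consecutive acquisitions in which the radon-daughter background decays, it picks the starting interval whose summed spectrum yields the lowest Currie-style detection limit. The counts and decay constant are synthetic, and the formula is a simplification of the actual analysis.

```python
import numpy as np

# Simplified illustration of an optimal partial spectral sum (not the HAMRAD
# analysis code): counts in a region of interest from consecutive hourly
# acquisitions, where the radon-daughter background decays away. Find the starting
# interval whose summed spectrum gives the lowest approximate detection limit.
hours = np.arange(12)
background_roi = 400.0 * np.exp(-hours / 3.0) + 5.0       # synthetic counts per hour in the ROI

best = None
for start in range(len(hours)):
    live_time = len(hours) - start                         # hours actually summed
    b = background_roi[start:].sum()                       # summed background counts
    # Currie-style approximation: detection limit scales with sqrt(B) per unit time
    ld_rate = (2.71 + 4.65 * np.sqrt(b)) / live_time
    if best is None or ld_rate < best[1]:
        best = (start, ld_rate)

print(f"optimal delay before summation: {best[0]} h (relative LD = {best[1]:.1f} counts/h)")
```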

Keywords: aerosols, atmosphere, atmospheric radioactivity monitoring, autonomous sampler

Procedia PDF Downloads 150
68 Water Crisis or Crisis of Water Management: Assessing Water Governance in Iran

Authors: Sedigheh Kalantari

Abstract:

Like many countries in the arid and semi-arid belt, Iran experiences a natural limitation in the availability of water resources. However, rapid socioeconomic development has created a serious water crisis in a nation that was once one of the world’s pioneers in sustainable water management, owing to the Persians’ contribution to hydraulic engineering inventions – the Qanat – throughout history. Exogenous issues like the changing climate, frequent droughts, and international sanctions are only crisis catalyzers, not the main cause of the water crisis, and a resilient water management system would be expected to cope with these periodic external pressures. The current dramatic water security issues in Iran are rooted in managerial, political, and institutional challenges rather than engineering and technical issues, and the country is suffering from challenges in water governance. Instead of rigorous water conservation efforts, the country is still focused on a supply-driven approach, technology, centralized methods, and structural solutions that aim to increase water supply, while the potential of water governance and management has often been left untapped. To solve these issues, it is necessary to assess the present situation and its evolution over time. In this respect, establishing water governance assessment mechanisms is a significant aspect of this paper. The research framework is a conceptual framework to assess the governance performance of Iran, to critically diagnose problematic issues and areas, as well as to proffer empirically based solutions and determine the best possible steps towards transformational processes. This framework aims to measure the adequacy of the current solutions and strategies designed to ameliorate these problems and then to develop and prescribe adequate future solutions. Thus, the analytical framework developed in this paper seeks to provide insights into the key factors influencing water governance in Iranian cities, the institutional frameworks to manage water across scales and authorities, and multi-level management gaps and policy responses, through an evidence-based approach and good practices to drive reform toward sustainability and water resource conservation. The findings of this paper show that the current structure of the water governance system in Iran, coupled with the lack of a comprehensive understanding of the root causes of the problem, leaves minimal hope for developing sustainable solutions to Iran’s deepening water crisis. In order to follow sustainable development approaches, Iran needs to replace symptom management with problem prevention.

Keywords: governance, Iran, sustainable development, water management, water resources

Procedia PDF Downloads 26
67 Contribution of Artificial Intelligence in the Studies of Natural Compounds Against SARS-COV-2

Authors: Salah Belaidi

Abstract:

We have carried out extensive and in-depth research to search for bioactive compounds based on Algerian plants. A selection of 50 ligands from Algerian medicinal plants was made. Several compounds used in herbal medicine were drawn using the Marvin Sketch software. We determined the three-dimensional structures of the ligands with the MMFF94 force field in order to prepare these ligands for molecular docking. The 3D protein structure of the SARS-CoV-2 main protease was taken from the Protein Data Bank. We used the AutoDock Vina software to apply molecular docking. The hydrogen atoms were added during the molecular docking process, and all the rotatable bonds of the ligands were defined using the (ligand) module in the AutoDock software. The COVID-19 main protease (Mpro) is a key enzyme that plays a vital role in mediating viral transcription and replication, so it is a very attractive drug target for SARS-CoV-2. In this work, the biologically active compounds present in these selected medicinal plants were evaluated as potential inhibitors of the COVID-19 protease enzyme, with in-depth computational molecular docking calculations using the AutoDock Vina software. The top 7 ligands, Phloroglucinol, Afzelin, Myricetin-3-O-rutinoside, Tricin 7-neohesperidoside, Silybin, Silychristin, and Kaempferol, were selected among the 50 molecules studied from Algerian medicinal plants; the selection is based on the best binding energies, which are low relative to the reference molecule, with binding affinities of -9.3, -9.3, -9.0, -8.9, -8.5, -8.3 and -8.3 kcal mol-1, respectively. Then, we analyzed the ADME properties of the best 7 ligands using the SwissADME web server. Two ligands (Silybin, Silychristin) were found to be potential candidates for the discovery and design of novel drug inhibitors of the SARS-CoV-2 protease enzyme. The stability of the two ligands in complex with the Mpro protease was validated by molecular dynamics simulation; they showed stable trajectories in both the RMSD and RMSF analyses and coherent molecular interactions throughout the molecular dynamics simulations. Finally, we conclude that the Silybin ligand forms a more stable complex with the Mpro protease compared to the Silychristin ligand.
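
A small sketch of the post-docking ranking step is given below: ligands are sorted by their Vina binding affinities and shortlisted against a reference value. The affinities are those quoted in the abstract; the reference affinity used here is a hypothetical placeholder.

```python
# Sketch of the post-processing step: ranking ligands by AutoDock Vina binding
# affinities and shortlisting those at least as favourable as a reference molecule.
# The affinities are taken from the abstract; the reference value is an assumed placeholder.
affinities = {                      # kcal/mol, more negative = stronger predicted binding
    "Phloroglucinol": -9.3, "Afzelin": -9.3, "Myricetin-3-O-rutinoside": -9.0,
    "Tricin 7-neohesperidoside": -8.9, "Silybin": -8.5,
    "Silychristin": -8.3, "Kaempferol": -8.3,
}
reference_affinity = -7.5           # hypothetical reference ligand score

shortlist = sorted((name for name, e in affinities.items() if e <= reference_affinity),
                   key=lambda name: affinities[name])
for name in shortlist:
    print(f"{name}: {affinities[name]:.1f} kcal/mol")
```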

Keywords: COVID-19, medicinal plants, molecular docking, ADME properties, molecular dynamics

Procedia PDF Downloads 36
66 Pervasive Computing: Model to Increase Arable Crop Yield through Detection Intrusion System (IDS)

Authors: Idowu Olugbenga Adewumi, Foluke Iyabo Oluwatoyinbo

Abstract:

Presently, there are several discussions on food security and increasing the yield of arable crops throughout the world. This article briefly presents research efforts to create digital interfaces to nature, in particular in the area of crop production in agriculture, with an interest in using pervasive computing to increase yield. The approach goes beyond the use of sensor networks for environmental monitoring by also emphasizing the development of a system architecture that detects intruders (the intrusion process), which reduce the farmer's yield at the end of the planting/harvesting period. The objective of the work is to set out a model for setting up a handheld or portable device for increasing the quality and quantity of arable crops. This process incorporates the use of an infrared motion image sensor with a security alarm system that can send a noise signal to intruders on the farm. This model of a portable image-sensing device for monitoring or scaring off humans, rodents, birds, and even pests will reduce post-harvest loss, which will increase the yield on the farm. Nano-intelligence technology is proposed to combat and minimize the intrusion process that usually leads to low quality and quantity of produce from the farm. An intranet system will be in place with a wireless radio network (WLAN), a router, a server, and a client computer system or handheld device, e.g., PDAs or mobile phones. This approach enables the development of hybrid systems that will be effective as a security measure on the farm. Precision agriculture has developed with the computerization of agricultural production systems and the networking of computerized control systems. In the intelligent plant production systems of controlled greenhouses, information on plant responses, measured by sensors, is used to optimize the system. Further work must be carried out on modeling using a pervasive computing environment to solve problems of agriculture, as the use of electronics in agriculture will attract more youth involvement in the industry.
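
As a conceptual illustration of the detection step, the sketch below compares consecutive (synthetic) infrared frames and flags motion when the changed area exceeds a threshold; the threshold and frame data are assumptions, not the proposed device's firmware.

```python
import numpy as np

# Conceptual sketch of the detection step (not the proposed device's firmware):
# compare consecutive infrared frames and raise an alarm when the changed area
# exceeds a threshold. The frames here are synthetic arrays standing in for
# sensor output; the thresholds are assumptions.
def motion_detected(prev_frame, frame, pixel_delta=25, area_fraction=0.02):
    changed = np.abs(frame.astype(int) - prev_frame.astype(int)) > pixel_delta
    return changed.mean() > area_fraction

rng = np.random.default_rng(1)
background = rng.integers(0, 30, size=(64, 64), dtype=np.uint8)
intruder_frame = background.copy()
intruder_frame[20:40, 20:35] = 200        # a warm body entering the field of view

print("quiet scene:", motion_detected(background, background))          # False
print("intruder:   ", motion_detected(background, intruder_frame))      # True
```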

Keywords: pervasive computing, intrusion detection, precision agriculture, security, arable crop

Procedia PDF Downloads 403
65 Activation-TV® to Reduce Elderly Loneliness and Insecurity

Authors: Hannele Laaksonen, Seija Nyqvist, Kari Nurmes

Abstract:

Objectives: In 2011, the City of Vaasa started to develop know-how in technology and the introduction of services for aging people in cooperation with the Polytechnic Novia University of Applied Sciences and VAMK, University of Applied Sciences. The project's targets included: helping elderly people to maintain their ability to function, providing them with social and physical activities, preventing their social exclusion, decreasing their feelings of loneliness and insecurity, and developing their technical know-how. Methods: The project was built on open source code as a tailor-made service system and user interface for the elderly living at home and their families, based on the users' expectations and experiences of services. The Activation-TV® project was carried out from 1.4.2011 to 31.3.2014. A pilot group of eight elderly persons who were living at home was selected for the project. All necessary technical means as well as guidance and teaching equipment were provided to the pilot group. The students of the universities of applied sciences (VAMK, Novia) and employees of the Center of Ageing made all the programs for the Activation-TV®. The project group was interviewed before and after the intervention. The data were evaluated both qualitatively and quantitatively. Results: The built service includes a video library, a group room for interactive programs, and a personal room for bilateral meetings and live broadcasts. The program is bilingual and produced in both national languages. The Activation-TV® reduced the elderly people's (n=8) feelings of emptiness and added to their mental well-being and quality of life through social contacts. Relatives felt that they were able to take part in the older people's everyday life through the Activation-TV®. Discussion: The application was tailored to a model that has not been developed elsewhere in Finland. This model can be copied from one server to another and thus transferred to other municipalities, but the program requires its own personnel for system management and maintenance as well as cooperation in program production between the different actors. This service can be used for elderly people without dementia who are living at home.

Keywords: mental well-being, quality of life, elderly people, Finland

Procedia PDF Downloads 342
64 Analyzing Transit Network Design versus Urban Dispersion

Authors: Hugo Badia

Abstract:

This research addresses which transit network structure is most suitable to serve specific demand requirements under an increasing urban dispersion process. Two main approaches to network design are found in the literature. On the one hand, a traditional answer, widespread in our cities, develops a high number of lines to connect most origin-destination pairs by direct trips, an approach based on the idea that users are averse to transfers. On the other hand, some authors advocate an alternative design characterized by simple networks in which transfers are essential to complete most trips. To answer which of them is the best option, we use a two-step methodology. First, by means of an analytical model, three basic network structures are compared: a radial scheme, the starting point for the other two structures; a direct-trip-based network; and a transfer-based one, the latter two representing the alternative transit network designs. The model optimizes the network configuration with regard to the total cost for each structure. For a given dispersion scenario, the best alternative is the structure with the minimum cost. The dispersion degree is defined in a simple way by considering that only a central area attracts all trips. If this area is small, we have a highly concentrated mobility pattern; if this area is very large, the city is highly decentralized. In this first step, we can determine the area of applicability of each structure as a function of the urban dispersion degree. The analytical results show that a radial structure is suitable when demand is highly centralized; however, when demand starts to scatter, new transit lines should be implemented to avoid transfers. If urban dispersion advances further, the introduction of more lines is no longer a good alternative; in this case, the best solution is a change of structure, from direct trips to a network based on transfers. The area of applicability of each network strategy is not constant; it depends on the characteristics of demand, city and transport technology. In the second step, we translate the analytical results to a real case study through the relationship between the dispersion parameters of the model and direct measures of dispersion in a real city. Two dimensions of the urban sprawl process are considered: concentration, measured by the Gini coefficient, and centralization, measured by an area-based centralization index. Once the real dispersion degree is estimated, we can identify in which area of applicability the city is located. In summary, from a strategic point of view, this methodology allows us to obtain the best network design approach for a city by comparing the theoretical results with the real dispersion degree.
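As an illustration of the concentration measure mentioned above, a short sketch follows that computes the Gini coefficient for a hypothetical zone-level trip distribution; the zone values are invented for the example and are not taken from the case study:

import numpy as np

def gini(values):
    """Gini coefficient of non-negative zone values (e.g. trips attracted per
    zone): near 0 = evenly spread demand, near 1 = highly concentrated demand."""
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    if n == 0 or x.sum() == 0:
        return 0.0
    # Equivalent to the mean-absolute-difference form: G = sum_i (2i - n - 1) x_i / (n * sum x)
    index = np.arange(1, n + 1)
    return float(np.sum((2 * index - n - 1) * x) / (n * x.sum()))

# Example: trips attracted by five zones of a hypothetical city
print(gini([120, 110, 100, 95, 105]))   # ~0.05 -> dispersed demand
print(gini([900, 20, 30, 25, 25]))      # ~0.71 -> concentrated demand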

Keywords: analytical network design model, network structure, public transport, urban dispersion

Procedia PDF Downloads 230
63 High Performance Computing Enhancement of Agent-Based Economic Models

Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna

Abstract:

This research presents the details of the implementation of a high performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach to studying the economy as a dynamic system of interacting heterogeneous agents and are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to study various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions, like major disasters, changes in policies, or exogenous shocks, on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the economic macro parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. In order to address this, a scalable Distributed Memory Parallel (DMP) implementation of ABEMs has been developed using the Message Passing Interface (MPI). A balanced distribution of computational load among MPI processes (i.e., CPU cores) of computer clusters, while taking all the interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g., credit networks) whereas others are dense with random links (e.g., consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions like the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. Efficient communication among MPI processes is achieved by combining MPI derived data types with the features of the latest MPI functions. Most of the communications are overlapped with computations, thereby significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy. As an example, a single time step of a 1:1 scale model of Austria (i.e., about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is being further enhanced to simulate a 1:1 model of the Euro-zone (i.e., 322 million agents).
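The communication-overlap idea can be illustrated with a minimal non-blocking MPI sketch (here using mpi4py); the ring exchange, buffer sizes and toy update rule are assumptions made for the example and do not reproduce the authors' implementation:

# Overlap the exchange of boundary ("halo") agents with computation on
# interior agents using non-blocking MPI calls.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local_agents = np.random.rand(1_000_000)                 # agents owned by this rank
halo_out = np.ascontiguousarray(local_agents[:1000])     # boundary agents to share
halo_in = np.empty_like(halo_out)

neighbor = (rank + 1) % size                              # simple ring of MPI processes
source = (rank - 1) % size

# Start the exchange, then compute on interior agents while messages are in flight
reqs = [comm.Isend(halo_out, dest=neighbor, tag=0),
        comm.Irecv(halo_in, source=source, tag=0)]
local_agents[1000:] *= 1.01                               # interior update overlaps communication
MPI.Request.Waitall(reqs)
local_agents[:1000] = 0.5 * (local_agents[:1000] + halo_in)  # use the received boundary data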

Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process

Procedia PDF Downloads 128
62 Isolate-Specific Variations among Clinical Isolates of Brucella Identified by Whole-Genome Sequencing, Bioinformatics and Comparative Genomics

Authors: Abu S. Mustafa, Mohammad W. Khan, Faraz Shaheed Khan, Nazima Habibi

Abstract:

Brucellosis is a zoonotic disease of worldwide prevalence. There are at least four species and several strains of Brucella that cause human disease. Brucella genomes have very limited variation across strains, which hinders strain identification using classical molecular techniques, including PCR and 16S rDNA sequencing. The aim of this study was to perform whole-genome sequencing of clinical isolates of Brucella and carry out bioinformatics and comparative genomics analyses to determine whether genetic differences exist across isolates of a single Brucella species and strain. Draft sequence data were generated from 15 clinical isolates of Brucella melitensis (biovar 2, strain 63/9) using the MiSeq next-generation sequencing platform. The generated reads were used for further assembly and analysis. All analyses were performed on a bioinformatics workstation (8-core i7 processor, 8 GB RAM, Bio-Linux operating system). FastQC was used to determine the quality of the reads, and low-quality reads were trimmed or eliminated using fastx_trimmer. Assembly was performed using the Velvet and ABySS software packages. The ordering of assembled contigs was performed with Mauve. The online server RAST was employed to annotate the assembled contigs. Annotated genomes were compared using the Mauve and ACT tools. The QC score of the DNA sequence data generated by MiSeq was higher than 30 for 80% of the reads, with more than 100x coverage, which suggested that the data could be utilized for further analysis. However, when analyzed with FastQC, the quality of four read sets was not good enough for creating a complete draft genome, so the remaining 11 samples were used for further analysis. The comparative genome analyses showed that, despite sharing the same gene sets, single nucleotide polymorphisms and insertions/deletions existed across the different genomes, which provided a variable extent of diversity to these bacteria. In conclusion, next-generation sequencing, bioinformatics, and comparative genome analysis can be utilized to find variations (point mutations, insertions and deletions) across different genomes of Brucella within a single strain. This information could be useful in surveillance and epidemiological studies. This work was supported by Kuwait University Research Sector grants MI04/15 and SRUL02/13.
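The read-quality screening step can be illustrated with a short sketch that computes the fraction of reads in a FASTQ file whose mean Phred score is at least 30; the file name and the simple mean-quality criterion are illustrative assumptions, since the authors used FastQC and fastx_trimmer rather than a custom script:

def mean_phred(quality_line, offset=33):
    # Phred+33 encoding: each character encodes (score + 33)
    scores = [ord(c) - offset for c in quality_line]
    return sum(scores) / len(scores)

def fraction_q30(fastq_path, threshold=30):
    total = passing = 0
    with open(fastq_path) as fh:
        while True:
            header = fh.readline()
            if not header:
                break
            fh.readline()                 # sequence line
            fh.readline()                 # '+' separator line
            qual = fh.readline().strip()  # quality string
            total += 1
            if mean_phred(qual) >= threshold:
                passing += 1
    return passing / total if total else 0.0

print(fraction_q30("isolate_01_R1.fastq"))  # hypothetical file name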

Keywords: brucella, bioinformatics, comparative genomics, whole genome sequencing

Procedia PDF Downloads 383
61 Formal Development of Electronic Identity Card System Using Event-B

Authors: Tomokazu Nagata, Jawid Ahmad Baktash

Abstract:

The goal of this paper is to explore the use of formal methods for an electronic identity card system. Nowadays, one of the core research directions in a constantly growing distributed environment is the improvement of the communication process, and the responsibility for proper verification becomes crucial. Formal methods can play an essential role in the development and testing of systems. This work presents two different methodologies for assessing correctness. Our first approach employs abstract interpretation techniques to create a trace-based model of the electronic identity card system. The model was used to build a semi-decidable procedure for verifying the system model. We also developed the code for the eID system, covering three parts: login to the system and sending of an acknowledgment from the user side, receiving of all information from the server side, and logout from the system. The new concepts of impasse and spawned sessions that we introduced led our research to original statements about the intruder's knowledge and the eID system coding with respect to secrecy. Furthermore, we demonstrated that there is a bound on the number of sessions needed for the analysis of the system. Electronic identity (eID) cards promise to supply a universal, nation-wide mechanism for user authentication. Most European countries have started to deploy eID for government and private sector applications. Are government-issued electronic ID cards the proper way to authenticate users of online services? We use the eID project as a showcase to discuss eID from an application perspective. The new eID card has interesting design features: it is contactless, it aims to protect people's privacy to the extent possible, and it supports cryptographically strong mutual authentication between users and services. Privacy features include support for pseudonymous authentication and per-service controlled access to individual data items. The article discusses key concepts, the eID infrastructure, observed and expected problems, and open questions. The core technology seems ready for prime time, and government projects are deploying it to the masses, but application issues may hamper eID adoption for online applications.
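For illustration, a toy trace model of the three-part session flow (login/acknowledgment, receive information, logout) is sketched below; the event names and the simple state machine are assumptions made for the example and are not the authors' Event-B model:

# Accept only event sequences that follow the expected eID session order;
# any out-of-order event is treated as an impasse and the trace is rejected.
VALID_NEXT = {
    "INIT":      {"login"},
    "LOGGED_IN": {"ack_sent"},
    "ACKED":     {"receive_info"},
    "RECEIVED":  {"logout"},
    "DONE":      set(),
}
TRANSITION = {"login": "LOGGED_IN", "ack_sent": "ACKED",
              "receive_info": "RECEIVED", "logout": "DONE"}

def check_trace(events):
    """Return True if the event sequence is a valid, complete eID session trace."""
    state = "INIT"
    for ev in events:
        if ev not in VALID_NEXT[state]:
            return False      # impasse: event not allowed in this state
        state = TRANSITION[ev]
    return state == "DONE"

print(check_trace(["login", "ack_sent", "receive_info", "logout"]))  # True
print(check_trace(["login", "receive_info", "logout"]))              # False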

Keywords: eID, event-B, Pro-B, formal method, message passing

Procedia PDF Downloads 237
60 Robotic Solution for Nuclear Facility Safety and Monitoring System

Authors: Altab Hossain, Shakerul Islam, Golamur R. Khan, Abu Zafar M. Salahuddin

Abstract:

Effective identification of breakdowns is of prime importance for the safe and reliable operation of Nuclear Power Plants (NPP) and their associated facilities. A great number of monitoring and diagnosis methodologies are applied worldwide in areas such as industry, automobiles, hospitals, and power plants to detect and reduce human disasters. The potential consequences of several hazardous activities involving nuclear facilities may harm society. Hence, one of the most popular and effective methods to ensure safety, monitor the entire nuclear facility and allow risk-free operation without human intervention during a hazardous situation is to use a robot. Therefore, in this study, an advanced autonomous robot has been designed and developed that can monitor several parameters in the NPP to ensure safety and perform risky tasks in case of a nuclear disaster. The robot consists of an autonomous track-following unit and a data processing and transmitting unit; it can follow a straight line and take turns at a bank greater than 90 degrees. The developed robot can analyze various parameters such as temperature, altitude, radiation, obstacles, humidity and distance, detect fire, perform ultrasonic scans and take the heat of any particular object. It can broadcast a live stream and record footage to its own server memory. A separate control unit is constructed with a baseboard, which processes the recorded data, and a transmitter, which transmits the processed data. To make the robot user-friendly, the code is developed in such a way that a user can control any robotic arm according to the type of work. To control the robot anywhere and off the track, an advanced code has been developed for manual override. Through this process, an administrator who has logged-in permission via the Dynamic Host Configuration Protocol (DHCP) can hand over control of the robot. In this way, the robot is provided with maximum security against being hacked. Beyond NPPs, this robot can be used to strengthen the real-time monitoring of any nuclear facility as well as nuclear material transportation and decomposition systems.
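A toy sketch of the data processing and transmitting loop is given below; the sensor stubs, alarm threshold and UDP endpoint are illustrative assumptions rather than the robot's actual design:

# Pack sensor readings with a timestamp, raise an alarm flag when radiation
# exceeds a threshold, and transmit the packet to the separate control unit.
import json, socket, time

RADIATION_ALARM_USV_H = 100.0          # hypothetical alarm threshold (uSv/h)
SERVER_ADDR = ("192.168.0.10", 5005)   # hypothetical control-unit endpoint

def read_sensors():
    # Placeholder for the real sensor drivers (temperature, radiation, etc.)
    return {"temperature_c": 41.2, "humidity_pct": 37.0,
            "radiation_usv_h": 2.4, "distance_cm": 180.0}

def build_packet(readings):
    return {"timestamp": time.time(),
            "readings": readings,
            "alarm": readings["radiation_usv_h"] >= RADIATION_ALARM_USV_H}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = build_packet(read_sensors())
sock.sendto(json.dumps(packet).encode(), SERVER_ADDR)  # transmit to the control unit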

Keywords: nuclear power plant, radiation, dynamic host configuration protocol, nuclear security

Procedia PDF Downloads 209
59 Inertial Motion Capture System for Biomechanical Analysis in Rehabilitation and Sports

Authors: Mario Sandro F. Rocha, Carlos S. Ande, Anderson A. Oliveira, Felipe M. Bersotti, Lucas O. Venzel

Abstract:

Inertial motion capture systems (mocap) are among the most suitable tools for quantitative clinical analysis in rehabilitation and sports medicine. Inertial measurement units (IMUs), composed of accelerometers, gyroscopes, and magnetometers, are able to measure spatial orientations and calculate displacements with sufficient precision for applications in the biomechanical analysis of movement. Furthermore, this type of system is relatively affordable and has the advantages of portability and independence from external references. In this work, we present the latest version of our inertial motion capture system, based on the foregoing technology, with a Unity interface designed for rehabilitation and sports. In our hardware architecture, only one serial port is required. First, the board client must be connected to the computer by a USB cable. Next, an available serial port is configured and opened to establish communication between the client and the application, and then the client starts scanning for active MOCAP_S servers nearby. The servers play the role of the inertial measurement units that capture the movements of the body and send the data to the client, which in turn creates a packet composed of the server ID, the current timestamp, and the motion capture data defined in the client's pre-configuration of the capture session. In the current version, we can measure the game rotation vector (grv) and linear acceleration (lacc), and we also have a step detector that can be enabled or disabled. The grv data are processed and directly linked to the bones of the 3D model and, along with the lacc and step detector data, are also used to calculate displacements and other variables shown on the graphical user interface. The user interface was designed to calculate and present variables that are important for rehabilitation and sports, such as cadence, speed, total gait cycle, gait cycle length, obliquity and rotation, and center of gravity displacement. Our goal is to present a low-cost, portable and wearable system with a friendly interface for applications in biomechanics and sports that also delivers high precision and low energy consumption.
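A minimal sketch of the kind of capture packet described above (server ID, timestamp, grv and lacc) follows; the binary layout is an assumption made for illustration, not the system's actual wire format:

# Pack and unpack one motion capture sample: server ID, timestamp,
# game rotation vector (quaternion), linear acceleration and step flag.
import struct, time

PACKET_FMT = "<B d 4f 3f ?"   # id, timestamp, grv (w,x,y,z), lacc (x,y,z), step

def pack_sample(server_id, grv, lacc, step):
    return struct.pack(PACKET_FMT, server_id, time.time(), *grv, *lacc, step)

def unpack_sample(payload):
    sid, ts, *rest = struct.unpack(PACKET_FMT, payload)
    return {"server_id": sid, "timestamp": ts,
            "grv": tuple(rest[0:4]), "lacc": tuple(rest[4:7]), "step": rest[7]}

raw = pack_sample(3, (1.0, 0.0, 0.0, 0.0), (0.02, -0.01, 9.81), False)
print(unpack_sample(raw))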

Keywords: biomechanics, inertial sensors, motion capture, rehabilitation

Procedia PDF Downloads 140
58 The Sea Striker: The Relevance of Small Assets Using an Integrated Conception with Operational Performance Computations

Authors: Gaëtan Calvar, Christophe Bouvier, Alexis Blasselle

Abstract:

This paper presents the Sea Striker, a compact hydrofoil designed to address some of the issues raised by the recent evolution of naval missions, threats and operation theatres in modern warfare. Able to perform a wide range of operations, the Sea Striker is a 40-meter stealth surface combatant equipped with a gas turbine and aft and forward foils to reach high speeds. The Sea Striker's stealthiness is enabled by the combination of a composite structure, the exterior design, and the advanced integration of sensors. The ship is fitted with a powerful and adaptable combat system, ensuring a versatile and efficient response to modern threats. Lightly manned with a core crew of 10, this hydrofoil is highly automated and can be remotely piloted for special forces operations or transit. Such a ship is not new: it has been used in the past by different navies, for example by the US Navy with the USS Pegasus. Nevertheless, recent evolutions in science and technology on the one hand, and the emergence of new missions, threats and operation theatres on the other, put forward its concept as an answer to today's operational challenges. Indeed, even if multiple opinions and analyses can be given regarding modern warfare and naval surface operations, general observations and tendencies can be drawn, such as the major increase in sensor and weapon types, ranges and, more generally, capabilities; the emergence of new versatile and evolving threats and enemies, such as asymmetric groups, drone swarms or hypersonic missiles; and the growing number of operation theatres located in coastal and shallow waters. This research combined a complete study of the ship with several operational performance computations in order to justify the relevance of using ships like the Sea Striker in naval surface operations. For the selected scenarios, the design process enabled measuring the performance, namely a "Measure of Efficiency" in the NATO framework, for two different kinds of models: a centralized, classic model using large and powerful ships, and a distributed model relying on several Sea Strikers. After this stage, a comparison of the two models was performed. Lethal, agile, stealthy, compact and fitted with a complete set of sensors, the Sea Striker is a new major player in modern warfare and constitutes a very attractive option between the naval unit and the combat helicopter, enabling high operational performance at a reduced cost.

Keywords: surface combatant, compact, hydrofoil, stealth, velocity, lethal

Procedia PDF Downloads 117
57 An Extended Domain-Specific Modeling Language for Marine Observatory Relying on Enterprise Architecture

Authors: Charbel Aoun, Loic Lagadec

Abstract:

A Sensor Network (SN) can be considered as operating in two phases: (1) observation/measurement, i.e., the accumulation of gathered data at each sensor node; and (2) transfer of the collected data to a processing center (e.g., Fusion Servers) within the SN. An underwater sensor network can therefore be defined as a sensor network deployed underwater that monitors underwater activity. The deployed sensors, such as Hydrophones, are responsible for registering underwater activity and transferring it to more advanced components. The process of data exchange between the aforementioned components defines the Marine Observatory (MO) concept, which provides information on ocean state, phenomena and processes. The first step towards the implementation of this concept is defining the environmental constraints and the required tools and components (Marine Cables, Smart Sensors, Data Fusion Server, etc.). The logical and physical components used in these observatories perform critical functions such as the localization of underwater moving objects. These functions can be orchestrated with other services (e.g., military or civilian reaction). In this paper, we present an extension to our MO meta-model that is used to generate a design tool (ArchiMO). We propose new constraints to be taken into consideration at design time. We illustrate our proposal with an example from the MO domain. Additionally, we generate the corresponding simulation code using our self-developed domain-specific model compiler. On the one hand, this illustrates our approach of relying on an Enterprise Architecture (EA) framework that respects multiple views, stakeholder perspectives, and domain specificity. On the other hand, it helps reduce both the complexity and the time spent in the design activity, while preventing design modeling errors when porting this activity to the MO domain. In conclusion, this work aims to demonstrate that we can improve the design activity of complex systems based on the use of MDE technologies and a domain-specific modeling language with the associated tooling. The major improvement is providing an early validation step via a model-and-simulation approach to consolidate the system design.
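The two-phase operation (local accumulation at each node, then transfer to a fusion server) can be illustrated with a toy sketch; the class and field names below are invented for the example, whereas the actual MO components are generated from the ArchiMO models:

# Phase 1: each hydrophone node accumulates readings locally.
# Phase 2: the node flushes its buffer to a fusion server, which aggregates
# the reports from all nodes.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class HydrophoneNode:
    node_id: str
    buffer: list = field(default_factory=list)

    def observe(self, sound_level_db: float):       # phase 1: accumulate
        self.buffer.append(sound_level_db)

    def flush(self):                                 # phase 2: transfer
        data, self.buffer = self.buffer, []
        return {"node": self.node_id, "samples": data}

class FusionServer:
    def __init__(self):
        self.reports = []

    def ingest(self, report):
        self.reports.append(report)

    def summary(self):
        return {r["node"]: mean(r["samples"]) for r in self.reports if r["samples"]}

node = HydrophoneNode("H1")
for level in (62.0, 64.5, 70.1):
    node.observe(level)
server = FusionServer()
server.ingest(node.flush())
print(server.summary())   # {'H1': 65.53...}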

Keywords: smart sensors, data fusion, distributed fusion architecture, sensor networks, domain specific modeling language, enterprise architecture, underwater moving object, localization, marine observatory, NS-3, IMS

Procedia PDF Downloads 177