Search results for: network diagnostic tool
8175 Tool Development for Assessing Antineoplastic Drugs Surface Contamination in Healthcare Services and Other Workplaces
Authors: Benoit Atge, Alice Dhersin, Oscar Da Silva Cacao, Beatrice Martinez, Dominique Ducint, Catherine Verdun-Esquer, Isabelle Baldi, Mathieu Molimard, Antoine Villa, Mireille Canal-Raffin
Abstract:
Introduction: Healthcare workers' exposure to antineoplastic drugs (AD) is a pressing issue for occupational medicine practitioners. Biological monitoring of occupational exposure (BMOE) is an essential tool for assessing AD contamination of healthcare workers. In addition to BMOE, surface sampling is useful for understanding how workers become contaminated, identifying sources of environmental contamination, verifying the effectiveness of surface decontamination procedures, and monitoring these surfaces over time. The objective of this work was to develop a complete tool comprising a surface sampling kit and a quantitative analytical method for detecting AD traces. The development was guided by three criteria: the kit's capacity to sample in any professional environment (healthcare services, veterinary practices, etc.), the detection of very low AD traces with a validated analytical method, and ease of use of the sampling kit regardless of who performs the sampling. Material and method: The ADs most used in terms of quantity and frequency were identified through an analysis of the literature and of the consumption data of different hospitals, veterinary services, and home care settings. The type of adsorbent device, the surface moistening solution, and the solvent mix for extracting AD from the adsorbent device were tested to maximize yield. AD quantification was achieved by an ultra-high-performance liquid chromatography method coupled with tandem mass spectrometry (UHPLC-MS/MS). Results: Based on their high frequency of use and their good representation of the diverse activities across healthcare, 15 ADs (cyclophosphamide, ifosfamide, doxorubicin, daunorubicin, epirubicin, 5-FU, dacarbazine, etoposide, pemetrexed, vincristine, cytarabine, methotrexate, paclitaxel, gemcitabine, mitomycin C) were selected.
The analytical method was optimized to achieve high sensitivity, with very low limits of quantification (25 to 5000 ng/mL), equivalent to or lower than those previously published (for 13/15 ADs). The sampling kit is easy to use and comes with didactic support (an online video and a written protocol). It proved effective without inter-individual variation (n=5/person; n=5 persons; p=0.85; ANOVA) regardless of who performed the sampling. Conclusion: This validated tool (sampling kit + analytical method) is highly sensitive, easy to use, and very didactic, helping to control the chemical risk posed by ADs. Moreover, BMOE permits targeted prevention. Used routinely, this tool is suitable for any occupational health intervention.
Keywords: surface contamination, sampling kit, analytical method, sensitivity
Procedia PDF Downloads 132
8174 Cardiovascular Modeling Software Tools in Medicine
Authors: J. Fernandez, R. Fernandez de Canete, J. Perea-Paizal, J. C. Ramos-Diaz
Abstract:
The high prevalence of cardiovascular diseases has prompted rising interest in the development of mathematical models to evaluate cardiovascular function under both physiological and pathological conditions. In this paper, a physical model of the cardiovascular system with intrinsic regulation is presented and implemented using the object-oriented Modelica simulation language. For this task, a multi-compartmental system previously validated with physiological data has been built, based on the interconnection of cardiovascular elements such as resistances, capacitances, and pumps, following an electrohydraulic analogy. The results obtained under both physiological and pathological scenarios provide an easily interpreted key for analyzing the hemodynamic behavior of the patient. The described approach represents a valuable tool in the teaching of physiology for graduate medical and nursing students, among others.
Keywords: cardiovascular system, MODELICA simulation software, physical modelling, teaching tool
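The electrohydraulic analogy described above (resistance as peripheral vascular resistance, capacitance as arterial compliance) can be sketched as a minimal two-element Windkessel compartment. This is an illustrative assumption on our part, not the authors' Modelica implementation; parameter values and the explicit-Euler integration are chosen only for demonstration.

```python
def windkessel_pressure(p0, r, c, q_in, dt, steps):
    """Two-element Windkessel: C * dP/dt = Q_in - P/R (electrohydraulic analogy).

    p0: initial arterial pressure, r: peripheral resistance,
    c: arterial compliance, q_in: constant inflow from the heart.
    Returns the pressure trace integrated with explicit Euler steps.
    """
    p = p0
    trace = [p]
    for _ in range(steps):
        dp = (q_in - p / r) / c  # net flow into the compliance
        p += dp * dt
        trace.append(p)
    return trace

# During diastole (q_in = 0) pressure decays exponentially toward zero
# with time constant R*C, mimicking arterial pressure run-off.
decay = windkessel_pressure(p0=80.0, r=1.0, c=1.5, q_in=0.0, dt=0.01, steps=1000)
```

With a constant inflow instead of zero, the same function settles at the steady-state pressure `q_in * r`, which is the hydraulic analogue of Ohm's law in the electrical version of the model.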
Procedia PDF Downloads 300
8173 Exploring De-Fi through 3 Case Studies: Transparency, Social Impact, and Regulation
Authors: Dhaksha Vivekanandan
Abstract:
DeFi is a peer-to-peer financial network that avoids reliance on financial intermediaries. Because DeFi operates outside of government control, it is important to understand its impacts. This study employs a literature review to understand DeFi and its emergence, as well as its implications for transparency, social impact, and regulation. Three case studies are then analysed within the context of these categories. The increased transparency DeFi provides comes with environmental and storage costs and can endanger user privacy. DeFi enables entrepreneurial incentives and offers protection against monetary censorship and capital controls. Despite DeFi's transparency issues and volatility costs, it has huge potential to reduce poverty; however, regulation surrounding DeFi still requires further tightening by governments.
Keywords: DeFi, transparency, regulation, social impact
Procedia PDF Downloads 83
8172 An Unusual Cause of Electrocardiographic Artefact: Patient's Warming Blanket
Authors: Sanjay Dhiraaj, Puneet Goyal, Aditya Kapoor, Gaurav Misra
Abstract:
In electrocardiography, the term artefact denotes a signal that does not originate from the heart. Although technological advancements have produced monitors capable of providing accurate information and reliable heart rate alarms, interference in the displayed electrocardiogram still occurs. This interference can come from the various electrical devices present in the operating room or from electrical signals generated elsewhere in the body. Artefacts may also result from poor electrode contact with the body or from machine malfunction. Recognizing these artefacts is of utmost importance to avoid unnecessary and unwarranted diagnostic and interventional procedures. We report a case of ECG artefacts caused by a patient warming blanket, and its consequences. A 20-year-old male with a preoperative diagnosis of exstrophy-epispadias complex was posted for surgery under epidural and general anaesthesia. Just after endotracheal intubation, we observed nonspecific ECG changes on the monitor. At first glance, the monitor strip revealed broad QRS complexes suggesting a ventricular bigeminal rhythm. Closer analysis revealed these to be artefacts: although the complexes looked broad at first glance, normal sinus complexes were clearly present, each immediately followed by a 'broad complex', i.e., an artefact produced by some device or connection. These broad complexes were labelled artefacts because they originated in the absolute refractory period of the preceding normal sinus beat; it would be physiologically impossible for the myocardium to depolarize rapidly enough to produce a second QRS complex.
A search for the cause of the artefacts was made. After deepening the plane of anaesthesia, ruling out possible electrolyte abnormalities, checking the ECG leads and their connections, changing monitors, checking all other monitoring connections, and checking the grounding of the anaesthesia machine and OT table, we found that after switching off the patient's warming apparatus the rhythm returned to normal sinus and the 'broad complexes', or artefacts, disappeared. Since misdiagnosis of ECG artefacts may subject patients to unnecessary diagnostic and therapeutic interventions, a thorough knowledge of the patient and the monitors allows for quick interpretation and resolution of the problem.
Keywords: ECG artefacts, patient warming blanket, peri-operative arrhythmias, mobile messaging services
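The reasoning used to identify the broad complexes as artefacts — a true QRS cannot fall inside the absolute refractory period of the preceding beat — can be expressed as a simple screening rule. The sketch below is our illustration, not software from the case report, and the 200 ms refractory window is an assumed round figure.

```python
def flag_refractory_artefacts(beat_times_ms, refractory_ms=200):
    """Split detected complexes into accepted beats and likely artefacts.

    A deflection arriving within the absolute refractory period of the
    preceding accepted beat cannot be a genuine depolarization, so it
    is flagged as an artefact.
    """
    accepted, artefacts = [], []
    for t in beat_times_ms:
        if accepted and t - accepted[-1] < refractory_ms:
            artefacts.append(t)  # too soon after the previous beat
        else:
            accepted.append(t)
    return accepted, artefacts

# Sinus beats at 0 and 800 ms, each trailed by a spurious deflection 120 ms later
accepted, artefacts = flag_refractory_artefacts([0, 120, 800, 920])
```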
Procedia PDF Downloads 272
8171 Artificial Intelligence Based Meme Generation Technology for Engaging Audience in Social Media
Authors: Andrew Kurochkin, Kostiantyn Bokhan
Abstract:
In this study, a new meme dataset of ~650K meme instances was created, a meme generation technology based on the state-of-the-art GPT-2 deep learning model was investigated, and a comparative analysis of machine-generated and human-created memes was conducted. We justify the use of Amazon Mechanical Turk workers for approximately estimating user behavior in a social network, more precisely, for measuring engagement. It was shown that generated memes cause the same engagement as human memes that historically produced low engagement in the social network. Thus, generated memes are less engaging than random memes created by humans.
Keywords: content generation, computational social science, memes generation, Reddit, social networks, social media interaction
Procedia PDF Downloads 138
8170 Efficient Video Compression Technique Using Convolutional Neural Networks and Generative Adversarial Network
Authors: P. Karthick, K. Mahesh
Abstract:
Video has become an increasingly significant component of our everyday digital communication. With the growth of higher-resolution content, its sheer volume poses serious obstacles to receiving, distributing, compressing, and displaying high-quality video. In this paper, we propose an end-to-end deep video compression model that jointly optimizes all video compression components. The method involves splitting the video into frames, comparing the images using convolutional neural networks (CNN) to remove duplicates, replacing duplicate images with a single image by recognizing and detecting minute changes using a generative adversarial network (GAN), and recording them with long short-term memory (LSTM). Instead of the complete image, only the small changes generated using the GAN are substituted, which enables frame-level compression. Pixel-wise comparison is performed using K-nearest neighbours (KNN) over the frame, clustered with K-means, and singular value decomposition (SVD) is applied to every frame in the video for all three color channels [Red, Green, Blue] to decrease the dimension of the utility matrix [R, G, B] by extracting its latent factors. Video frames are packed with parameters with the aid of a codec and converted to video format, and the results are compared with the original video. Repeated experiments on several videos of different sizes, durations, frames per second (FPS), and qualities demonstrate a significant resampling rate. On average, the result had approximately a 10% deviation in quality and more than a 50% reduction in size compared with the original video.
Keywords: video compression, K-means clustering, convolutional neural network, generative adversarial network, singular value decomposition, pixel visualization, stochastic gradient descent, frame per second extraction, RGB channel extraction, self-detection and deciding system
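The per-channel SVD step described above can be sketched with NumPy: each colour channel of a frame is approximated by its top-k singular triplets, lowering the dimension of the utility matrix. The frame size, rank, and synthetic data below are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def truncated_svd_channel(channel, k):
    """Rank-k approximation of one colour channel via SVD,
    keeping only the k largest singular values (latent factors)."""
    u, s, vt = np.linalg.svd(channel, full_matrices=False)
    return u[:, :k] @ np.diag(s[:k]) @ vt[:k, :]

rng = np.random.default_rng(0)
# A synthetic low-rank "frame" channel plus a little noise
frame = rng.random((64, 4)) @ rng.random((4, 64)) + 0.01 * rng.random((64, 64))
approx = truncated_svd_channel(frame, k=4)
err = np.linalg.norm(frame - approx) / np.linalg.norm(frame)
```

Because the synthetic channel is essentially rank 4, keeping only four latent factors reproduces it with a very small relative error while storing far fewer numbers than the full 64 x 64 matrix.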
Procedia PDF Downloads 187
8169 High Resolution Image Generation Algorithm for Archaeology Drawings
Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu
Abstract:
To address the low accuracy of current image generation algorithms and their susceptibility to cultural relic deterioration when generating high-resolution archaeology drawings, an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added to the high-resolution image generation network serving as the backbone, which enhances line feature extraction and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, in which the semantic translation branch extracts semantic features from orthophotographs of cultural relics and the gradient screening branch extracts effective gradient features. Finally, a fusion fine-tuning module combines these two types of features to generate high-quality, high-resolution archaeology drawings. Experimental results on a self-constructed archaeology drawings dataset of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and can be used to assist in producing archaeology drawings.
Keywords: archaeology drawings, digital heritage, image generation, deep learning
Procedia PDF Downloads 58
8168 Ectopic Mediastinal Parathyroid Adenoma: A Case Report with Diagnostic and Management Challenges
Authors: Augustina Konadu Larbi-Ampofo, Ekemini Umoinwek
Abstract:
Background: Hypercalcaemia is a common electrolyte imbalance that increases mortality if poorly controlled. Primary hyperparathyroidism, with a prevalence of 0.1-0.3%, often presents in this way. Management of an ectopic parathyroid adenoma in the mediastinum is challenging, especially in a patient with a pacemaker. Case Presentation: A 79-year-old woman with a history of a previous cardiac arrest, permanent pacemaker, ischaemic heart disease, bilateral renal calculi, rectal polyps, liver cirrhosis, and a family history of hyperthyroidism presented to the emergency department with acute back pain. Management and Outcome: The patient was diagnosed with primary hyperparathyroidism on the basis of her elevated corrected calcium and parathyroid hormone levels. Parathyroid investigations consisting of an NM MIBI scan, SPECT-CT, 4D parathyroid scan, and an ultrasound scan of the neck and thorax confirmed an ectopic parathyroid adenoma in the mediastinum at the level of the aortic arch, along with benign thyroid nodules. The location of the adenoma warranted a thoracoscopic surgical approach; however, the presence of her pacemaker and other cardiovascular conditions predisposed her to a potentially poorer post-operative outcome. Discussion: Mediastinal ectopic parathyroid adenomas are rare and difficult to diagnose and treat, often requiring a multimodal imaging approach for accurate localisation. Surgery is the definitive treatment; however, in this patient, long-term medical treatment with cinacalcet was the only suitable next option. The difficulty is that cinacalcet tackles the biochemical markers of the disease rather than the disease itself, leaving open the question of what to do if hypercalcaemia becomes refractory or uncontrolled in this patient with a pacemaker.
Moreover, the coexistence of her multiple conditions raises the suspicion of an underlying multisystemic or multiple endocrine disorder, with multiple endocrine neoplasia coming to mind, necessitating further genetic or autoimmune investigations. Conclusion: Mediastinal ectopic parathyroid adenomas are rare and pose diagnostic and management challenges.
Keywords: mediastinal ectopic parathyroid adenoma, hyperparathyroidism, SPECT/CT, nuclear medicine, multimodal imaging
Procedia PDF Downloads 16
8167 Prediction on Housing Price Based on Deep Learning
Authors: Li Yu, Chenlu Jiao, Hongrun Xin, Yan Wang, Kaiyang Wang
Abstract:
In order to study the impact of various factors on housing prices, we propose building different deep-learning-based prediction models on existing real estate data to more accurately predict the housing price or its future trend. Considering that the factors affecting the housing price vary widely, the proposed prediction models fall into two categories. The first is based on multiple characteristic factors of the real estate. We built a Convolutional Neural Network (CNN) prediction model and a Long Short-Term Memory (LSTM) neural network prediction model based on deep learning, and implemented a logistic regression model as a baseline for comparison among the three. The second category is time series models. Based on deep learning, we proposed an LSTM-1 model built purely on the time series, then implemented and compared the LSTM model and the Auto-Regressive Moving Average (ARMA) model. In this paper, a comprehensive study of second-hand housing prices in Beijing has been conducted from three aspects: data crawling and analysis, housing price prediction, and result comparison. Ultimately, the best-performing model was identified, which is of great significance for the evaluation and prediction of housing prices in the real estate industry.
Keywords: deep learning, convolutional neural network, LSTM, housing prediction
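As a hedged sketch of the time-series side of the comparison, the autoregressive part of an ARMA model can be fitted by ordinary least squares. The synthetic price series, AR order, and helper names below are our illustrative assumptions, not the Beijing data or code used in the study.

```python
import numpy as np

def fit_ar(series, p):
    """Fit an AR(p) model y_t = c + a1*y_{t-1} + ... + ap*y_{t-p}
    by least squares; returns (lag coefficients, intercept)."""
    y = np.asarray(series, dtype=float)
    rows = [y[t - p:t][::-1] for t in range(p, len(y))]
    X = np.column_stack([np.ones(len(rows)), np.array(rows)])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta[1:], beta[0]

def predict_next(series, coefs, intercept):
    """One-step-ahead forecast from the most recent p observations."""
    lags = np.asarray(series[-len(coefs):], dtype=float)[::-1]
    return intercept + float(coefs @ lags)

# Synthetic monthly price index following y_t = 10 + 0.9 * y_{t-1}
prices = [50.0]
for _ in range(59):
    prices.append(10 + 0.9 * prices[-1])
coefs, c = fit_ar(prices, p=1)
next_price = predict_next(prices, coefs, c)
```

On this noiseless series the least-squares fit recovers the generating coefficients almost exactly, which makes it a convenient sanity check before moving to real, noisy price data.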
Procedia PDF Downloads 306
8166 Using Data from Foursquare Web Service to Represent the Commercial Activity of a City
Authors: Taras Agryzkov, Almudena Nolasco-Cirugeda, Jose L. Oliver, Leticia Serrano-Estrada, Leandro Tortosa, Jose F. Vicent
Abstract:
This paper aims to represent the commercial activity of a city using the social network Foursquare as the data source. The city of Murcia is selected as the case study, and the location-based social network Foursquare is the main source of information. After reorganising the user-generated data extracted from Foursquare, it is possible to display graphically on a map the various city spaces and venues, especially those related to commercial, food, and entertainment sector businesses. The resulting visualisation provides information about activity patterns in the city of Murcia according to people's interests and preferences and, moreover, interesting facts about certain characteristics of the town itself.
Keywords: social networks, spatial analysis, data visualization, geocomputation, Foursquare
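A minimal sketch of the reorganisation step follows, assuming each venue record carries a category and a check-in count; the field names and sample venues are hypothetical, not Foursquare's actual API schema. Aggregating check-ins per category yields the activity pattern to colour on the map.

```python
from collections import Counter

# Hypothetical user-generated venue records after extraction from the
# location-based service; fields and values are illustrative only.
venues = [
    {"name": "Cafe Central", "category": "food", "checkins": 120},
    {"name": "Plaza Mall", "category": "commercial", "checkins": 340},
    {"name": "Teatro Romea", "category": "entertainment", "checkins": 95},
    {"name": "Mercado Verde", "category": "food", "checkins": 210},
]

def checkins_by_category(records):
    """Aggregate check-in counts per activity category for map display."""
    totals = Counter()
    for v in records:
        totals[v["category"]] += v["checkins"]
    return totals

totals = checkins_by_category(venues)
```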
Procedia PDF Downloads 426
8165 Power Grid Line Ampacity Forecasting Based on a Long-Short-Term Memory Neural Network
Authors: Xiang-Yao Zheng, Jen-Cheng Wang, Joe-Air Jiang
Abstract:
Improving line ampacity while using existing power grids is an important issue that electricity dispatchers now face. Using the information provided by the dynamic thermal rating (DTR) of transmission lines, an overhead power grid can operate safely. However, dispatchers usually lack real-time DTR information. Thus, this study proposes a long short-term memory (LSTM)-based method, one of the neural network models. The LSTM-based method predicts the DTR of lines using weather data provided by the Central Weather Bureau (CWB) of Taiwan. The possible thermal bottlenecks at different locations along a line and the margin of line ampacity can be determined in real time by the proposed LSTM-based prediction method. A case study targeting the 345 kV power grid of TaiPower in Taiwan is used to examine the performance of the proposed method. The simulation results show that the proposed method can usefully provide information for future smart grid applications.
Keywords: electricity dispatch, line ampacity prediction, dynamic thermal rating, long-short-term memory neural network, smart grid
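The supervised framing behind an LSTM forecaster of this kind — turning a sequence of weather-driven readings into (input window, next value) training pairs — can be sketched as follows. The window length and the temperature series are illustrative assumptions, not the study's actual data preparation.

```python
def make_windows(series, window):
    """Turn a time series into (input window, next value) pairs,
    the supervised form consumed by an LSTM-style forecaster."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

# Hourly ambient temperatures (illustrative) framed with a 3-step window
temps = [25.0, 26.1, 27.3, 28.0, 27.5, 26.8]
pairs = make_windows(temps, window=3)
```

Each pair would feed one training step of the network: the window is the input sequence, and the following value is the regression target (here a weather variable; in the study, the derived DTR).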
Procedia PDF Downloads 282
8164 Evaluation of Hydrocarbons in Tissues of Bivalve Mollusks from the Red Sea Coast
Authors: Asma Ahmed Aljohani, Mohammed Orif
Abstract:
The concentration of polycyclic aromatic hydrocarbons (PAHs) in the clam A. glabrata was examined in samples collected from Alseef Beach, 30 km south of Jeddah city. Gas chromatography-mass spectrometry (GC-MS) was used to analyse 14 PAHs. The total PAH concentration ranged from 11.521 to 40.149 ng/g dw, with a mean of 21.857 ng/g dw, which is lower than values reported in similar studies. The lower-molecular-weight PAHs with three rings comprised 18.14% of the total PAH concentration in the clams, while the higher-molecular-weight PAHs with four, five, and six rings accounted for 81.86%. Diagnostic ratios for PAH source distinction suggested pyrogenic or anthropogenic sources.
Keywords: bivalves, biomonitoring, hydrocarbons, PAHs
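The ring-count breakdown reported above reduces to simple bookkeeping over the measured concentrations. The values below are hypothetical, chosen only to illustrate the low- versus high-molecular-weight split that underlies such diagnostic ratios, not the study's measurements.

```python
def mw_split(pah_conc_by_rings):
    """Percentage of total PAH concentration carried by low-MW (2-3 ring)
    versus high-MW (4+ ring) compounds."""
    total = sum(pah_conc_by_rings.values())
    low = sum(c for rings, c in pah_conc_by_rings.items() if rings <= 3)
    return 100.0 * low / total, 100.0 * (total - low) / total

# Hypothetical concentrations (ng/g dw) grouped by number of aromatic rings
sample = {3: 4.0, 4: 10.0, 5: 5.0, 6: 3.0}
low_pct, high_pct = mw_split(sample)
```

A high-MW-dominated profile like this one is the pattern the abstract associates with pyrogenic (combustion-derived) rather than petrogenic sources.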
Procedia PDF Downloads 98
8163 Application Quality Function Deployment (QFD) Tool in Design of Aero Pumps Based on System Engineering
Authors: Z. Soleymani, M. Amirzadeh
Abstract:
Quality Function Deployment (QFD) was developed in the 1960s in Japan and introduced in 1983 in America and Europe. The paper presents a real application of this technique to the design and production of aero fuel pumps. When designing a product and applying a system engineering process, the first step is identifying customer needs and then translating them into engineering parameters. Since each design change after the production process leads to extra labour costs and increases product quality risk, QFD can benefit sales by meeting customer expectations. Once the needs are well identified, using the QFD tool can improve communication and reduce deviation in the design and production phases, ultimately leading to products with well-defined technical attributes.
Keywords: customer voice, engineering parameters, gear pump, QFD
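The translation of customer needs into engineering parameters is the core of the QFD "house of quality": each parameter is weighted by need importance times relationship strength. The pump requirements and weights below are hypothetical, invented to illustrate the mechanics; only the 9/3/1 strength scale is a QFD convention.

```python
def qfd_priorities(importance, relationships):
    """Score each engineering parameter by summing customer-need
    importance times the need/parameter relationship strength
    (conventional QFD scale: 9 strong, 3 moderate, 1 weak, 0 none)."""
    params = {}
    for need, weight in importance.items():
        for param, strength in relationships.get(need, {}).items():
            params[param] = params.get(param, 0) + weight * strength
    return sorted(params.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical aero fuel pump example, not the paper's actual matrix
importance = {"reliable delivery": 5, "low noise": 2, "long service life": 4}
relationships = {
    "reliable delivery": {"flow rate tolerance": 9, "seal material": 3},
    "low noise": {"gear mesh precision": 9},
    "long service life": {"seal material": 9, "gear mesh precision": 3},
}
ranked = qfd_priorities(importance, relationships)
```

The ranked list tells the design team which engineering parameters deserve the tightest control, which is exactly how the matrix reduces late design changes.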
Procedia PDF Downloads 249
8162 An Automatic Speech Recognition Tool for the Filipino Language Using the HTK System
Authors: John Lorenzo Bautista, Yoon-Joong Kim
Abstract:
This paper presents the development of a Filipino speech recognition tool using the HTK System. The system was trained on a subset of the Filipino Speech Corpus developed by the DSP Laboratory of the University of the Philippines-Diliman. The speech corpus was used both to train and to test the system by estimating the parameters for phonetic HMM-based (Hidden Markov Model) acoustic models. Experiments on different mixture weights were incorporated in the study. The phoneme-level word-based recognition of a 5-state HMM resulted in an average accuracy rate of 80.13% for a single-Gaussian mixture model, 81.13% after implementing a phoneme alignment, and 87.19% for the increased Gaussian-mixture-weight model. The highest accuracy rate of 88.70% was obtained from a 5-state model with 6 Gaussian mixtures.
Keywords: Filipino language, Hidden Markov Model, HTK system, speech recognition
Procedia PDF Downloads 480
8161 A Questionnaire-Based Survey: Therapists Response towards Upper Limb Disorder Learning Tool
Authors: Noor Ayuni Che Zakaria, Takashi Komeda, Cheng Yee Low, Kaoru Inoue, Fazah Akhtar Hanapiah
Abstract:
Previous studies have shown that there are arguments regarding the reliability and validity of the Ashworth and Modified Ashworth Scales for evaluating patients diagnosed with upper limb disorders, as these evaluations depend on the raters' experience. This prompted us to develop an upper limb disorder part-task trainer able to simulate consistent upper limb disorder signs, such as spasticity and rigidity, based on the Modified Ashworth Scale, in order to reduce the variability between raters and within raters themselves. By providing consistent signs, novice therapists would be able to increase training frequency and exposure to various levels of signs. A total of 22 physiotherapists and occupational therapists participated in the study. The majority of the therapists agreed that with current therapy education, they still face problems with inter-rater and intra-rater variability (strongly agree 54%; n = 12/22, agree 27%; n = 6/22) in evaluating patients' conditions. The therapists strongly agreed (72%; n = 16/22) that therapy trainees need to increase their frequency of training, and they therefore believe that our initiative to develop an upper limb disorder training tool will help improve clinical education (strongly agree and agree 63%; n = 14/22).
Keywords: upper limb disorder, clinical education tool, inter/intra-raters variability, spasticity, modified Ashworth scale
Procedia PDF Downloads 310
8160 The Implementation of a Nurse-Driven Palliative Care Trigger Tool
Authors: Sawyer Spurry
Abstract:
Problem: Palliative care providers at an academic medical center in Maryland state that medical intensive care unit (MICU) patients are often referred late in their hospital stay. The MICU has performed well below the hospital quality performance metric that 80% of patients who die with expected outcomes should have received a palliative care consult within 48 hours of admission. Purpose: The purpose of this quality improvement (QI) project is to increase palliative care utilization in the MICU through the implementation of a Nurse-Driven Palliative Trigger Tool to prompt the need for a specialty palliative care consult. Methods: MICU nursing staff and providers received education concerning the implications of underused palliative care services and the literature supporting the use of nurse-driven palliative care tools as a means of increasing utilization of palliative care. MICU-population-specific criteria of palliative triggers (the Palliative Care Trigger Tool) were formulated by the QI implementation team, the palliative care team, and the patient care services department. Nursing staff were asked to assess patients daily for the presence of palliative triggers using the Palliative Care Trigger Tool and to present findings during bedside rounds. MICU providers were asked to consult palliative medicine given the presence of palliative triggers, following interdisciplinary rounds. Rates of palliative consult, given the presence of triggers, were collected via an electronic medical record data pull, de-identified, and recorded in the data collection tool. Preliminary Results: Over 140 MICU registered nurses were educated on the palliative trigger initiative, along with 8 nurse practitioners, 4 intensivists, 2 pulmonary critical care fellows, and 2 palliative medicine physicians. Over 200 patients were admitted to the MICU and screened for palliative triggers during the 15-week implementation period.
Primary outcomes showed an increase in palliative care consult rates for patients presenting with triggers, a decreased mean time from admission to palliative consult, and increased recognition of unmet palliative care needs by MICU nurses and providers. Conclusions: The anticipated findings of this QI project suggest a positive correlation between utilizing palliative care trigger criteria and decreased time to palliative care consult. Effective palliative care directly results in decreased length of stay, healthcare costs, and moral distress, as well as improved symptom management and quality of life (QOL).
Keywords: palliative care, nursing, quality improvement, trigger tool
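The nursing workflow above — screen daily, prompt a consult when any trigger is present — reduces to a simple set-membership rule. The trigger list below is entirely illustrative, not the project's actual MICU criteria.

```python
# Illustrative trigger criteria only; not the project's actual MICU list
PALLIATIVE_TRIGGERS = {
    "metastatic cancer",
    "icu stay > 7 days",
    "mechanical ventilation > 4 days",
    "family requests goals-of-care discussion",
}

def triggers_present(patient_findings):
    """Return the palliative triggers documented for a patient; a
    non-empty result should prompt the provider, after rounds, to
    order a specialty palliative care consult."""
    return sorted(PALLIATIVE_TRIGGERS & set(patient_findings))

hits = triggers_present(["metastatic cancer", "icu stay > 7 days", "anemia"])
```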
Procedia PDF Downloads 194
8159 Development of Deep Neural Network-Based Strain Values Prediction Models for Full-Scale Reinforced Concrete Frames Using Highly Flexible Sensing Sheets
Authors: Hui Zhang, Sherif Beskhyroun
Abstract:
Structural health monitoring (SHM) systems are commonly used to identify and assess structural damage. For damage detection, SHM must periodically collect data from sensors placed in the structure as damage-sensitive features, including abnormal changes in the strain field and abnormal symptoms of the structure, such as damage and deterioration. Currently, deploying sensors at large scale across a building structure is a challenge. In this study, highly stretchable strain sensors are used to collect data sets of strain generated on the surface of full-size reinforced concrete (RC) frames under extreme cyclic load application. The sensing sheet can be switched freely between measuring bending strain and axial strain, giving two different configurations. On this basis, deep neural network prediction models for the frame beams and frame columns are established. The training results show that the method can accurately predict the strain values and has good generalization ability. The two deep neural network prediction models will also be deployed in the SHM system in the future as part of an intelligent strain sensor system.
Keywords: strain sensing sheets, deep neural networks, strain measurement, SHM system, RC frames
Procedia PDF Downloads 99
8158 Circular Tool and Dynamic Approach to Grow the Entrepreneurship of Macroeconomic Metabolism
Authors: Maria Areias, Diogo Simões, Ana Figueiredo, Anishur Rahman, Filipa Figueiredo, João Nunes
Abstract:
It is expected that close to 7 billion people will live in urban areas by 2050. In order to improve the sustainability of territories and their transition towards a circular economy, it is necessary to understand their metabolism and to promote and guide the entrepreneurial response. The study of a macroeconomic metabolism involves quantifying the inputs, outputs, and storage of energy, water, materials, and wastes for an urban region. This quantification and analysis represent an opportunity for the promotion of green entrepreneurship. Several methods exist to assess the environmental impacts of an urban territory, such as human and environmental risk assessment (HERA), life cycle assessment (LCA), ecological footprint assessment (EF), material flow analysis (MFA), physical input-output tables (PIOT), ecological network analysis (ENA), and multicriteria decision analysis (MCDA), among others. However, no consensus exists about which of these assessment methods is best for analyzing the sustainability of such complex systems. Taking into account the weaknesses and needs identified, the CiiM - Circular Innovation Inter-Municipality project aims to define a uniform and globally accepted methodology through the integration of various methodologies and dynamic approaches to increase the efficiency of macroeconomic metabolisms and promote entrepreneurship in a circular economy. The pilot territory considered in the CiiM project has a total area of 969,428 ha, comprising 897,256 inhabitants (about 41% of the population of the Center Region). The main economic activities in the pilot territory, which contribute to a gross domestic product of 14.4 billion euros, are: social support activities for the elderly; construction of buildings; road transport of goods; retailing in supermarkets and hypermarkets; mass production of other garments; inpatient health facilities; and the manufacture of other components and accessories for motor vehicles.
The region's business network consists mostly of micro and small companies (as in the Central Region of Portugal generally), with a total of 53,708 companies identified in the CIM Region of Coimbra (39 large companies), 28,146 in the CIM Viseu Dão Lafões (22 large companies), and 24,953 in the CIM Beiras and Serra da Estrela (13 large companies). The database was constructed from data available from the National Institute of Statistics (INE), the General Directorate of Energy and Geology (DGEG), Eurostat, Pordata, the Strategy and Planning Office (GEP), the Portuguese Environment Agency (APA), the Commission for Coordination and Regional Development (CCDR), and the Inter-municipal Communities (CIM), as well as dedicated databases. In addition to collecting statistical data, it was necessary to identify and characterize the different stakeholder groups in the pilot territory that are relevant to the different metabolism components under analysis. The CiiM project also adds the potential of a Geographic Information System (GIS), making it possible to obtain geospatial results for the territorial metabolisms (rural and urban) of the pilot region. This platform will be a powerful tool for visualizing the flows of products and services that occur within the region and will support the stakeholders, improving their circular performance and identifying new business ideas and symbiotic partnerships.
Keywords: circular economy tools, life cycle assessment, macroeconomic metabolism, multicriteria decision analysis, decision support tools, circular entrepreneurship, industrial and regional symbiosis
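The quantification step of a material flow analysis rests on a mass balance per flow category: inputs equal outputs plus the change in regional stock. A toy sketch follows; the flow categories and figures are invented for illustration, not data from the pilot territory.

```python
def stock_change(inputs, outputs):
    """Material flow analysis balance for one flow category:
    inputs = outputs + change in stock, so the change in the
    territory's stock is total inputs minus total outputs."""
    return sum(inputs) - sum(outputs)

# Invented annual water flows for a territory, in million m^3
water_in = [220.0, 35.0]         # abstraction, imports
water_out = [180.0, 40.0, 15.0]  # consumption, discharge, exports
delta_stock = stock_change(water_in, water_out)
```

A positive balance means the territory is accumulating the resource (e.g. reservoir recharge); a negative one flags net depletion, which is the kind of signal the GIS platform would map per flow category.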
Procedia PDF Downloads 101
8157 Authentic Connection between the Deity and the Individual Human Being Is Vital for Psychological, Biological, and Social Health
Authors: Sukran Karatas
Abstract:
Authentic energy network interrelations between the Creator and the creations, as well as from creations to creations, are the most important points for the worlds of physics and metaphysics to unite and work in harmony, both within and around human beings. Human beings, on the other hand, have the ability to choose their own lifestyle voluntarily. However, this choice includes the automated, involuntary spirit, soul and body working systems together with the voluntary actions, which involve personal, cultural and universal, rational or irrational variable values. Therefore, it is necessary for human beings to know the methods of existing authentic energy network connections to be able to communicate, correlate and accommodate the physical and metaphysical entities as a properly functioning unity; this is essential for complete human psychological, biological and social well-being. Authentic knowledge is necessary for human beings to verify the position of self within self and with others, and to regulate conscious and voluntary actions accordingly in order to prevent oppressions and frictions within self and between self and others. Unfortunately, the absence of genuine individual and universal basic knowledge about how to establish an authentic energy network connection within self, with the deity and with the environment is the most problematic issue even in the twenty-first century. The second most problematic issue is how to maintain freedom, equality and justice among human beings during these strictly interwoven network connections, which naturally involve physical, metaphysical and behavioral actions of the self and the others. The third and probably the most complicated problem is the scientific identification and authentication of the deity. This not only gives the choosers full power and control to set their life orders but also enables them to establish perfect physical and metaphysical links as a fully coordinated, functional energy network.
This thus indicates that choosing an authentic deity is the key point that influences automated, emotional, and behavioral actions altogether, which shapes human perception, personal actions, and life orders. Therefore, we will be considering the existing ‘four types of energy wave end boundary behaviors’, comprising free-end and fixed-end boundary behaviors, as well as boundary behaviors from a denser medium to a less dense medium and from a less dense medium to a denser medium. Consequently, this article aims to demonstrate that the authentication and the choice of deity have an important effect on individual psychological, biological and social health. It is hoped that it will encourage new research in the field of authentic energy network connections, to establish the best position and the most correct interrelation connections with self and others, without violating the authorized orders and the borders of one another, in order to live happier and healthier lives together. In addition, the book ‘Deity and Freedom, Equality, Justice in History, Philosophy, Science’ has more detailed information for those interested in this subject.
Keywords: deity, energy network, power, freedom, equality, justice, happiness, sadness, hope, fear, psychology, biology, sociology
Procedia PDF Downloads 346
8156 To Ensure Maximum Voter Privacy in E-Voting Using Blockchain, Convolutional Neural Network, and Quantum Key Distribution
Authors: Bhaumik Tyagi, Mandeep Kaur, Kanika Singla
Abstract:
The advancement of blockchain has enabled scholars to remodel e-voting systems for future generations. Server-side attacks like SQL injection and DoS attacks are the most common attacks nowadays, where malicious code is injected into the system through user input fields by illicit users, leading to data leakage in the worst scenarios. In addition, quantum attacks can manipulate transactional data. In order to deal with all the above-mentioned attacks, this research integrates blockchain, a convolutional neural network (CNN), and quantum key distribution (QKD). The utilization of blockchain technology in e-voting applications is not a novel concept, but privacy and security issues persist in both public and private blockchains; to address this, a hybrid blockchain is used in this research. The research proposes cryptographic signatures and blockchain algorithms to validate the origin and integrity of the votes. The CNN, a regularized version of the multilayer perceptron, is also applied in the system to analyze visual descriptions upon registration, in order to enhance the privacy of voters and of the e-voting system. Quantum key distribution is implemented in order to secure the blockchain-based e-voting system from quantum attacks using quantum algorithms. An e-voting blockchain DApp is implemented, providing a proposed solution for voter privacy in e-voting using blockchain, CNN, and QKD.
Keywords: hybrid blockchain, secure e-voting system, convolutional neural networks, quantum key distribution, one-time pad
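The one-time pad listed in the keywords is the simplest information-theoretically secure cipher: XOR the message with a truly random key of the same length, used exactly once. A minimal sketch (the ballot payload and key handling are hypothetical, not the authors' implementation):

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte; because XOR is its
    # own inverse, the same function both encrypts and decrypts.
    assert len(key) >= len(data), "the pad must be at least message-length"
    return bytes(d ^ k for d, k in zip(data, key))

ballot = b"vote:candidate-7"             # hypothetical ballot payload
pad = secrets.token_bytes(len(ballot))   # one-time key, never reused

ciphertext = otp_xor(ballot, pad)
recovered = otp_xor(ciphertext, pad)
assert recovered == ballot
```

The scheme is only secure if the pad is truly random, as long as the message, and never reused; key distribution is exactly the problem QKD addresses.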
Procedia PDF Downloads 94
8155 Financial Intermediation: A Transaction Two-Sided Market Model Approach
Authors: Carlo Gozzelino
Abstract:
Since the early 2000s, the phenomenon of two-sided markets has been of growing interest in the academic literature, as such markets differ in having cross-side network effects and same-side network effects characterizing the transactions, which makes the analysis different from the traditional seller-buyer concept. Due to such externalities, pricing strategies can be based on subsidizing the participation of one side (considered key for the platform to attract the other side) while recovering the loss on the other side. In recent years, several players in the Italian financial intermediation industry moved from an integrated landscape (i.e. selling their own products) to an open one (i.e. intermediating third-party products). According to the academic literature, such behavior can be interpreted as a merchant's move towards a platform, operating in a two-sided market environment. While several applications of the two-sided market framework are available in the academic literature, the purpose of this paper is to use the two-sided market concept to suggest a new framework applied to financial intermediation. To this end, a model is developed to show how competitors behave when vertically integrated and how the peculiarities of a two-sided market act as an incentive to disintegrate. Additionally, we show that when all players act as platforms, the dynamics of a two-sided market allow at least one Nash equilibrium to exist, in which platforms of different sizes enjoy positive profits. Finally, empirical evidence from the Italian market is given to sustain – and to challenge – this interpretation.
Keywords: financial intermediation, network externalities, two-sided markets, vertical differentiation
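The subsidy logic described above can be sketched numerically: participation on each side depends on its own price and on participation on the other side. The linear demand form and all parameter values below are illustrative assumptions, not the paper's model:

```python
def participation(p_a, p_b, alpha=0.4, beta=0.3, iters=200):
    """Fixed-point iteration of cross-side demand.

    n_a, n_b are participation shares in [0, 1]; alpha and beta are the
    assumed cross-side network-effect strengths.
    """
    n_a, n_b = 0.5, 0.5
    for _ in range(iters):
        n_a = min(1.0, max(0.0, 1.0 - p_a + alpha * n_b))
        n_b = min(1.0, max(0.0, 1.0 - p_b + beta * n_a))
    return n_a, n_b

# Subsidising side B (a negative price) raises participation on side A
# through the cross-side externality, even though side A's price is unchanged.
n_a_subsidy, _ = participation(p_a=0.6, p_b=-0.2)
n_a_neutral, _ = participation(p_a=0.6, p_b=0.4)
assert n_a_subsidy > n_a_neutral
```

The loss taken on the subsidised side can then be recovered through the larger, better-monetised side, which is the pricing pattern the abstract describes.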
Procedia PDF Downloads 160
8154 Code-Switching in a Flipped Classroom for Foreign Students
Authors: E. Tutova, Y. Ebzeeva, L. Gishkaeva, Y. Smirnova, N. Dubinina
Abstract:
We have been working with students from different countries and have found it crucial to switch between languages when explaining something. Whether it is Russian or Chinese, explaining in a different language plays an important role in students' cognitive processing. In this work we are going to explore how code-switching may impact the student's perception of information. Code-switching is a tool defined by linguists as a switch from one language to another for convenience, for the explanation of terms unavailable in the initial language, or sometimes for prestige. In our case, we are going to consider code-switching in its convenience function. As a rule, students who come to study Russian in a language environment lack many skills in speaking the language. This makes it harder to explain to them, in English, the rules of another language. That is why switching between English, Russian and Mandarin is crucial for their better understanding. In this work we are going to explore code-switching as a tool which can help a teacher in a flipped classroom.
Keywords: bilingualism, psychological linguistics, code-switching, social linguistics
Procedia PDF Downloads 81
8153 Consumption and Diffusion Based Model of Tissue Organoid Development
Authors: Elena Petersen, Inna Kornienko, Svetlana Guryeva, Sergey Simakov
Abstract:
In vitro organoid cultivation requires the simultaneous provision of the necessary vascularization and nutrient perfusion of cells during organoid development. However, many aspects of this problem are still unsolved. The functionality of vascular network ingrowth is limited during the early stages of organoid development, since the vascular network only becomes functional at the final stages of in vitro cultivation. Therefore, a microchannel network should be created at the early stages of organoid cultivation in the hydrogel matrix, aimed at conducting and maintaining the minimally required level of nutrient perfusion for all cells in the expanding organoid. The network configuration should be designed properly in order to exclude hypoxic and necrotic zones in the expanding organoid at all stages of its cultivation. In vitro vascularization is currently the main issue within the field of tissue engineering. As perfusion and oxygen transport have direct effects on cell viability and differentiation, researchers are currently limited to tissues of a few millimeters in thickness. These limitations are imposed by mass transfer and are defined by the balance between the metabolic demand of the cellular components in the system and the size of the scaffold. Current approaches include growth factor delivery, channeled scaffolds, perfusion bioreactors, microfluidics, cell co-cultures, cell functionalization, modular assembly, and in vivo systems. These approaches may improve cell viability or generate capillary-like structures within a tissue construct. Thus, there is a fundamental disconnect between defining the metabolic needs of tissue through quantitative measurements of oxygen and nutrient diffusion and the potential ease of integration into host vasculature for future in vivo implantation.
A model is proposed for prognosis of organoid growth and perfusion, based on joint simulation of general nutrient diffusion, nutrient diffusion into the hydrogel matrix through the contact surfaces and microchannel walls, and nutrient consumption by the cells of the expanding organoid, including biomatrix contraction during tissue development, which is associated with the changing consumption rate of the growing organoid's cells. The model allows computing an effective microchannel network design that provides the minimally required level of nutrient concentration in all parts of the growing organoid. It can be used for preliminary planning of the microchannel network design and for simulations of the nutrient supply rate depending on the stage of organoid development.
Keywords: 3D model, consumption model, diffusion, spheroid, tissue organoid
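The coupled diffusion-consumption balance the model describes can be illustrated on a one-dimensional tissue slice next to a microchannel wall, using an explicit finite-difference scheme. The geometry, coefficients, and boundary conditions below are arbitrary illustrative choices, not the authors' parameterisation:

```python
# Nutrient concentration c(x) in a tissue slice: the microchannel wall at
# x = 0 is held at concentration 1.0, the far edge is zero-flux, and cells
# consume nutrient at a rate proportional to the local concentration.
D, k = 1e-3, 5e-3          # diffusion coefficient and consumption rate (a.u.)
dx, dt, n = 0.1, 1.0, 20   # grid spacing, time step, number of nodes
# The explicit scheme is stable here since D*dt/dx**2 = 0.1 <= 0.5.

c = [0.0] * n
c[0] = 1.0
for _ in range(5000):
    new = c[:]
    for i in range(1, n - 1):
        laplacian = (c[i - 1] - 2 * c[i] + c[i + 1]) / dx ** 2
        new[i] = max(0.0, c[i] + dt * (D * laplacian - k * c[i]))
    new[-1] = new[-2]      # zero-flux far boundary
    new[0] = 1.0           # channel wall keeps supplying nutrient
    c = new

# Concentration decays with distance from the channel: nodes far from the
# wall are candidates for the hypoxic zones the network design must avoid.
assert c[1] > c[5] > c[15] >= 0.0
```

Sweeping the channel spacing in such a sketch shows how close the microchannels must run so that no node falls below a viability threshold.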
Procedia PDF Downloads 308
8152 Deep Learning Based Text to Image Synthesis for Accurate Facial Composites in Criminal Investigations
Authors: Zhao Gao, Eran Edirisinghe
Abstract:
The production of an accurate sketch of a suspect based on a verbal description obtained from a witness is an essential task for most criminal investigations. The criminal investigation system employs specifically trained professional artists to manually draw a facial image of the suspect according to the descriptions of an eyewitness for subsequent identification. With the advancement of deep learning, recurrent neural networks (RNN) have shown great promise in natural language processing (NLP) tasks. Additionally, generative adversarial networks (GAN) have proven to be very effective in image generation. In this study, a trained GAN, conditioned on textual features such as keywords automatically encoded from a verbal description of a human face using an RNN, is used to generate photo-realistic facial images for criminal investigations. The intention of the proposed system is to map text generated from verbal descriptions onto corresponding facial features. With this, it becomes possible to generate many reasonably accurate alternatives from which the witness can hopefully identify a suspect. This reduces subjectivity in decision making both by the eyewitness and the artist, while giving the witness an opportunity to evaluate and reconsider decisions. Furthermore, the proposed approach benefits law enforcement agencies by reducing the time taken to physically draw each potential sketch, thus improving response times and mitigating potentially malicious human intervention. With the publicly available 'CelebFaces Attributes Dataset' (CelebA), supplemented with verbal descriptions as training data, the proposed architecture is able to effectively produce facial structures from given text. Word embeddings are learnt by applying the RNN architecture in order to perform semantic parsing, the output of which is fed into the GAN for synthesizing photo-realistic images.
Rather than the grid search method, a metaheuristic search based on genetic algorithms is applied to evolve the network, with the intent of achieving optimal hyperparameters in a fraction of the time of a typical brute-force approach. Beyond the ‘CelebA’ training database, further novel test cases are supplied to the network for evaluation. Witness reports detailing criminals, from Interpol or other law enforcement agencies, are sampled on the network. Using the descriptions provided, samples are generated and compared with the ground-truth images of a criminal in order to calculate the similarities. Two metrics are used for performance evaluation: the Structural Similarity Index (SSIM) and the Peak Signal-to-Noise Ratio (PSNR). High scores on these metrics demonstrate the accuracy of the approach, in the hope of proving that it can be an effective tool for law enforcement agencies. The proposed approach to criminal facial image generation has the potential to increase the ratio of criminal cases that can ultimately be resolved using eyewitness information gathering.
Keywords: RNN, GAN, NLP, facial composition, criminal investigation
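Of the two evaluation metrics, PSNR is the simpler to state: a log-scaled ratio of the maximum pixel value to the mean squared error between the generated image and the ground truth. A minimal sketch on flat pixel lists (toy values, not CelebA data):

```python
import math

def psnr(reference, generated, max_val=255.0):
    # Mean squared error between the two images, pixel by pixel.
    mse = sum((r - g) ** 2 for r, g in zip(reference, generated)) / len(reference)
    if mse == 0:
        return float("inf")   # identical images: infinite PSNR
    return 10.0 * math.log10(max_val ** 2 / mse)

ground_truth = [100, 120, 130, 140]
near_match = [101, 119, 131, 139]   # off by one grey level per pixel
poor_match = [10, 200, 30, 250]

# A faithful composite scores far higher than a poor one.
assert psnr(ground_truth, near_match) > psnr(ground_truth, poor_match)
```

SSIM, by contrast, compares local luminance, contrast, and structure, so the two metrics capture complementary notions of similarity.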
Procedia PDF Downloads 160
8151 Design of a Tool for Generating Test Cases from BPMN
Authors: Prat Yotyawilai, Taratip Suwannasart
Abstract:
Business Process Model and Notation (BPMN) is increasingly important for modeling business processes and creating functional models; it is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent. Although most research uses the UML model in software testing, few studies use the BPMN model in creating test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph. Both the component details and the flow graph are used in generating test cases.
Keywords: software testing, test case, BPMN, flow graph
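The test-generation step can be sketched as path enumeration over the flow graph extracted from the model: every start-to-end path through the graph is a candidate test case. The order-handling process below is a hypothetical example, not from the paper:

```python
# Flow graph of a toy BPMN-like process: keys are nodes (activities and
# gateway outcomes), values are outgoing sequence flows.
flow_graph = {
    "start": ["check_order"],
    "check_order": ["approve", "reject"],  # exclusive gateway: two branches
    "approve": ["ship"],
    "ship": ["end"],
    "reject": ["end"],
    "end": [],
}

def enumerate_paths(graph, node="start", prefix=()):
    # Depth-first enumeration; each complete path is one test case.
    path = prefix + (node,)
    if not graph[node]:
        return [path]
    cases = []
    for successor in graph[node]:
        cases.extend(enumerate_paths(graph, successor, path))
    return cases

test_cases = enumerate_paths(flow_graph)
assert len(test_cases) == 2  # one test case per gateway branch
```

Real BPMN adds parallel and inclusive gateways and loops, which require bounded unrolling rather than plain path enumeration.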
Procedia PDF Downloads 555
8150 Use of Telehealth for Facilitating the Diagnostic Assessment of Autism Spectrum Disorder: A Scoping Review
Authors: Manahil Alfuraydan, Jodie Croxall, Lisa Hurt, Mike Kerr, Sinead Brophy
Abstract:
Autism Spectrum Disorder (ASD) is a developmental condition characterised by impairment in terms of social communication, social interaction, and a repetitive or restricted pattern of interest, behaviour, and activity. There is a significant delay between seeking help and a confirmed diagnosis of ASD. This may result in a delay in receiving early intervention services, which are critical for positive outcomes. The long wait times also cause stress for the individuals and their families. Telehealth potentially offers a way of improving the diagnostic pathway for ASD. This review of the literature aims to examine which telehealth approaches have been used in the diagnosis and assessment of autism in children and adults, whether they are feasible and acceptable, and how they compare with face-to-face diagnosis and assessment methods. A comprehensive search was conducted, combining the terms autism and telehealth from 2000 to 2018, of the following databases: MEDLINE, CINAHL Plus with Full Text, Business Source Complete, Web of Science, Scopus, PsycINFO, and trial and systematic review databases including the Cochrane Library, Health Technology Assessment, the Database of Abstracts of Reviews of Effects, and the NHS Economic Evaluation Database. A total of 10 studies were identified for inclusion in the review. This review of the literature found there to be two methods of using telehealth: (a) video conferencing to enable teams in different areas to consult with the families and to assess the child/adult in real time, and (b) video upload to a web portal that enables the clinical assessment of behaviours in the family home. The findings were positive: there was high agreement in terms of the diagnosis between remote and face-to-face methods, with high levels of satisfaction among the families and clinicians.
This field is in its very early stages, and so only studies with small sample sizes were identified, but the findings suggest that telehealth methods, used in conjunction with existing methods, have the potential to improve the assessment and diagnosis of autism, especially for those with clear autism traits and for adults with autism. Larger randomised controlled trials of this technology are warranted.
Keywords: assessment, autism spectrum disorder, diagnosis, telehealth
Procedia PDF Downloads 128
8149 The Effects of Mobile Communication on the Nigerian Populace
Authors: Chapman Eze Nnadozie
Abstract:
Communication, the activity of conveying information, remains a vital resource for the growth and development of any given society. Mobile communication, popularly known as the global system for mobile communication (GSM), is a globally accepted standard for digital cellular communication. GSM, a wireless technology, remains the fastest-growing communication means worldwide. Indeed, mobile phones have become a critical business tool and part of everyday life in both developed and developing countries. This study examines the effects of mobile communication on the Nigerian populace. The methodology used in this study is the survey research method, with questionnaires as the main data collection tool. The questionnaires were administered to a total of seventy respondents in five cities across the country, namely: Aba, Enugu, Bauchi, Makurdi, and Lagos. The results reveal that, though there are some quality-of-service issues, mobile communication has very significant positive effects on the economic and social development of the Nigerian populace.
Keywords: effect, mobile communication, populace, GSM, wireless technology, mobile phone
Procedia PDF Downloads 271
8148 Algorithm and Software Based on Multilayer Perceptron Neural Networks for Estimating Channel Use in the Spectral Decision Stage in Cognitive Radio Networks
Authors: Danilo López, Johana Hernández, Edwin Rivas
Abstract:
The use of the multilayer perceptron neural network (MLPNN) technique is presented to estimate the future state of use of a licensed channel by primary users (PUs); this is useful at the spectral decision stage in cognitive radio networks (CRN) to determine approximately at which future time instants secondary users (SUs) may opportunistically use the spectral bandwidth to send data through the primary wireless network. To validate the results, sequences of channel occupancy data were generated by simulation. The results show that the prediction percentage is greater than 60% in some of the tests carried out.
Keywords: cognitive radio, neural network, prediction, primary user
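The windowed prediction setup can be illustrated with a drastically simplified stand-in for the MLPNN: a single perceptron trained on sliding windows of a simulated occupancy sequence. The busy-busy-idle pattern and all training details below are invented for illustration:

```python
import random

# Simulated licensed-channel occupancy (1 = primary user busy, 0 = idle),
# repeating a busy-busy-idle pattern.
history = [1, 1, 0] * 40

# Sliding windows: the last three observed slots predict the next slot.
samples = [(history[i:i + 3], history[i + 3]) for i in range(len(history) - 3)]

random.seed(0)
weights = [random.uniform(-0.1, 0.1) for _ in range(3)]
bias = 0.0

def predict(window):
    return 1 if sum(w * x for w, x in zip(weights, window)) + bias > 0 else 0

for _ in range(50):                       # perceptron learning rule
    for x, y in samples:
        error = y - predict(x)
        weights = [w + 0.1 * error * xi for w, xi in zip(weights, x)]
        bias += 0.1 * error

assert predict([1, 1, 0]) == 1  # PU about to return: SU must stay off
assert predict([0, 1, 1]) == 0  # next slot idle: an SU transmission opportunity
```

A real MLPNN adds hidden layers and nonlinear activations, which let it learn occupancy patterns that are not linearly separable in the window features.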
Procedia PDF Downloads 371
8147 Game of Funds: Efficiency and Policy Implications of the United Kingdom Research Excellence Framework
Authors: Boon Lee
Abstract:
Research publication is an essential output of universities because it not only promotes university recognition but also attracts government funding. The history of university research culture has been one of ‘publish or perish’, and universities have consistently encouraged their academics and researchers to produce research articles in reputable journals in order to maintain a level of competitiveness. In turn, United Kingdom (UK) government funding is determined by the number and quality of research publications. This paper aims to investigate whether more government funding leads to more quality papers. To that end, the paper employs a network DEA model to evaluate UK higher education performance over a period of time. Sources of efficiency are also determined via second-stage regression analysis.
Keywords: efficiency, higher education, network data envelopment analysis, universities
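In the degenerate case of a single input (funding) and a single output (publications), the CCR efficiency score that DEA computes reduces to each university's output/input ratio normalised by the best observed ratio. The figures below are invented, and the paper's network DEA model involves multiple inputs, outputs, and stages; this shows only the one-input, one-output special case:

```python
# (funding, publications) per university -- purely illustrative numbers.
universities = {
    "U1": (10.0, 40.0),
    "U2": (8.0, 40.0),
    "U3": (12.0, 30.0),
}

ratios = {name: out / inp for name, (inp, out) in universities.items()}
best = max(ratios.values())

# Efficiency 1.0 marks the frontier; the rest are measured against it.
efficiency = {name: r / best for name, r in ratios.items()}

assert efficiency["U2"] == 1.0          # best output per unit of funding
assert efficiency["U3"] < efficiency["U1"] < 1.0
```

With multiple inputs and outputs, the weights can no longer be fixed in advance, which is where DEA's per-unit linear programmes come in.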
Procedia PDF Downloads 114
8146 Green Ports: Innovation Adopters or Innovation Developers
Authors: Marco Ferretti, Marcello Risitano, Maria Cristina Pietronudo, Lina Ozturk
Abstract:
A green port is the result of a sustainable long-term strategy adopted by an entire port infrastructure, and therefore by the set of actors involved in port activities. The strategy aims at the development of a sustainable port infrastructure focused on reducing negative environmental impacts without jeopardising economic growth. Green technologies represent the core tools for implementing sustainable solutions; however, they are not a magic bullet. Ports have always been integrated into the local territory, affecting the environment in which they operate; therefore, the sustainable strategy should fit the entire local system. Adopting a sustainable strategy thus means knowing how to involve and engage a wide stakeholder network (industry, production, markets, citizens, and public authorities). The existing research on the topic has not integrated this perspective well with that of sustainability. Research on green ports has mixed sustainability aspects with maritime industry aspects, neglecting the dynamics that lead to the development of the green port phenomenon. We propose an analysis of green ports adopting the lens of ecosystem studies in the field of management. The ecosystem approach provides a way to model the relations that enable green solutions and green practices in a port ecosystem. However, due to the local dimension of a port and the port trend towards innovation, i.e., sustainable innovation, we draw on a specific ecosystem concept, that of local innovation systems. More precisely, we explore whether a green port is a local innovation system engaged in developing sustainable innovation with a large impact on the territory, or merely an innovation adopter. To address this issue, we adopt a comparative case study, selecting two innovative ports in Europe: Rotterdam and Genoa. The case study is a research method focused on understanding the dynamics of a specific situation and can be used to provide a description of real circumstances.
Preliminary results show two different approaches to supporting sustainable innovation: one represented by Rotterdam, a pioneer in competitiveness and sustainability, and a second represented by Genoa, an example of a technology adopter. The paper intends to provide a better understanding of how sustainable innovations are developed and of the manner in which a network of port and local stakeholders supports this process. Furthermore, it proposes a taxonomy of green ports as developers and adopters of sustainable innovation, also suggesting best practices for modelling relationships that enable the port ecosystem to apply a sustainable strategy.
Keywords: green port, innovation, sustainability, local innovation systems
Procedia PDF Downloads 120