Search results for: physical layer technology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15194

224 Assessing Sustainability of Bike Sharing Projects Using Envision™ Rating System

Authors: Tamar Trop

Abstract:

Bike sharing systems can be important elements of smart cities as they have the potential for impact on multiple levels. These systems can add a significant alternative to other modes of mass transit in cities that are continuously looking for measures to become more livable and maintain their attractiveness for citizens, businesses and tourism. Bike-sharing began in Europe in 1965, and a viable format emerged in the mid-2000s thanks to the introduction of information technology. The rate of growth in bike-sharing schemes and fleets has been very rapid since 2008 and has probably outstripped growth in every other form of urban transport. Today, public bike-sharing systems are available in over 700 cities across five continents, operating more than 800,000 bicycles at approximately 40,000 docking stations. Since modern bike sharing systems have become prevalent only in the last decade, the existing literature analyzing these systems and their sustainability is relatively new. The purpose of the presented study is to assess the sustainability of these newly emerging transportation systems, using the Envision™ rating system as a methodological framework and the Israeli 'Tel-O-Fun' bike sharing project as a case study. The assessment was conducted by project team members. Envision™ is a new guidance and rating system used to assess and improve the sustainability of all types and sizes of infrastructure projects. This tool provides a holistic framework for evaluating and rating the community, environmental, and economic benefits of infrastructure projects over the course of their life cycle. The evaluation method comprises 60 sustainability criteria divided into five categories: quality of life, leadership, resource allocation, natural world, and climate and risk. The 'Tel-O-Fun' project was launched in Tel Aviv-Yafo in 2011 and today provides about 1,800 bikes for rent at 180 rental stations across the city. The system is operated through a computer terminal located at each docking station. The highest-rated sustainable features of the project include: (a) improving quality of life by offering a low-cost and efficient form of public transit, improving community mobility and access, enabling flexible travel within a multimodal transportation system, saving commuters time and money, enhancing public health, and reducing air and noise pollution; (b) improving resource allocation by offering inexpensive and flexible last-mile connectivity, reducing space, materials and energy consumption, reducing wear and tear on public roads, and maximizing the utility of existing infrastructure; and (c) reducing greenhouse gas emissions from transportation. Overall, the 'Tel-O-Fun' project scored highly as an environmentally sustainable and socially equitable infrastructure. The use of this practical framework for evaluation also yielded various insights into the shortcomings of the system and the characteristics of good solutions. This can contribute to the improvement of the project and may assist planners and operators of bike sharing systems in developing a sustainable, efficient and reliable transportation infrastructure within smart cities.

Keywords: bike sharing, Envision™, sustainability rating system, sustainable infrastructure

Procedia PDF Downloads 340
223 Expanding Entrepreneurial Capabilities through Business Incubators: A Case Study of Idea Hub Nigeria

Authors: Kenechukwu Ikebuaku

Abstract:

Entrepreneurship has long been offered as the panacea for poor economic growth and high rates of unemployment. Business incubation is considered an effective means of enhancing entrepreneurial activities while engendering socio-economic development. The Information Technology Developers Entrepreneurship Accelerator (iDEA) is a software business incubation programme established by the Nigerian government as a means of boosting digital entrepreneurship activities and reducing unemployment in the country. This study assessed the contribution of iDEA Nigeria’s entrepreneurship programmes towards enhancing the capabilities of its tenants. Using the capability approach and the sustainable livelihoods approach, the study analysed the contribution of iDEA programmes towards the expansion of participants’ entrepreneurial capabilities. Apart from identifying a set of entrepreneurial capabilities from both the literature and empirical analysis, the study went further to ascertain how iDEA incubation has helped to enhance those capabilities for its tenants. It also examined digital entrepreneurship as a valued functioning and as an intermediate functioning leading to other valued functionings. Furthermore, the study examined gender as a conversion factor in digital entrepreneurship. Both qualitative and quantitative research methods were used, and key variables were measured. While the entire population was used to collect data for the quantitative research, purposive sampling was used to select respondents for semi-structured interviews in the qualitative research. However, only 40 beneficiaries agreed to take part in the survey, while 10 respondents were interviewed for the study. Responses collected from the administered questionnaires were subjected to statistical analysis using SPSS. The study developed indexes to measure the respondents' perception of how iDEA programmes have enhanced their entrepreneurial capabilities. The computed Capabilities Enhancement Perception Index (CEPI) indicated that the respondents believed that iDEA programmes enhanced their entrepreneurial capabilities. While access to power supply and reliable internet had the highest positive deviations around the mean, negotiation skills and access to customers/clients had the highest negative deviations. These findings were well supported by the qualitative analysis, in which the participants unequivocally narrated how the resources provided by iDEA aided them in their entrepreneurial endeavours. It was also found that iDEA programmes have a significant effect on the tenants’ access to networking opportunities, both with other emerging entrepreneurs and with established entrepreneurs. In assessing gender as a conversion factor, it was discovered that there was very low female participation within the digital entrepreneurship ecosystem. The root cause of this gender disparity was found in unquestioned cultural beliefs and social norms which relegate women to a subservient position and household duties. The findings also showed that many of the entrepreneurs could be considered opportunity-based entrepreneurs rather than necessity entrepreneurs, and that digital entrepreneurship is a valued functioning for iDEA tenants. With regard to challenges facing digital entrepreneurship in Nigeria, infrastructural and institutional inadequacies, lack of funding opportunities, and unfavourable government policies were considered inimical to entrepreneurial capabilities in the country.
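For illustration only, the following is a minimal sketch of how a perception index of this kind and the item deviations around its mean could be computed; the capability items, the 5-point scale, and the aggregation used here are assumptions for illustration and are not the instrument actually used in the study.

import numpy as np

# Hypothetical 5-point Likert responses (rows: respondents, columns: capability items).
# Item names and values are illustrative only, not the study's actual data.
items = ["power supply", "internet access", "negotiation skills", "access to clients"]
responses = np.array([
    [5, 5, 2, 3],
    [4, 5, 3, 2],
    [5, 4, 2, 2],
    [4, 4, 3, 3],
])

item_means = responses.mean(axis=0)   # mean perception score per capability item
cepi = item_means.mean()              # overall Capabilities Enhancement Perception Index
deviations = item_means - cepi        # positive = enhanced above average, negative = below

for name, dev in zip(items, deviations):
    print(f"{name}: deviation from CEPI = {dev:+.2f}")
print(f"CEPI = {cepi:.2f}")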

Keywords: entrepreneurial capabilities, unemployment, business incubators, development

Procedia PDF Downloads 236
222 Fort Conger: A Virtual Museum and Virtual Interactive World for Exploring Science in the 19th Century

Authors: Richard Levy, Peter Dawson

Abstract:

Ft. Conger, located in the Canadian Arctic, was one of the most remote 19th-century scientific stations. Established in 1881 on Ellesmere Island, a wood-framed structure provided a permanent base from which to conduct scientific research. Under the charge of Lt. Greely, the expedition based at Ft. Conger was one of 14 conducted during the First International Polar Year (FIPY). Our research project “From Science to Survival: Using Virtual Exhibits to Communicate the Significance of Polar Heritage Sites in the Canadian Arctic” focused on the creation of a virtual museum website dedicated to one of the most important polar heritage sites in the Canadian Arctic. This website was developed under a grant from the Virtual Museum of Canada and enables visitors to explore the fort’s site from 1875 to the present, http://fortconger.org. Heritage sites are often viewed as static places. A goal of this project was to present the change that occurred over time as each new group of explorers adapted the site to their needs. The site was first visited by the British explorer George Nares in 1875-76. Only later did the United States government select this site for the Lady Franklin Bay Expedition (1881-84), with research to be conducted under the FIPY (1882-83). Still later, Robert Peary and Matthew Henson attempted to reach the North Pole from Ft. Conger in 1899, 1905 and 1908. A central focus of this research is the virtual reconstruction of Ft. Conger. In the summer of 2010, a Zoller+Fröhlich Imager 5006i and a Minolta Vivid 910 laser scanner were used to scan the terrain and artifacts. Once the scanning was completed, the point clouds were registered and edited to form the basis of a virtual reconstruction. A goal of this project has been to allow visitors to step back in time and explore the interiors of these buildings with all of their artifacts. Links to text, historic documents, animations, panorama images, computer games and virtual labs provide explanations of how science was conducted during the 19th century. A major feature of this virtual world is the timeline. Visitors to the website can begin to explore the site when George Nares, in his ship the HMS Discovery, appeared in the harbor in 1875. With the arrival of Lt. Greely’s expedition in 1881, we can track the progress made in establishing a scientific outpost. Still later, in 1901, with Peary’s presence, the site is transformed again, with the huts having been built from materials salvaged from Greely’s main building. Still later, in 2010, we can visit the site in its present state of deterioration and learn about the laser scanning technology which was used to document the site. The Science and Survival at Fort Conger project represents one of the first attempts to use virtual worlds to communicate the historical and scientific significance of polar heritage sites where opportunities for first-hand visitor experiences are not possible because of remote location.

Keywords: 3D imaging, multimedia, virtual reality, arctic

Procedia PDF Downloads 420
221 Interferon-Induced Transmembrane Protein-3 rs12252-CC Associated with the Progress of Hepatocellular Carcinoma by Up-Regulating the Expression of Interferon-Induced Transmembrane Protein 3

Authors: Yuli Hou, Jianping Sun, Mengdan Gao, Hui Liu, Ling Qin, Ang Li, Dongfu Li, Yonghong Zhang, Yan Zhao

Abstract:

Background and Aims: Interferon-induced transmembrane protein 3 (IFITM3) is a member of the ISG (interferon-stimulated gene) family. IFITM3 has been recognized as a key signalling molecule regulating cell growth in some tumors. However, the function of the IFITM3 rs12252-CC genotype in hepatocellular carcinoma (HCC) remains unknown to the authors' best knowledge. A cohort study was employed to clarify the relationship between the IFITM3 rs12252-CC genotype and HCC progression, and cellular experiments were used to investigate the correlation between the function of IFITM3 and the progression of HCC. Methods: 336 candidates were enrolled in the study, including 156 with HBV-related HCC and 180 with chronic hepatitis B infection or liver cirrhosis. Polymerase chain reaction (PCR) was employed to determine the gene polymorphism of IFITM3. The functions of IFITM3 were examined in PLC/PRF/5 cells with different treatments: LV-IFITM3, transfected with lentivirus to knock down the expression of IFITM3, and LV-NC, transfected with empty lentivirus as the negative control. IFITM3 expression, proliferation and migration were detected by quantitative reverse transcription polymerase chain reaction (qRT-PCR), QuantiGene Plex 2.0 assay, western blotting, immunohistochemistry, Cell Counting Kit (CCK)-8 and wound healing, respectively. Six samples of PLC/PRF/5 (three infected with empty lentivirus as the control group; three infected with LV-IFITM3 vector lentivirus as the experimental group) were sequenced at BGI (Beijing Genomics Institute, Shenzhen, China) using RNA-seq technology to identify IFITM3-related signaling pathways, and the PI3K/AKT pathway was chosen as the related signaling pathway to verify. Results: The patients with HCC had a significantly higher proportion of IFITM3 rs12252-CC compared with the patients with chronic HBV infection or liver cirrhosis. The distribution of the CC genotype in HCC patients with low differentiation was significantly higher than in those with high differentiation. Patients with the CC genotype were found to have larger tumor size, a higher percentage of vascular thrombosis, a higher distribution of low differentiation and a higher 5-year relapse rate than those with CT/TT genotypes. The expression of IFITM3 was higher in HCC tissues than in adjacent normal tissues, and the level of IFITM3 was higher in HCC tissues with low differentiation and metastasis than in those with high/medium differentiation and without metastasis. A higher RNA level of IFITM3 was found in the CC genotype than in the TT genotype. In PLC/PRF/5 cells with knockdown, the ability of cell proliferation and migration was inhibited. Analysis of the RNA sequencing and verification by RT-PCR showed that the phosphatidylinositol 3-kinase/protein kinase B/mammalian target of rapamycin (PI3K/AKT/mTOR) pathway was associated with IFITM3 knockdown. With the inhibition of IFITM3, the PI3K/AKT/mTOR signaling pathway was blocked and the expression of vimentin was decreased. Conclusions: IFITM3 rs12252-CC, with its higher expression, plays a vital role in the progression of HCC by regulating HCC cell proliferation and migration. These effects are associated with the PI3K/AKT/mTOR signaling pathway.

Keywords: IFITM3, interferon-induced transmembrane protein 3, HCC, hepatocellular carcinoma, PI3K/AKT/mTOR, phosphatidylinositol 3-kinase/protein kinase B/mammalian target of rapamycin

Procedia PDF Downloads 124
220 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and the consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of a GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal context, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach, enhanced with entropy calculation and spectral information, can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real-time to perform forecasting and classification tasks to aid asset management engineers in operating their machines more efficiently and reducing unplanned downtimes. A series of trials is planned for this model in other manufacturing industries in the future.
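As an illustration of the kind of entropy and spectral-change features described above, the following is a minimal sketch computed over fixed windows of a single sensor series; the window length, histogram bins, and the synthetic flow signal are assumptions for illustration and do not represent the authors' actual model.

import numpy as np

def window_features(signal, window=64, bins=16):
    """Shannon entropy and dominant spectral bin per non-overlapping window.

    Hand-crafted features like these are an illustrative stand-in for the
    entropy and spectral-change estimates that augment the GNN in the model.
    """
    feats = []
    for start in range(0, len(signal) - window + 1, window):
        w = signal[start:start + window]
        hist, _ = np.histogram(w, bins=bins)          # amplitude histogram
        p = hist / hist.sum()
        p = p[p > 0]
        entropy = -np.sum(p * np.log2(p))             # Shannon entropy of the window
        spectrum = np.abs(np.fft.rfft(w - w.mean()))  # magnitude spectrum, DC removed
        dominant = np.argmax(spectrum[1:]) + 1        # index of the dominant frequency bin
        feats.append((entropy, dominant))
    return np.array(feats)

# Example: a synthetic flow signal with a behavioural change half-way through
t = np.arange(2048)
flow = np.sin(0.05 * t) + 0.1 * np.random.randn(t.size)
flow[1024:] += np.sin(0.3 * t[1024:])                 # subtle extra oscillation after the change
print(window_features(flow)[:4])

A change point in the signal shows up as a shift in these per-window features, which is the kind of contextual cue a downstream classifier or forecaster can exploit.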

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 150
219 Empirical Study of Innovative Development of Shenzhen Creative Industries Based on Triple Helix Theory

Authors: Yi Wang, Greg Hearn, Terry Flew

Abstract:

In order to understand how cultural innovation occurs, this paper explores the interaction in Shenzhen, China, between universities, creative industries, and government in the creative economy using the Triple Helix framework. During the past two decades, the Triple Helix has been recognized as a new theory of innovation to inform and guide policy-making in national and regional development. Universities and governments around the world, especially in developing countries, have taken actions to strengthen connections with creative industries to develop regional economies. To date, research based on the Triple Helix model has focused primarily on science and technology collaborations, largely ignoring other fields. Hence, there is an opportunity to seek a better understanding of how the Triple Helix framework might apply in the field of creative industries and what knowledge might be gleaned from such an undertaking. Since the late 1990s, the concept of ‘creative industries’ has been introduced into policy and academic discourse. The development of creative industries policy by city agencies has improved city wealth creation and economic capital. It claims to generate a ‘new economy’ of enterprise dynamics and activities for urban renewal through the arts and digital media, via knowledge transfer in knowledge-based economies. Creative industries also involve commercial inputs to the creative economy that dynamically reshape the city into an innovative culture. In particular, this paper concentrates on creative spaces (incubators, digital tech parks, maker spaces, art hubs) where academia, industry and government interact. China has sought to enhance the brand of its manufacturing industry through cultural policy. It aims to shift the image of ‘Made in China’ to ‘Created in China’ as well as to give Chinese brands more international competitiveness in a global economy. Shenzhen is a notable example in China of an international knowledge-based city following this path. In 2009, the Shenzhen Municipal Government proposed the city slogan ‘Build a Leading Cultural City’ to signal the government’s strong will to develop Shenzhen’s cultural capacity and creativity. The vision of Shenzhen is to become a cultural innovation center, a regional cultural center and an international cultural city. However, there has been a lack of attention to Triple Helix interactions in the creative industries in China. In particular, there is limited knowledge about how interactions in co-located creative spaces within Triple Helix networks influence city-based innovation. That is, the roles of participating institutions need to be better understood. Thus, this paper discusses the interplay between universities, creative industries and government in Shenzhen. Secondary analysis and documentary analysis are used as methods in an effort to practically ground and illustrate this theoretical framework. Furthermore, this paper explores how creative spaces are being used to implement the Triple Helix in the creative industries, in particular the new combinations of resources generated from the consolidation of, and interactions among, the institutions. This study thus provides an innovative lens to understand the components, relationships and functions that exist within creative spaces by applying the Triple Helix framework to the creative industries.

Keywords: cultural policy, creative industries, creative city, Triple Helix

Procedia PDF Downloads 206
218 All-In-One Universal Cartridge Based Truly Modular Electrolyte Analyzer

Authors: S. Dalvi, N. Sane, V. Patil, D. Bansode, A. Tharakan, V. Mathur

Abstract:

Measurement of routine clinical electrolyte tests is common in labs worldwide for screening of illness or disease. All analyzers for the measurement of electrolyte parameters have separate sensors, reagents, a sampler, pump tubing, valves and other tubing that are either expensive, require heavy maintenance, or have a short shelf life. Moreover, the costs required to maintain such lab instrumentation are high, and this limits the use of the device to highly specialized personnel and sophisticated labs. In order to provide healthcare diagnostics to all at affordable cost, there is a need for an all-in-one universal modular cartridge that contains the sensors, reagents, sampler, valve, pump tubing, and other tubing in one single integrated module-in-module cartridge that is affordable, reliable, easy to use, requires very low sample volume and is truly modular and maintenance-free. DiaSys India has developed a world-first, patent-pending, versatile all-in-one universal module-in-module cartridge based electrolyte analyzer (QDx InstaLyte) that can perform sodium, potassium, chloride, calcium, pH and lithium tests. QDx InstaLyte incorporates a high-performance, inexpensive all-in-one universal cartridge for rapid quantitative measurement of electrolytes in body fluids. Our proposed methodology utilizes advanced and improved long-life ISE sensors to provide a sensitive and accurate result in 120 sec with just 100 µl of sample volume. The all-in-one universal cartridge has very low reagent consumption, is capable of a maximum of 1,000 tests with a use life of 3-4 months, and has a long shelf life of 12-18 months at 4-25 °C, making it very cost-effective. Methods: QDx InstaLyte analyzers with all-in-one universal modular cartridges were independently evaluated with three R&D lots for method performance (linearity, precision, method comparison, cartridge stability) to measure sodium, potassium and chloride. Method comparison was done against the Medica EasyLyte Plus Na/K/Cl electrolyte analyzer, a mid-size lab-based clinical chemistry analyzer, with N = 100 samples run over 10 days. The within-run precision study was done using modified CLSI guidelines with N = 20 samples, and the day-to-day precision study was done for 7 consecutive days using Trulab N & P quality control samples. Accelerated stability testing was done at 45 °C for 4 weeks with production lots. Results: Data analysis indicates that the CV for within-run precision is ≤ 1% for Na, ≤ 2% for K, and ≤ 2% for Cl, with R² ≥ 0.95 for the method comparison. Further, the all-in-one universal cartridge is stable for up to 12-18 months at 4-25 °C storage temperature based on preliminary extrapolated data. Conclusion: The developed technology platform of the all-in-one universal module-in-module cartridge based QDx InstaLyte is reliable, meets all the performance specifications of the lab, and is truly modular and maintenance-free. Hence, it can be easily adapted for low-cost, sensitive and rapid measurement of electrolyte tests in low-resource settings such as urban, semi-urban and rural areas in developing countries and can be used as a point-of-care testing system for worldwide applications.
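For reference, the within-run coefficient of variation and the method-comparison R² quoted above can be computed as in the sketch below; all sample values are invented for illustration and are not the study's data.

import numpy as np

# Hypothetical within-run Na+ results (mmol/L) on one control sample (n = 20)
na_results = np.array([140.1, 139.8, 140.4, 140.0, 139.9, 140.2, 140.3, 139.7,
                       140.1, 140.0, 139.9, 140.2, 140.1, 139.8, 140.0, 140.3,
                       139.9, 140.1, 140.0, 140.2])
cv_percent = 100 * na_results.std(ddof=1) / na_results.mean()
print(f"Within-run CV = {cv_percent:.2f}%")   # acceptance target quoted above: <= 1% for Na

# Hypothetical paired results for method comparison (candidate vs. comparator analyzer)
candidate  = np.array([138.0, 141.5, 144.2, 136.8, 139.9, 142.7])
comparator = np.array([137.6, 141.9, 143.8, 137.2, 140.3, 142.2])
slope, intercept = np.polyfit(comparator, candidate, 1)
r = np.corrcoef(comparator, candidate)[0, 1]
print(f"Regression: y = {slope:.3f}x + {intercept:.2f}, R^2 = {r**2:.3f}")  # target: R^2 >= 0.95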

Keywords: all-in-one modular cartridge, electrolytes, maintenance free, QDx InstaLyte

Procedia PDF Downloads 31
217 Statistical Models and Time Series Forecasting on Crime Data in Nepal

Authors: Dila Ram Bhandari

Abstract:

Throughout the 20th century, new governments were created in which identities such as ethnic, religious, linguistic, caste, communal, and tribal played a part in the development of constitutions and the legal systems of victim and criminal justice. Acute issues with extremism, poverty, environmental degradation, cybercrime, human rights violations, and crime against, and victimization of, both individuals and groups have recently plagued South Asian nations. Every day a massive number of crimes is committed, and these frequent crimes have made the lives of ordinary citizens restless. Crime is one of the major threats to society and to civilization, and it can create societal disturbance. Old-style crime-solving practices are unable to live up to the requirements of the current crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. The South Asia region lacks a regional coordination mechanism, unlike the Central Asia and Asia-Pacific regions, to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism. The Data Detective program from the software company Sentient uses data mining techniques to support the police (Sentient, 2017). The goals of this work are to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that was aggregated daily to produce a univariate dataset. Moreover, a daily incidence-type aggregation was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models. A comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records of 2005-2019 were collected from Nepal Police Headquarters and analysed using R. In conclusion, a gated recurrent unit implementation could benefit the police in predicting crime. Hence, time series analysis using GRU could be a prospective additional feature in Data Detective.
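A minimal sketch of a GRU-based one-step-ahead forecaster of the kind described, written here in PyTorch on a synthetic daily count series; the architecture, window length, and training settings are illustrative assumptions rather than the configuration used in the study.

import numpy as np
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    """Single-layer GRU that maps a window of past daily counts to the next value."""
    def __init__(self, hidden=32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, window, 1)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])    # forecast from the last hidden state

# Synthetic "daily incident count" series with weekly seasonality (illustrative only)
days = np.arange(1500)
series = 50 + 10 * np.sin(2 * np.pi * days / 7) + np.random.poisson(5, days.size)
window = 28
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = torch.tensor(X, dtype=torch.float32).unsqueeze(-1)
y = torch.tensor(y, dtype=torch.float32).unsqueeze(-1)

model, loss_fn = GRUForecaster(), nn.MSELoss()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(20):                    # short illustrative training loop
    optim.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optim.step()
print(f"final training MSE: {loss.item():.2f}")

In practice the series would be split chronologically into training and test periods and the forecast horizon extended to the seven days mentioned above; this sketch only shows the model shape.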

Keywords: time series analysis, forecasting, ARIMA, machine learning

Procedia PDF Downloads 164
216 Stability of Porous SiC Based Materials under Relevant Conditions of Radiation and Temperature

Authors: Marta Malo, Carlota Soto, Carmen García-Rosales, Teresa Hernández

Abstract:

SiC-based composites are candidates for possible use as structural and functional materials in future fusion reactors, with their main intended role in the blanket modules. In the blanket, the neutrons produced in the fusion reaction slow down, and their energy is transformed into heat in order to finally generate electrical power. In the blanket design named Dual Coolant Lead Lithium (DCLL), a PbLi alloy for power conversion and tritium breeding circulates inside hollow channels called Flow Channel Inserts (FCIs). These FCIs must protect the steel structures against the highly corrosive liquid PbLi and the high temperatures, but also provide electrical insulation in order to minimize magnetohydrodynamic interactions of the flowing liquid metal with the high magnetic field present in a magnetically confined fusion environment. Due to its nominally high temperature and radiation stability as well as corrosion resistance, SiC is the main choice for the flow channel inserts. Its significantly lower manufacturing cost makes porous SiC (a dense coating is required in order to assure protection against corrosion and to act as a tritium barrier) a firm alternative to SiC/SiC composites for this purpose. This application requires the materials to be exposed to high radiation levels and extreme temperatures, conditions for which previous studies have shown noticeable changes in both the microstructure and the electrical properties of different types of silicon carbide. Both the initial properties and the radiation/temperature-induced damage strongly depend on the crystal structure, polytype, and impurities/additives that are determined by the fabrication process, so the development of a suitable material requires full control of these variables. For this work, several SiC samples with different percentages of porosity and sintering additives were manufactured by the so-called sacrificial template method at the Ceit-IK4 Technology Center (San Sebastián, Spain) and characterized at Ciemat (Madrid, Spain). Electrical conductivity was measured as a function of temperature before and after irradiation with 1.8 MeV electrons in the Ciemat HVEC Van de Graaff accelerator up to 140 MGy (~2×10⁻⁵ dpa). Radiation-induced conductivity (RIC) was also examined during irradiation at 550 ºC for different dose rates (from 0.5 to 5 kGy/s). Although no significant RIC was found in general for any of the samples, an increase in electrical conductivity with irradiation dose was observed for some compositions, with a linear tendency. However, first results indicate enhanced radiation resistance for coated samples. Preliminary thermogravimetric tests of selected samples, together with subsequent XRD analysis, allowed the radiation-induced modification of the electrical conductivity to be interpreted in terms of changes in the SiC crystalline structure. Further analysis is needed in order to confirm this.

Keywords: DCLL blanket, electrical conductivity, flow channel insert, porous SiC, radiation damage, thermal stability

Procedia PDF Downloads 201
215 The Real Ambassador: How Hip Hop Culture Connects and Educates across Borders

Authors: Frederick Gooding

Abstract:

This paper explores how many Hip Hop artists have intentionally and strategically invoked the sustainability principles of people, planet and profits as a means to create community, and to compensate for and cope with structural inequalities in society. These themes not only create community within one's country; the powerful display and demonstration of these narratives create community on a global plane. Listeners of Hip Hop are therefore able to learn about the political events occurring in another country free of censure, and establish solidarity worldwide. Hip Hop can therefore be an ingenious tool to create self-worth, recycle positive imagery, and serve as a defense mechanism against institutional and structural forces that conspire to make an upward economic and social trajectory difficult, if not impossible, for many people of color all across the world. Although the birthplace of Hip Hop, the United States of America, is still predominantly White, it has undoubtedly grown more diverse at a breathtaking pace in recent decades. Yet whether American mainstream media will fully reflect America's newfound diversity remains to be seen. As it stands, American mainstream media is seen and enjoyed by diverse audiences not just in America, but all over the world. Thus, it is imperative that further inquiry is conducted into one of the fastest growing genres within one of the world's largest and most influential media industries, generating upwards of $10 billion annually. More importantly, hip hop, its music and associated culture collectively represent a shared social experience of significant value. They are important tools used both to inform and to influence economic, social and political identity. Conversely, principles of American exceptionalism often prioritize American political issues over those of others, thereby rendering a myopic political view within the mainstream. This paper therefore engages in an international contextualization of the global phenomenon entitled Hip Hop by exploring the creative genius and marketing appeal of Hip Hop within the global context of information technology, political expression and social change, in addition to taking a critical look at historically racialized imagery within mainstream media. Many artists the world over have been able to freely express themselves and connect with broader communities outside of their own borders, all through the sound practice of the craft of Hip Hop. An empirical understanding of political, social and economic forces within the United States will serve as a bridge for identifying and analyzing transnational themes of commonality for typically marginalized or disaffected communities facing similar struggles for survival and respect. The sharing of commonalities of marginalized cultures not only serves as a source of education outside of typically myopic mainstream sources, it also creates transnational bonds globally to the extent that practicing artists resonate with many of the original themes of (now mostly underground) Hip Hop, as did many of the African American artists responsible for creating and fostering Hip Hop's powerful outlet of expression. Hip Hop's power of connectivity and culture-sharing transnationally across borders provides a key source of education to be taken seriously by academics.

Keywords: culture, education, global, hip hop, mainstream music, transnational

Procedia PDF Downloads 101
214 Formation of Science Literacy Based on the Indigenous Science of Mbaru Niang Manggarai

Authors: Yuliana Wahyu, Ambros Leonangung Edu

Abstract:

The learning praxis proposed by the 2013 Curriculum (K-13) is no longer school-oriented and supply-driven, but demand-driven. This vision is connected with the Jokowi-Kalla Nawacita program to create a competitive nation in the global era. Competition is a social fact that must be faced; therefore, the curriculum is designed to form innovators and entrepreneurs. To reach this goal, K-13 implements character education, which aims at creating innovators and entrepreneurs from an early age (primary school). One part of strengthening it is literacy formation (reading, numeracy, science, ICT, finance, and culture). Thus, science literacy is an integral part of character education. The above outputs are only formed through innovative processes in intra-curricular (blended learning), co-curricular (hands-on learning) and extra-curricular (personalized learning) activities. Unlike previous curricula, in which children crammed theories in a predominantly intellectual process, the new breakthroughs make natural, social, and cultural phenomena the learning sources. For example, science in primary schools places biology as the platform, and makes natural, social, and cultural phenomena the learning field so that students can learn, discover, and solve concrete problems, with prospects for development and application in their everyday lives. Science education is not only about collecting facts on natural phenomena but also about methods and scientific attitudes. In turn, science will form science literacy. Science literacy comprises critical, creative, logical, and initiative-taking competences in responding to issues of culture, science and technology. This is linked with the nature of science, which includes hands-on and minds-on activities. To sustain the effectiveness of science learning, K-13 opens a new way of viewing a contextual learning model in which facts or natural phenomena are drawn closer to the child's learning environment to be studied and analyzed scientifically. Thus, the topics of elementary science discussion are the practical and contextual things that students encounter. This research seeks to contextualize science in primary schools in Manggarai, NTT, by placing local wisdom as a learning source and medium to form science literacy. Explicitly, this study uncovers the concepts of science and mathematics in the Mbaru Niang. The Mbaru Niang is a potential so far forgotten by the centralistic, theory-oriented mainstream curriculum. In fact, the traditional Manggarai community stores and passes on much indigenous scientific and mathematical knowledge. The traditional house structures are full of science and mathematics knowledge; every detail has style, sound and mathematical symbols. By learning this, students are able to collaborate and synergize the content and learning resources in their learning activities. This is constructivist contextual learning that will be applied in meaningful learning. Meaningful learning allows students to learn by doing. Students then connect topics to the context, and science literacy is constructed from their factual experiences. The research will be conducted in Manggarai through observation, interviews, and literature study.

Keywords: indigenous science, Mbaru Niang, science literacy, science

Procedia PDF Downloads 209
213 The Use of Telecare in the Re-design of Overnight Supports for People with Learning Disabilities: Implementing a Cluster-based Approach in North Ayrshire

Authors: Carly Nesvat, Dominic Jarrett, Colin Thomson, Wilma Coltart, Thelma Bowers, Jan Thomson

Abstract:

Introduction: Within Scotland, the Same As You strategy committed to moving people with learning disabilities out of long-stay hospital accommodation into homes in the community. Much of the focus of this movement was on the placement of people within individual homes. In order to achieve this, potentially excessive supports were put in place which created dependence, and carried significant ongoing cost primarily for local authorities. The greater focus on empowerment and community participation which has been evident in more recent learning disability strategy, along with the financial pressures being experienced across the public sector, created an imperative to re-examine that provision, particularly in relation to the use of expensive sleepover supports to individuals, and the potential for this to be appropriately scaled back through the use of telecare. Method: As part of a broader programme of redesigning overnight supports within North Ayrshire, a cluster of individuals living in close proximity were identified, who were in receipt of overnight supports, but who were identified as having the capacity to potentially benefit from their removal. In their place, a responder service was established (an individual staying overnight in a nearby service user’s home), and a variety of telecare solutions were placed within individual’s homes. Active and passive technology was connected to an Alarm Receiving Centre, which would alert the local responder service when necessary. Individuals and their families were prepared for the change, and continued to be informed about progress with the pilot. Results: 4 individuals, 2 of whom shared a tenancy, had their sleepover supports removed as part of the pilot. Extensive data collection in relation to alarm activation was combined with feedback from the 4 individuals, their families, and staff involved in their support. Varying perspectives emerged within the feedback. 3 of the individuals were clearly described as benefitting from the change, and the greater sense of independence it brought, while more concerns were evident in relation to the fourth. Some family members expressed a need for greater preparation in relation to the change and ongoing information provision. Some support staff also expressed a need for more information, to help them understand the new support arrangements for an individual, as well as noting concerns in relation to the outcomes for one participant. Conclusion: Developing a telecare response in relation to a cluster of individuals was facilitated by them all being supported by the same care provider. The number of similar clusters of individuals being identified within North Ayrshire is limited. Developing other solutions such as a response service for redesign will potentially require greater collaboration between different providers of home support, as well as continuing to explore the full range of telecare, including digital options. The pilot has highlighted the need for effective preparatory and ongoing engagement with staff and families, as well as the challenges which can accompany making changes to long-standing packages of support.

Keywords: challenges, change, engagement, telecare

Procedia PDF Downloads 177
212 Phycoremediation of Heavy Metals by Marine Macroalgae Collected from Olaikuda, Rameswaram, Southeast Coast of India

Authors: Suparna Roy, Anatharaman Perumal

Abstract:

Industrial effluents with high amounts of heavy metals are known to have adverse effects on the environment. For the removal of heavy metals from the aqueous environment, different conventional treatment technologies have been applied, which are not economically beneficial and also produce huge quantities of toxic chemical sludge. Bio-sorption of heavy metals by marine plants is therefore an eco-friendly, innovative and alternative technology for the removal of these pollutants from the aqueous environment. The aim of this study is to evaluate the capacity of some selected marine macroalgae (seaweeds) to accumulate and remove heavy metals from the marine environment. Methods: The seaweeds Acanthophora spicifera (Vahl.) Boergesen, Codium tomentosum Stackhouse, Halimeda gracilis Harvey ex. J. Agardh, Gracilaria opuntia Durairatnam. nom. inval., Valoniopsis pachynema (Martens) Boergesen, Caulerpa racemosa var. macrophysa (Sonder ex Kutzing) W. R. Taylor and Hydroclathrus clathratus (C. Agardh) Howe were collected from Olaikuda (09°17.526'N-079°19.662'E), Rameswaram, southeast coast of India, during the post-monsoon period (April 2016). The seaweeds were washed repeatedly with sterilized and filtered in-situ seawater to remove all epiphytes and debris, and the clean seaweeds were shade-dried for one week. The dried seaweeds were ground to powder, and one g of powdered seaweed was taken in a 250 ml conical flask; 8 ml of 10% HNO3 (70% pure) was added to each sample and kept at room temperature (28 °C) for 24 hours. The samples were then heated on a hotplate at 120 °C and boiled to dryness, 20 ml of nitric acid:perchloric acid (4:1) was added, and the samples were again heated on the hotplate at 90 °C and evaporated to dryness. The samples were then allowed to cool for a few minutes at room temperature, 10 ml of 10% HNO3 was added, and they were kept for 24 hours in a cool, dark place and filtered with Whatman (589/2) filter paper. The filtrates were collected in clean 250 ml conical flasks and diluted accurately to a 25 ml volume with double-deionised water, and triplicates of each sample were analysed by inductively coupled plasma analysis (ICP-OES) for a total of eleven heavy metals (Ag, Cd, B, Cu, Mn, Co, Ni, Cr, Pb, Zn, and Al) in the specified species; the data were statistically evaluated for standard deviation. Results: Acanthophora spicifera contained the highest amount of Ag (0.1±0.2 mg/mg), followed by Cu (0.16±0.01 mg/mg), Mn (1.86±0.02 mg/mg) and B (3.59±0.2 mg/mg); Halimeda gracilis showed the highest accumulation of Al (384.75±0.12 mg/mg); Valoniopsis pachynema accumulated the maximum amounts of Co (0.12±0.01 mg/mg) and Zn (0.64±0.02 mg/mg); Caulerpa racemosa var. macrophysa contained Zn (0.63±0.01), Cr (0.26±0.01 mg/mg), Ni (0.21±0.05), Pb (0.16±0.03) and Cd (0.02±0.00). Hydroclathrus clathratus, Codium tomentosum and Gracilaria opuntia also contained appreciable amounts of heavy metals. Conclusions: The mentioned species of seaweeds play an important role in decreasing heavy metal pollution in the marine environment through bioaccumulation. Hence, these species can be utilised to remove excess amounts of heavy metals from polluted areas.

Keywords: heavy metals pollution, seaweeds, bioaccumulation, eco-friendly, phyco-remediation

Procedia PDF Downloads 235
211 Regional Barriers and Opportunities for Developing Innovation Networks in the New Media Industry: A Comparison between Beijing and Bangalore Regional Innovation Systems

Authors: Cristina Chaminade, Mandar Kulkarni, Balaji Parthasarathy, Monica Plechero

Abstract:

The characteristics of a regional innovation system (RIS) and the specificity of the knowledge base of an industry may contribute to creating peculiar paths for innovation and for the development of firms' geographically extended innovation networks. However, the related empirical evidence in emerging economies remains underexplored. The paper aims to fill this research gap by means of recent qualitative research conducted in 2016 in Beijing (China) and Bangalore (India). It analyzes case studies of firms in the new media industry, a sector that merges different IT competences with competences from other knowledge domains and that is emerging in those RIS. The results show that while in Beijing the new media sector appears to be more in line with the existing institutional setting and governmental goals aimed at targeting specific social aspects and social problems of the population, in Bangalore it remains a more spontaneous, firm-led process. In Beijing, what matters for the development of innovation networks is the governmental setting and the national and regional strategies to promote science and technology in this sector, the internet, and mass innovation. The peculiarities of recent governmental policies aligned with domestic goals may provide good possibilities for start-ups to develop innovation networks. However, because those policies target the Chinese market, networking outside the domestic market is not much promoted. Moreover, while some institutional peculiarities, such as a culture of collaboration in the region, may be favorable for local networking, regulations related to internet censorship may limit the use of global networks, particularly when based on virtual spaces. Mainly firms that already have some foreign experience and contacts take advantage of global networks. In Bangalore, the role of government in pushing networking for the new media industry is at present largely absent at all geographical levels. Indeed, there is no particular strategic planning or prioritization of the new media industry in the region, although one industry organization has emerged to represent the interests of the animation industry. This results in a lack of initiatives for sustaining the integration of complementary knowledge into the local portfolio of IT specialization. Firms actually involved in the new media industry face institutional constraints related to a poor level of local trust and cooperation, something that does not allow for full exploitation of local linkages. Moreover, knowledge-provider organizations in Bangalore still remain a solid base for the IT domain, but not for other domains. Initiatives to link to international networks therefore seem to be more the result of individual entrepreneurial actions aimed at acquiring complementary knowledge and competencies from different domains and exploiting potential in different markets. From these cases, it emerges that the roles of government, soft institutions and organizations in the two RIS differ substantially in creating barriers and opportunities for the development of innovation networks and their specific aims.

Keywords: regional innovation system, emerging economies, innovation network, institutions, organizations, Bangalore, Beijing

Procedia PDF Downloads 323
210 Case Study on Innovative Aquatic-Based Bioeconomy for Chlorella sorokiniana

Authors: Iryna Atamaniuk, Hannah Boysen, Nils Wieczorek, Natalia Politaeva, Iuliia Bazarnova, Kerstin Kuchta

Abstract:

Over the last decade, due to climate change and a strategy of natural resource preservation, interest in aquatic biomass has dramatically increased. Along with mitigation of environmental pressure and the connection of waste streams (including CO2 and heat emissions), the microalgae bioeconomy can supply food and feed, as well as the pharmaceutical and power industries, with a number of value-added products. Furthermore, in comparison to conventional biomass, microalgae can be cultivated under a wide range of conditions without compromising food and feed production, thus addressing issues associated with negative social and environmental impacts. This paper presents state-of-the-art technology for the microalgae bioeconomy from the cultivation process to the production of valuable components and by-streams. The microalgae Chlorella sorokiniana were cultivated in a pilot-scale innovation concept in Hamburg (Germany) using different systems, such as a raceway pond (5000 L) and flat panel reactors (8 x 180 L). In order to achieve optimum growth conditions along with a suitable cellular composition for the subsequent extraction of the value-added components, process parameters such as light intensity, temperature and pH are continuously monitored. On the other hand, metabolic needs for nutrients were met by the addition of micro- and macro-nutrients into the medium to ensure autotrophic growth conditions of the microalgae. Cultivation was followed by downstream processing and extraction of lipids, proteins and saccharides. Lipid extraction was conducted in repeated-batch semi-automatic mode using the hot extraction method according to Randall. Hexane and ethanol were used as solvents at ratios of 9:1 and 1:9, respectively. Depending on the cell disruption method and the solvent ratio, the total lipid content showed significant variations between 8.1% and 13.9%. The highest percentage of extracted biomass was reached with a sample pretreated with microwave digestion using 90% hexane and 10% ethanol as solvents. Protein content in the microalgae was determined by two different methods, namely Total Kjeldahl Nitrogen (TKN), which was subsequently converted to protein content, and the Bradford method using Brilliant Blue G-250 dye. The obtained results showed a good correlation between both methods, with the protein content being in the range of 39.8-47.1%. Characterization of neutral and acid saccharides from the microalgae was conducted by the phenol-sulfuric acid method at two wavelengths of 480 nm and 490 nm. The average concentrations of neutral and acid saccharides under the optimal cultivation conditions were 19.5% and 26.1%, respectively. Subsequently, the biomass residues were used as substrate for anaerobic digestion at the laboratory scale. The methane concentration, which was measured on a daily basis, showed some variation between samples after the extraction steps but was in the range of 48% to 55%. The CO2 that is formed during the fermentation process and after combustion in the Combined Heat and Power unit can potentially be used within the cultivation process as a carbon source for the photoautotrophic synthesis of biomass.

Keywords: bioeconomy, lipids, microalgae, proteins, saccharides

Procedia PDF Downloads 245
209 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms

Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee

Abstract:

Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites to prevent continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by means of detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting the changes in static or dynamic behavior of isotropic structures has been developed in the last two decades. These methods, based on analytical approaches, are limited in their capabilities in dealing with complex systems, primarily because of their limitations in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristics techniques and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA) methods, and neural networks (NN), and have promisingly applied these methods to the field of structural identification. Among them, GAs attract our attention because they do not require a considerable amount of data in advance in dealing with complex problems and can make a global solution search possible as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of Glass Fiber-reinforced Polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect the fiber property variation of laminated composite plates from the micromechanical point of view. The finite element model is used to study free vibrations of laminated composite plates for fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only first mode shapes in a structure for the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
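As a toy illustration of the inverse identification idea (not the authors' ABAQUS-coupled implementation), the sketch below uses a small real-coded genetic algorithm to recover the parameters of an assumed Gaussian stiffness-degradation profile on a one-dimensional beam by matching a few Rayleigh-quotient surrogate frequencies; the surrogate forward model, parameter bounds, and GA settings are all assumptions made here for illustration.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 201)          # normalized coordinate along a simply supported beam

def frequencies(center, width, severity, modes=3):
    """Rayleigh-quotient surrogate frequencies of a beam whose stiffness is
    locally reduced by a Gaussian-shaped degradation (toy forward model)."""
    stiffness = 1.0 - severity * np.exp(-((x - center) ** 2) / (2 * width ** 2))
    freqs = []
    for n in range(1, modes + 1):
        phi = np.sin(n * np.pi * x)                  # assumed undamaged mode shape
        curv = (n * np.pi) ** 2 * phi                # its curvature magnitude
        # uniform grid, so the grid spacing cancels in the ratio of sums
        freqs.append(np.sqrt(np.sum(stiffness * curv ** 2) / np.sum(phi ** 2)))
    return np.array(freqs)

target = frequencies(0.35, 0.08, 0.4)                # "measured" data from a hidden damage state

def fitness(p):
    return -np.sum((frequencies(*p) - target) ** 2)  # negative squared frequency mismatch

bounds = np.array([[0.1, 0.9], [0.02, 0.2], [0.0, 0.8]])   # center, width, severity
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 3))
for gen in range(60):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]                  # keep the fitter half
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(0, 20, 2)]               # pick two parents
        child = 0.5 * (a + b) + rng.normal(0.0, 0.02, 3)     # blend crossover + mutation
        children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.vstack([parents, np.array(children)])
best = pop[np.argmax([fitness(p) for p in pop])]
print("best candidate (center, width, severity):", np.round(best, 3))
# compare with the hidden values (0.35, 0.08, 0.4); in the full study the forward
# model is a finite element analysis and the measured data include mode shapes

The same loop structure applies when the forward model is replaced by a finite element eigenvalue analysis; the GA only needs the fitness value, which is what makes it attractive for this kind of inverse problem.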

Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences

Procedia PDF Downloads 274
208 On the Limits of Board Diversity: Impact of Network Effect on Director Appointments

Authors: Vijay Marisetty, Poonam Singh

Abstract:

Research on the effect of directors' network connections on investor welfare is inconclusive. Some studies suggest that directors' connections are beneficial in terms of improving earnings information and firm valuation for new investors. On the other hand, adverse effects of directorial networks are also reported, in terms of higher earnings management, options backdating fraud, reduced firm performance, and weaker board monitoring. From a regulatory perspective, the role of directorial networks in corporate welfare is crucial. Cognizant of the possible ill effects associated with directorial networks, large investors, seeking better representation on boards, are building their own databases of highly qualified prospective directors sourced from outside the highly connected directorial labor market. For instance, following the Dodd-Frank Reform Act, the California Public Employees' Retirement System (CalPERS) has initiated a database for registering aspiring and highly qualified directors in order to nominate them for board seats (proxy access). Our paper stems from this background and explores the chances of outside directors who lack established network connections obtaining directorships. The paper identifies such aspiring directors by accessing unique Indian data sourced from an online portal that aims to match the supply of registered aspirants with the growing demand for outside directors in India. The online portal's tie-up with stock exchanges ensures that firms can access this new pool of directors. Such direct access to the background details of aspiring directors over a period of 10 years allows us to examine the chances of aspiring directors without a corporate network entering the directorial network. Using this resume data of 16,105 aspiring corporate directors in India who have no prior board experience, the paper analyses the entry dynamics of the corporate directors' labor market. The database also allows us to investigate the value of the corporate network by comparing non-network new entrants with incumbent networked directors. The study develops measures of network centrality and network degree based on merit, i.e., networks of individuals belonging to elite educational institutions such as the Indian Institutes of Management (IIM) or the Indian Institutes of Technology (IIT), and based on job or company, i.e., networks of individuals serving in the same company. The paper then measures the impact of these networks on the appointment of first-time directors and on subsequent director appointments. The paper reports the following main results: 1. The likelihood of becoming a corporate director without corporate network strength is only 1 out of 100 aspirants, in spite of comparable educational backgrounds and similar durations of corporate experience. 2. Aspiring non-network directors' elite educational ties help them to secure directorships; however, after board appointment, their newly acquired corporate network strength overtakes these ties as the main determinant of subsequent board appointments and compensation. The results thus highlight the limitations in increasing board diversity.
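As an illustration of the kind of company-based and merit-based director networks described, the sketch below builds toy affiliation graphs and computes degree centrality with NetworkX; the director names, affiliations, and the specific centrality measure are assumptions for illustration, not the paper's actual measures or data.

import networkx as nx

# Toy affiliation data: director -> companies served and elite institutions attended
# (all names are invented for illustration)
boards = {"A": {"AlphaLtd", "BetaCorp"}, "B": {"BetaCorp"}, "C": {"GammaInc"}, "D": set()}
alumni = {"A": {"IIM-A"}, "B": {"IIT-B"}, "C": {"IIM-A"}, "D": {"IIT-B"}}

def tie_graph(memberships):
    """Connect two directors whenever they share at least one affiliation."""
    g = nx.Graph()
    g.add_nodes_from(memberships)
    names = list(memberships)
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            if memberships[u] & memberships[v]:
                g.add_edge(u, v)
    return g

company_net = tie_graph(boards)   # "job/company"-based network
merit_net = tie_graph(alumni)     # "merit"-based network (elite institutions)

print("company-network degree centrality:", nx.degree_centrality(company_net))
print("merit-network degree centrality:  ", nx.degree_centrality(merit_net))

A director such as "D" with no prior board seats has zero company-network centrality but can still score on the merit network, which mirrors the distinction the study draws between the two channels into the directorial labor market.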

Keywords: aspiring corporate directors, board diversity, director labor market, director networks

Procedia PDF Downloads 312
207 Determination of Slope of Hilly Terrain by Using Proposed Method of Resolution of Forces

Authors: Reshma Raskar-Phule, Makarand Landge, Saurabh Singh, Vijay Singh, Jash Saparia, Shivam Tripathi

Abstract:

For any construction project, slope calculations are necessary in order to evaluate constructability on the site, such as the slope of parking lots, sidewalks, and ramps, the slope of sanitary sewer lines, and the slope of roads and highways. When slopes and grades are to be determined, designers are concerned with establishing proper slopes and grades for their projects in order to assess cut and fill volume calculations and determine inverts of pipes. There are several established instruments commonly used to determine slopes, such as the Dumpy level, Abney level or Hand level, inclinometer, tacheometer, Henry method, etc., and surveyors are very familiar with the use of these instruments to calculate slopes. However, these instruments have drawbacks that cannot be neglected in major surveying works. Firstly, they require expert surveyors and skilled staff. Accessibility, visibility, and accommodation in remote hilly terrain are difficult for these instruments and surveying teams. Also, the determination of gentle slopes for road and sewer drainage constructions in congested urban places with these instruments is not easy. This paper aims to develop a method that requires minimum field work, minimum instruments, no high-end technology or software, and low cost. Using only basic and handy surveying accessories, such as a plane table with a fixed weighing machine, standard weights, an alidade, a tripod, and ranging rods, the method should be able to determine the terrain slope in congested areas as well as in remote hilly terrain. Also, being simple and easy to understand and perform, local people in the rural area concerned can easily be trained for the proposed method. The idea for the proposed method is based on the principle of the resolution of weight components. When an object of standard weight ‘W’ is placed on an inclined surface with a weighing machine below it, the weighing machine measures only the cosine component of the weight. The slope can then be determined from the relation between the true (actual) weight and the apparent weight. A proper procedure is to be followed, which includes site location, centering and sighting work, fixing the whole set at the identified station, and finally taking the readings. A set of experiments for slope determination, for mild and moderate slopes, was carried out by the proposed method and by a theodolite, in a controlled environment on the college campus and in an uncontrolled environment on an actual site. The slopes determined by the proposed method were compared with those determined by the established instruments. For example, it was observed that, for the same distances on a mild slope, the difference between the slope obtained by the proposed method and that obtained by the established method ranged from 4′ for a distance of 8 m to 2°15′20″ for a distance of 16 m in the uncontrolled environment. Thus, for mild slopes, the proposed method is suitable for distances of 8 m to 10 m. The proposed method shows a good correlation of 0.91 to 0.99 with the established method for the various combinations of mild and moderate slopes in the controlled and uncontrolled environments.
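The core relation behind the proposed method can be expressed compactly; the short sketch below, with illustrative readings, assumes the weighing machine registers W·cosθ for a standard weight W.

```python
# Slope from the resolution of weight components: the machine on the incline reads
# the cosine component of the standard weight, so theta = arccos(W_apparent / W).
import math

def slope_from_weights(true_weight_kg, apparent_weight_kg):
    theta = math.acos(apparent_weight_kg / true_weight_kg)   # inclination in radians
    return math.degrees(theta), math.tan(theta)              # angle and gradient

angle_deg, gradient = slope_from_weights(true_weight_kg=10.0, apparent_weight_kg=9.94)
print(f"slope angle ≈ {angle_deg:.2f}°, gradient ≈ 1 in {1 / gradient:.1f}")
```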

Keywords: surveying, plane table, weight component, slope determination, hilly terrain, construction

Procedia PDF Downloads 96
206 Physiological Effects on Scientist Astronaut Candidates: Hypobaric Training Assessment

Authors: Pedro Llanos, Diego García

Abstract:

This paper aims to expand our understanding of the effects of hypoxia training on our bodies, in order to better model its dynamics and leverage some of its implications and effects on human health. Hypoxia training is a recommended practice for military and civilian pilots that allows them to recognize their early hypoxia signs and symptoms, and for Scientist Astronaut Candidates (SACs), who underwent hypobaric hypoxia (HH) exposure as part of a training activity for prospective suborbital flight applications. This observational-analytical study describes physiologic responses and symptoms experienced by a SAC group before, during and after HH exposure and proposes a model for assessing predicted versus observed physiological responses. A group of individuals with diverse Science, Technology, Engineering and Mathematics (STEM) backgrounds underwent a hypobaric training session to an altitude of up to 22,000 ft (FL220), or 6,705 meters, where heart rate (HR), breathing rate (BR) and core temperature (Tc) were monitored with the use of a chest strap sensor pre and post HH exposure. A pulse oximeter registered the level of oxygen saturation (SpO2) and the number and duration of desaturations during the HH chamber flight. Hypoxia symptoms as described by the SACs during the HH training session were also registered. These data allowed us to generate a preliminary predictive model of the oxygen desaturation and O2 pressure curve for each subject, which consists of a sixth-order polynomial fit during exposure and a fifth- or fourth-order polynomial fit during recovery. Data analysis showed no significant differences in HR and BR between pre and post HH exposure in most of the SACs, while Tc measures showed slight but consistent decreases. All subjects registered SpO2 greater than 94% for the majority of their individual HH exposures, but all of them presented at least one clinically significant desaturation (SpO2 < 85% for more than 5 seconds), and half of the individuals showed SpO2 below 87% for at least 30% of their HH exposure time. Finally, real-time collection of HH symptoms identified temperature somatosensory perceptions (SP), in 65% of individuals, and task-focus issues, in 52.5% of individuals, as the most common HH indications. 95% of the subjects experienced HH onset symptoms below FL180; all participants achieved full recovery of HH symptoms within 1 minute of donning their O2 mask. The current HH study performed on this group of individuals suggests a rapid and fully reversible physiologic response after HH exposure, as expected and as obtained in previous studies. Our data showed consistent results between predicted and observed SpO2 curves during HH, suggesting a mathematical function that may be used to model HH performance deficiencies. During the HH study, real-time HH symptoms were registered, with SP and task-focus issues evidenced as the earliest and most common indicators. Finally, an assessment of HH signs and symptoms in a heterogeneous group of non-pilot individuals showed similar results to previous studies in homogeneous populations of pilots.
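A hedged sketch of the fitting step is shown below; it uses synthetic SpO2 readings in place of the study's measurements and simply illustrates how a sixth-order fit during exposure and a lower-order fit during recovery could be obtained.

```python
# Polynomial fits to SpO2 during exposure (degree 6) and recovery (degree 4),
# using synthetic readings as stand-ins for the pulse-oximeter data.
import numpy as np

rng = np.random.default_rng(1)
t_exp = np.linspace(0, 20, 60)                      # minutes at altitude (synthetic)
spo2_exp = 97 - 0.9 * t_exp + 0.012 * t_exp**2 + rng.normal(0, 0.4, t_exp.size)

t_rec = np.linspace(0, 2, 30)                       # minutes after donning the O2 mask
spo2_rec = 82 + 8 * (1 - np.exp(-3 * t_rec)) + rng.normal(0, 0.3, t_rec.size)

coef_exposure = np.polyfit(t_exp, spo2_exp, deg=6)  # sixth-order fit during exposure
coef_recovery = np.polyfit(t_rec, spo2_rec, deg=4)  # fourth-order fit during recovery

rmse = np.sqrt(np.mean((np.polyval(coef_exposure, t_exp) - spo2_exp) ** 2))
print("exposure-fit RMSE (% SpO2):", round(rmse, 2))
```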

Keywords: slow onset hypoxia, hypobaric chamber training, altitude sickness, symptoms and altitude, pressure cabin

Procedia PDF Downloads 116
205 Applying an Automatic Speech Intelligent System to the Health Care of Patients Undergoing Long-Term Hemodialysis

Authors: Kuo-Kai Lin, Po-Lun Chang

Abstract:

Research Background and Purpose: Following the development of the Internet and multimedia, the Internet and information technology have become crucial avenues of modern communication and knowledge acquisition. The advantages of using mobile devices for learning include making learning borderless and accessible. Mobile learning has become a trend in disease management and health promotion in recent years. End-stage renal disease (ESRD) is an irreversible chronic disease, and patients who do not receive kidney transplants can only rely on hemodialysis or peritoneal dialysis to survive. Due to the complexities in caregiving for patients with ESRD that stem from their advanced age and other comorbidities, the patients’ incapacity of self-care leads to an increase in the need to rely on their families or primary caregivers, although whether the primary caregivers adequately understand and implement patient care is a topic of concern. Therefore, this study explored whether primary caregivers’ health care provisions can be improved through the intervention of an automatic speech intelligent system, thereby improving the objective health outcomes of patients undergoing long-term dialysis. Method: This study developed an automatic speech intelligent system with healthcare functions such as health information voice prompt, two-way feedback, real-time push notification, and health information delivery. Convenience sampling was adopted to recruit eligible patients from a hemodialysis center at a regional teaching hospital as research participants. A one-group pretest-posttest design was adopted. Descriptive and inferential statistics were calculated from the demographic information collected from questionnaires answered by patients and primary caregivers, and from a medical record review, a health care scale (recorded six months before and after the implementation of intervention measures), a subjective health assessment, and a report of objective physiological indicators. The changes in health care behaviors, subjective health status, and physiological indicators before and after the intervention of the proposed automatic speech intelligent system were then compared. Conclusion and Discussion: The preliminary automatic speech intelligent system developed in this study was tested with 20 pretest patients at the recruitment location, and their health care capacity scores improved from 59.1 to 72.8; comparisons through a nonparametric test indicated a significant difference (p < .01). The average score for their subjective health assessment rose from 2.8 to 3.3. A survey of their objective physiological indicators discovered that the compliance rate for the blood potassium level was the most significant indicator; its average compliance rate increased from 81% to 94%. The results demonstrated that this automatic speech intelligent system yielded a higher efficacy for chronic disease care than did conventional health education delivered by nurses. Therefore, future efforts will continue to increase the number of recruited patients and to refine the intelligent system. Future improvements to the intelligent system can be expected to enhance its effectiveness even further.
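The abstract does not name the nonparametric test used; the sketch below illustrates one common choice for a one-group pretest-posttest design, the Wilcoxon signed-rank test, applied to made-up paired caregiver scores of the same size as the reported sample.

```python
# Pre/post comparison of 20 paired caregiver health-care-capacity scores with a
# Wilcoxon signed-rank test. The scores are invented for illustration only.
from scipy.stats import wilcoxon

pre  = [55, 62, 58, 60, 61, 57, 63, 59, 56, 60, 58, 61, 57, 62, 59, 60, 58, 61, 57, 59]
post = [70, 75, 71, 74, 73, 69, 76, 72, 70, 74, 71, 75, 70, 73, 72, 74, 71, 76, 70, 72]

stat, p = wilcoxon(pre, post)
print(f"Wilcoxon statistic = {stat}, p = {p:.4f}")
```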

Keywords: automatic speech intelligent system for health care, primary caregiver, long-term hemodialysis, health care capabilities, health outcomes

Procedia PDF Downloads 110
204 Monitoring the Production of Large Composite Structures Using Dielectric Tool Embedded Capacitors

Authors: Galatee Levadoux, Trevor Benson, Chris Worrall

Abstract:

With the rise of public awareness of climate change comes an increasing demand for renewable sources of energy. As a result, the wind power sector is striving to manufacture longer, more efficient and more reliable wind turbine blades. Currently, one of the leading causes of blade failure in service is improper cure of the resin during manufacture. The infusion process creating the main part of the composite blade structure remains a critical step that is yet to be monitored in real time. This stage consists of a viscous resin being drawn into a mould under vacuum and then undergoing a curing reaction until solidification. A successful infusion assumes that the resin fills all the voids and cures completely. Given that the electrical properties of the resin change significantly during its solidification, both the filling of the mould and the curing reaction can be followed using dielectrometry. However, industrially available dielectric sensors are currently too small to monitor the entire surface of a wind turbine blade. The aim of the present research project is to scale up the dielectric sensor technology and develop a device able to monitor the manufacturing process of large composite structures, assessing the conformity of the blade before it even comes out of the mould. An array of flat copper wires acting as electrodes is embedded in a polymer matrix fixed in an infusion mould. A multi-frequency analysis from 1 Hz to 10 kHz is performed during the filling of the mould with an epoxy resin and the hardening of the said resin. By following the variations of the complex admittance Y*, the filling of the mould and the curing process are monitored. Results are compared to numerical simulations of the sensor in order to validate a virtual cure-monitoring system. The results obtained by drawing glycerol on top of the copper sensor displayed a linear relation between the wetted length of the sensor and the measured complex admittance. Drawing epoxy resin on top of the sensor and letting it cure at room temperature for 24 hours provided curves characteristic of those obtained when conventional interdigitated sensors are used to follow the same reaction. The response from the developed sensor has shown the different stages of the polymerization of the resin, validating the geometry of the prototype. The model created and analysed using COMSOL has shown that the dielectric cure process can be simulated, as long as sufficiently accurate time- and temperature-dependent material properties can be determined. The model can be used to help design larger sensors suitable for use with full-sized blades. The preliminary results obtained with the sensor prototype indicate that the infusion and curing process of an epoxy resin can be followed with the chosen configuration on a scale of several decimeters. Further work is to be devoted to studying the influence of the sensor geometry and the infusion parameters on the results obtained. Ultimately, the aim is to develop a larger-scale sensor able to monitor the flow and cure of large composite panels industrially.
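As an illustration of how the multi-frequency admittance data could be reduced (the equivalent-circuit values below are assumptions, not measurements from the project), the complex admittance Y* can be split at each frequency into a conductance and a capacitance:

```python
# Reduce a frequency sweep of complex admittance Y* = G + j*omega*C into a
# conductance G (resin loss, sensitive to cure) and a capacitance C, assuming a
# simple parallel RC model of the sensor cell.
import numpy as np

freqs = np.logspace(0, 4, 9)                 # 1 Hz to 10 kHz, as in the sweep above
R_resin, C_cell = 5.0e6, 2.0e-10             # assumed equivalent circuit of the cell

omega = 2 * np.pi * freqs
Y = 1.0 / R_resin + 1j * omega * C_cell      # admittance of the parallel RC model

G = Y.real                                   # conductance: tracks resin conductivity
C = Y.imag / omega                           # capacitance: tracks dielectric constant
for f, y, g, c in zip(freqs, Y, G, C):
    print(f"{f:8.1f} Hz  |Y*| = {abs(y):.3e} S   G = {g:.3e} S   C = {c:.3e} F")
```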

Keywords: composite manufacture, dielectrometry, epoxy, resin infusion, wind turbine blades

Procedia PDF Downloads 166
203 Consumers and Voters’ Choice: Two Different Contexts with a Powerful Behavioural Parallel

Authors: Valentina Dolmova

Abstract:

What consumers choose to buy and whom voters select on election day are two questions that have captivated the interest of both academics and practitioners for many decades. The importance of understanding what influences the behavior of these groups, and whether or not we can predict or control it, fuels a steady stream of research in a range of fields. In the past 40 years alone, more than 70 thousand scientific papers have been published in each field – consumer behavior and political psychology, respectively. From marketing, economics, and the science of persuasion to political and cognitive psychology, we have all remained heavily engaged. Ever-evolving technology, inevitable socio-cultural shifts, global economic conditions, and much more play an important role in choice equations regardless of context. On one hand, this makes the research efforts always relevant and needed. On the other, the relatively low number of cross-field collaborations, which seem to be picking up only in more recent years, leaves the existing findings isolated in framed bubbles. By performing systematic research across both areas of psychology and building a parallel between theories and factors of influence, however, we find not only that there is a definitive common ground between the behaviors of consumers and voters but also that we are moving towards a global model of choice. This means that the lines between contexts are fading, which has a direct implication for what we should focus on when predicting or navigating buyers' and voters' behavior. Internal and external factors in four main categories determine the choices we make as consumers and as voters. Together, personal, psychological, social, and cultural factors create a holistic framework through which all stimuli in relation to a particular product or political party are filtered. The analogy “consumer-voter” solidifies further. Leading academics suggest that this fundamental parallel is the key to successfully managing political and consumer brands alike. However, we distinguish four additional key stimuli that relate to those factor categories (1/ opportunity costs; 2/ the memory of the past; 3/ recognisable figures/faces; and 4/ conflict), arguing that the level of expertise a person has determines the prevalence of particular factors or specific stimuli. Our efforts take into account global trends such as the establishment of “celebrity politics” and the image of “ethically concerned consumer brands”, which bridge the gap between contexts to an even greater extent. Scientists and practitioners are pushed to accept the transformative nature of both fields in social psychology. Existing blind spots, as well as the limited number of studies conducted outside American and European societies, open up space for more collaborative efforts in this highly demanding and lucrative field. A mixed method of research tests three main hypotheses: the first two focus on the irrelevance of context when comparing voting and consumer behavior, both through the factor lens and through the stimulus lens, and the third on determining whether or not the level of expertise in a field skews the weight of the prism we are more likely to choose when evaluating options.

Keywords: buyers’ behaviour, decision-making, voters’ behaviour, social psychology

Procedia PDF Downloads 154
202 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. The hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes and lithotripters) and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was the open database of the statistical service Eurostat. The choice of the source is due to the fact that the database contains complete and open information necessary for research tasks in the field of public health. In addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study provides information on 28 European countries for the period from 2007 to 2016. For all countries included in the study, with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and the interpretability of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration in order to identify groups of similar countries and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering. The k-medoids clustering algorithm was used. The sampled objects were used as the centers of the clusters obtained, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values obtained from the developed models have a relatively low level of error and can be used to make decisions on the provision of the hospital with medical personnel. The research reveals strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has huge potential, which makes it possible to significantly improve health services. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
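A compact sketch of the clustering step, under the assumption of a Euclidean distance between country time series, is shown below; the data are random placeholders for the Eurostat panel, and the small k-medoids routine stands in for whatever implementation the authors used.

```python
# k-medoids on a precomputed distance matrix of country time series, with the
# number of clusters chosen by the silhouette coefficient.
import numpy as np
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(42)
series = rng.normal(size=(28, 10))            # 28 countries x 10 yearly observations
D = np.linalg.norm(series[:, None, :] - series[None, :, :], axis=-1)   # distances

def k_medoids(D, k, n_iter=50):
    medoids = rng.choice(len(D), size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)          # assign to nearest medoid
        new = np.array([
            np.where(labels == j)[0][
                np.argmin(D[np.ix_(labels == j, labels == j)].sum(axis=1))
            ]
            for j in range(k)                              # best-placed member per cluster
        ])
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels

best_k = max(range(2, 7),
             key=lambda k: silhouette_score(D, k_medoids(D, k), metric="precomputed"))
print("number of clusters chosen by silhouette:", best_k)
```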

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 144
201 Developing Effective Strategies to Reduce HIV, AIDS and Sexually Transmitted Infections, Nakuru, Kenya

Authors: Brian Bacia, Esther Githaiga, Teresia Kabucho, Paul Moses Ndegwa, Lucy Gichohi

Abstract:

Purpose: The aim of the study is to ensure an appropriate mix of evidence-based prevention strategies geared towards the reduction of new HIV infections and the incidence of sexually transmitted illnesses. Background: In Nakuru County, more than 90% of all HIV-infected patients are adults on single-dose medication, one pill that contains a combination of several different HIV drugs. Nakuru town has been identified as the hardest hit by HIV/AIDS in the county according to the latest statistics from the County AIDS and STI group, with a prevalence rate of 5.7 percent attributed to the high population and an active urban center. Method: Two key studies were carried out to provide evidence for the effectiveness of antiretroviral therapy (ART), when used optimally, in preventing the sexual transmission of HIV. Discussions based on an examination and assessment of successes in planning, program implementation, and the ultimate impact of prevention and treatment were undertaken involving health managers, health workers, community health workers, and people living with HIV/AIDS between February and August 2021. Questionnaires were administered by two staff members trained in ethical procedures at 15 HIV treatment clinics, targeting patients on ARVs and caregivers, covering ARV prevention and the treatment of pediatric HIV infection. Findings: Levels of AIDS awareness are extremely high. Advances in HIV treatment have led to an enhanced understanding of the virus, improved care of patients, and control of the spread of drug-resistant HIV. There has been a tremendous increase in the number of people living with HIV who have access to life-long antiretroviral drugs (ARVs), mostly generic medicines. Healthcare facilities providing treatment are overstretched, which challenges the administration of the drugs that require a clinical setting. Women find it difficult to take a daily pill, which reduces the effectiveness of the medicine. ART adherence can be strengthened largely through the use of innovative digital technology. The case management approach is useful in resource-limited settings. The county has made tremendous progress in reducing mother-to-child transmission through enhanced early antenatal care (ANC) attendance and the mapping of pregnant women. Recommendations: Treatment reduces the risk of transmission to the child during pregnancy, labor, and delivery. Promote research on medicines through patient and community engagement. Reduce the risk of transmission through breastfeeding. Enhance testing strategies and strengthen health systems for sustainable HIV service delivery. A need exists for improved antenatal care and delivery by skilled birth attendants. Develop a comprehensive maternal reproductive health policy covering equitable, efficient and effective delivery of services. Put referral systems in place.

Keywords: evidence-based prevention strategies, service delivery, human management, integrated approach

Procedia PDF Downloads 88
200 Treatment Process of Sludge from Leachate with an Activated Sludge System and Extended Aeration System

Authors: A. Chávez, A. Rodríguez, F. Pinzón

Abstract:

Society is concerned about the environmental, economic and social impacts generated by solid waste disposal. Disposal sites, also known as landfills, are locations where the problems of pollution and damage to human health are reduced: they are technically designed and operated using engineering principles, storing the residue in a small area, compacting it to reduce volume and covering it with soil layers, while preventing problems from the liquid (leachate) and gases produced by the decomposition of organic matter. Despite planning and site selection for disposal and the monitoring and control of the selected processes, the dilemma of the leachate remains: its extreme concentration of pollutants devastates soil, flora and fauna, and these aggressive processes require priority attention. One biological technology is the activated sludge system, used for influents with high pollutant loads, since it transforms biodegradable dissolved and particulate matter into CO2, H2O and sludge; transforms suspended and non-settleable solids; changes nutrients such as nitrogen and phosphorus; and degrades heavy metals. The microorganisms that remove organic matter in these processes are in general facultative heterotrophic bacteria, forming heterogeneous populations. It is also possible to find unicellular fungi, algae, protozoa and rotifers, which process the organic carbon source and oxygen, as well as the nitrogen and phosphorus that are vital for cell synthesis. The mixture of the substrate, in this case sludge leachate, molasses and wastewater, is kept aerated by mechanical aeration diffusers, considering that the biological processes remove dissolved material (< 45 microns) and generate biomass that is easily obtained by decantation. The design consists of an artificial support and aeration pumps, favoring the development of (denitrifying) microorganisms that use the oxygen bound in nitrate, resulting in nitrogen (N) in the gas phase and thus avoiding the negative effects of the presence of ammonia or phosphorus. Overall, the activated sludge system involves a hydraulic retention time of about 8 hours, which does not prevent the demand for nitrification, which occurs on average at an MLSS value of 3,000 mg/L. Extended aeration works with detention times greater than 24 hours, a ratio of organic load to biomass inventory under 0.1, and an average residence time (sludge age) of more than 8 days. This project developed a pilot system with sludge leachate from the Doña Juana landfill (RSDJ), located in Bogota, Colombia, where the leachate is subjected to an activated sludge and extended aeration process in a sequencing batch reactor (SBR) so that the effluent can be discharged into water bodies without causing ecological collapse. The system worked with a residence time of 8 days and a 30 L capacity, removing more than 90% of BOD and COD from initial values of 1720 mg/L and 6500 mg/L, respectively. By promoting deliberate nitrification, the commercial use of diffused aeration systems for sludge leachate from landfills is expected to become possible.
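The reported removal performance can be checked with the usual efficiency formula; the effluent concentrations in the sketch below are illustrative values consistent with the >90% removal stated above.

```python
# Removal efficiency = (influent - effluent) / influent. Influent BOD and COD come
# from the abstract; the effluent values are assumed for illustration.
def removal_efficiency(influent_mg_l, effluent_mg_l):
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

print(f"BOD removal: {removal_efficiency(1720, 150):.1f} %")   # ~91 %
print(f"COD removal: {removal_efficiency(6500, 520):.1f} %")   # ~92 %
```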

Keywords: sludge, landfill, leachate, SBR

Procedia PDF Downloads 272
199 Using Business Simulations and Game-Based Learning for Enterprise Resource Planning Implementation Training

Authors: Carin Chuang, Kuan-Chou Chen

Abstract:

An Enterprise Resource Planning (ERP) system is an integrated information system that supports the seamless integration of all the business processes of a company. Implementing an ERP system can increase efficiency and decrease costs while helping to improve productivity. Many organizations, including large, medium and small-sized companies, have already adopted ERP systems, in some cases for decades. Although an ERP system can bring competitive advantages to organizations, the lack of a proper training approach in ERP implementation is still a major concern. Organizations understand the importance of ERP training to adequately prepare managers and users. The low return on investment of ERP training, however, makes it difficult for knowledge workers to transfer what is learned in training to their jobs in the workplace. Inadequate and inefficient ERP training limits the value realization and success of an ERP system. Hence there is a need for profound change and innovation in ERP training, both in the workplace in industry and in Information Systems (IS) education in academia. An innovative ERP training approach can improve users' knowledge of business processes and their hands-on skills in mastering an ERP system. It can also serve as educational material for IS students in universities. The purpose of this study is to examine the use of ERP simulation games via the ERPsim system to train IS students in learning ERP implementation. ERPsim is a business simulation game developed by the ERPsim Lab at HEC Montréal, and the game runs on a real-life SAP (Systems Applications and Products) ERP system. The training uses the ERPsim system as the tool for Internet-based simulation games and is designed around online student competitions during class. The competitions involve student teams, with the facilitation of the instructor, and put the students' business skills to the test via intensive simulation games on a real-world SAP ERP system. The teams run the full business cycle of a manufacturing company while interacting with suppliers, vendors, and customers through sending and receiving orders, delivering products and completing the entire cash-to-cash cycle. To learn a range of business skills, each student needs to adopt an individual business role and make business decisions around the products and business processes. Based on the training experiences learned from rounds of business simulations, the findings show that learners face a reduced risk when making mistakes, which helps them build self-confidence in problem-solving. In addition, the learners' reflections on their mistakes can uncover the root causes of the problems and further improve the efficiency of the training. ERP instructors teaching with the innovative approach report significant improvements in student evaluations, learner motivation, attendance and engagement, as well as increased learner technology competency. The findings of the study can provide ERP instructors with guidelines to create an effective learning environment and can be transferred to a variety of other educational fields in which trainers are migrating towards a more active learning approach.

Keywords: business simulations, ERP implementation training, ERPsim, game-based learning, instructional strategy, training innovation

Procedia PDF Downloads 139
198 Novel Numerical Technique for Dusty Plasma Dynamics (Yukawa Liquids): Microfluidic and Role of Heat Transport

Authors: Aamir Shahzad, Mao-Gang He

Abstract:

Dusty plasmas have recently attracted widespread interest among researchers. Over the last two decades, substantial efforts have been made by the scientific and technological community to investigate the transport properties, and their nonlinear behavior, of three-dimensional and two-dimensional nonideal complex (dusty plasma) liquids (NICDPLs). Various calculations have been made to sustain and utilize strongly coupled NICDPLs because of their remarkable scientific and industrial applications. Understanding the thermophysical properties of complex liquids under various conditions is of practical interest in the field of science and technology. The determination of thermal conductivity is also a demanding question for thermophysical researchers, as, for several reasons, very few results are available for this significant property. The lack of information on the thermal conductivity of dense and complex liquids at the parameters relevant to industrial developments is a major barrier to quantitative knowledge of the heat flux flowing from one medium to another medium or surface. The exact numerical investigation of the transport properties of complex liquids is a fundamental research task in the field of thermophysics, as various transport data are closely related to the setup and confirmation of equations of state. Reliable knowledge of transport data is also important for the optimized design of processes and apparatus in various engineering and science fields (thermoelectric devices); in particular, the provision of precise data for the parameters of heat, mass, and momentum transport is required. One of the promising computational techniques, homogeneous nonequilibrium molecular dynamics (HNEMD) simulation, is overviewed here with special emphasis on its application to transport problems of complex liquids. The proposed work is particularly motivated by being, to our knowledge, the first to recast the heat conduction problem into an algorithm based on polynomial velocity and temperature profiles for the investigation of transport properties, and their nonlinear behavior, in NICDPLs. The aim of the proposed work is to implement a NEMD simulation (Poiseuille flow) algorithm and to deepen the understanding of thermal conductivity behavior in Yukawa liquids. The Yukawa system is equilibrated through a Gaussian thermostat in order to maintain a constant system temperature (canonical ensemble ≡ NVT). The output steps will be developed between 3.0×10⁵/ωp and 1.5×10⁵/ωp simulation time steps for the computation of the λ data. The HNEMD algorithm shows that the thermal conductivity depends on the plasma parameters and that the position of the minimum, λmin, shifts toward higher Γ with an increase in κ, as expected. The new investigations give more reliable simulated data for the plasma conductivity than earlier known simulation data, and the plasma λ0 generally differs from earlier values by 2%-20%, depending on Γ and κ. It has been shown that the results obtained at the normalized force field are in satisfactory agreement with various earlier simulation results. This algorithm shows that the new technique provides more accurate results, with fast convergence and small size effects, over a wide range of plasma states.
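For readers unfamiliar with the dimensionless state point of Yukawa liquids, the sketch below computes the coupling parameter Γ, the screening parameter κ and the dust plasma frequency ωp from assumed, order-of-magnitude dusty-plasma inputs; none of the values are taken from the study.

```python
# Dimensionless state point of a Yukawa (dusty plasma) liquid from assumed inputs:
# coupling Gamma, screening kappa, and the dust plasma frequency that sets the
# natural time unit 1/omega_p used for the simulation output steps.
import math

e    = 1.602e-19          # elementary charge (C)
eps0 = 8.854e-12          # vacuum permittivity (F/m)
kB   = 1.381e-23          # Boltzmann constant (J/K)

# Assumed dust parameters: charge number, number density (m^-3), mass (kg), temperature (K)
Z, n_d, m_d, T_d = 5000, 1.0e10, 1.0e-13, 300.0
lambda_D = 5.0e-4                                     # assumed Debye length (m)

a  = (3.0 / (4.0 * math.pi * n_d)) ** (1.0 / 3.0)     # Wigner-Seitz radius
Q  = Z * e
Gamma   = Q**2 / (4.0 * math.pi * eps0 * a * kB * T_d)  # coupling parameter
kappa   = a / lambda_D                                  # screening parameter
omega_p = math.sqrt(Q**2 * n_d / (eps0 * m_d))          # dust plasma frequency

print(f"a = {a:.3e} m, Gamma = {Gamma:.0f}, kappa = {kappa:.2f}, omega_p = {omega_p:.1f} rad/s")
```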

Keywords: molecular dynamics simulation, thermal conductivity, nonideal complex plasma, Poiseuille flow

Procedia PDF Downloads 274
197 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level

Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown

Abstract:

‘Big data’ is a relatively new concept that describes data so large and complex that it exceeds the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data is viewed in Australia and internationally as highly sensitive. Strict ethical requirements must be met for the use of health data to support health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may have the impact of reducing the capacity of health data to be incorporated into the real-time demands of the Big Data environment. This ‘big data revolution’ is increasingly supported by national governments, who have invested significant funds into initiatives designed to develop and capitalize on big data and methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides essential record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names, to overcome important legal and privacy constraints. This paper will present the findings of the first ‘Proof of Concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for the planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations.
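The abstract does not detail the privacy-preserving technique; the sketch below is a generic illustration of one widely used approach, Bloom-filter encoding of name bigrams compared with a Dice similarity, so that records can be matched without exchanging clear-text names. It is not the Centre for Data Linkage's actual implementation.

```python
# Encode names as Bloom filters of character bigrams and compare the encodings with
# a Dice coefficient, so similar names score highly without revealing the names.
import hashlib

BITS, HASHES = 256, 10

def bloom_encode(name, salt="shared-secret"):
    bits = set()
    padded = f"_{name.lower()}_"
    for bigram in (padded[i:i + 2] for i in range(len(padded) - 1)):
        for k in range(HASHES):                      # k independent hash functions
            digest = hashlib.sha256(f"{salt}{k}{bigram}".encode()).hexdigest()
            bits.add(int(digest, 16) % BITS)
    return bits

def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b))

print("smith vs smyth:", round(dice(bloom_encode("smith"), bloom_encode("smyth")), 2))
print("smith vs jones:", round(dice(bloom_encode("smith"), bloom_encode("jones")), 2))
```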

Keywords: data integration, data linkage, health planning, health services research

Procedia PDF Downloads 216
196 Application of Infrared Thermal Imaging, Eye Tracking and Behavioral Analysis for Deception Detection

Authors: Petra Hypšová, Martin Seitl

Abstract:

One of the challenges of forensic psychology is to detect deception during a face-to-face interview. In addition to the classical approaches of monitoring the utterance and its components, detection is also sought by observing behavioral and physiological changes that occur as a result of the increased emotional and cognitive load caused by the production of distorted information. Typical changes include facial temperature, eye movements and their fixation, pupil dilation, emotional micro-expressions, and heart rate and its variability. Expanding technological capabilities have opened the space to detect these psychophysiological changes and behavioral manifestations through non-contact technologies that do not interfere with face-to-face interaction. Non-contact deception detection methodology is still in development, and there is a lack of studies that combine multiple non-contact technologies to investigate their accuracy, as well as studies that show how different types of lies told to different interviewers affect physiological and behavioral changes. The main objective of this study is to apply a specific non-contact technology for deception detection. The next objective is to investigate scenarios in which non-contact deception detection is possible. A series of psychophysiological experiments using infrared thermal imaging, eye tracking and behavioral analysis with FaceReader 9.0 software was used to achieve our goals. In the laboratory experiment, 16 adults (12 women, 4 men) between 18 and 35 years of age (SD = 4.42) were instructed to produce alternating prepared and spontaneous truths and lies. The baseline of each proband was also measured, and its results were compared to the experimental conditions. Because the personality of the examiner (particularly gender and facial appearance) to whom the subject is lying can influence physiological and behavioral changes, the experiment included four different interviewers. The interviewer was represented by a photograph of a face that met the required parameters in terms of gender and facial appearance (i.e., interviewer likability/antipathy) to follow standardized procedures. The subject provided all information to the simulated interviewer. During follow-up analyses, facial temperature (main ROIs: forehead, cheeks, the tip of the nose, chin, and corners of the eyes), heart rate, emotional expression, and the intensity and fixation of eye movements and pupil dilation were observed. The results showed that the variables studied varied with respect to the production of prepared truths and lies versus the production of spontaneous truths and lies, as well as with the variability of the simulated interviewer. The results also supported the assumption of variability in physiological and behavioral values between the subject's resting state, the so-called baseline, and the production of prepared and spontaneous truths and lies. The series of psychophysiological experiments provided evidence of variability in the areas of interest in the production of truths and lies to different interviewers. The combination of technologies used also led to a comprehensive assessment of the physiological and behavioral changes associated with false and true statements. The study presented here opens the space for further research in the field of lie detection with non-contact technologies.
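One of the follow-up comparisons described above (baseline versus lie-production values per participant) could be run as a simple paired test; the sketch below uses synthetic nose-tip temperatures and is not the authors' analysis pipeline.

```python
# Paired comparison of nose-tip ROI temperature at baseline versus during lie
# production, per participant, using a paired t-test on synthetic data.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
baseline_temp = rng.normal(34.5, 0.4, size=16)                # 16 participants, °C
lying_temp = baseline_temp - rng.normal(0.3, 0.15, size=16)   # assumed cooling while lying

stat, p = ttest_rel(baseline_temp, lying_temp)
print(f"paired t = {stat:.2f}, p = {p:.4f}, "
      f"mean change = {np.mean(lying_temp - baseline_temp):.2f} °C")
```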

Keywords: emotional expression decoding, eye-tracking, functional infrared thermal imaging, non-contact deception detection, psychophysiological experiment

Procedia PDF Downloads 99
195 “Laws Drifting Off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology

Authors: Amarendar Reddy Addula

Abstract:

Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Specific applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is an important medium for digital business, according to a new report by Gartner. The last 10 years represent an advanced period in AI's development, spurred by a confluence of factors, including the rise of big data, advancements in compute infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. AI is extending to a broader set of use cases and users and is gaining popularity because this improves AI's versatility, efficiency, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is an umbrella term for making appropriate business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and objects, but it can also be used for code generation, creating synthetic data, and designing pharmaceuticals and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects. Frequently, the two are mixed and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, operational, concerns its relationship with the law. Both set up models of social behavior, but they are different in scope and nature. The juridical analysis is grounded in a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step towards the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or diverse nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI.

Keywords: artificial intelligence, ethics & human rights issues, laws, international laws

Procedia PDF Downloads 95