Search results for: geospatial technology competency model
1403 DEMs: A Multivariate Comparison Approach
Authors: Juan Francisco Reinoso Gordo, Francisco Javier Ariza-López, José Rodríguez Avi, Domingo Barrera Rosillo
Abstract:
The evaluation of the quality of a data product is based on the comparison of the product with a reference of greater accuracy. In the case of DEM data products, quality assessment usually focuses on positional accuracy, and few studies consider other terrain characteristics, such as slope and orientation. The proposal made here consists of evaluating the similarity of two DEMs (a product and a reference) through the joint analysis of the distribution functions of the variables of interest, for example, elevations, slopes and orientations. This is a multivariate approach that focuses on distribution functions, not on single parameters such as mean values or dispersions (e.g. root mean squared error or variance), and is therefore considered more holistic. The use of the Kolmogorov-Smirnov test is proposed due to its non-parametric nature, since the distributions of the variables of interest cannot always be adequately modeled by parametric models (e.g. the normal distribution model). In addition, its application to the multivariate case is carried out jointly by means of a single test on the convolution of the distribution functions of the variables considered, which avoids the use of corrections such as Bonferroni when several statistical hypothesis tests are carried out together. In this work, two DEM products have been considered: DEM02, with a resolution of 2x2 meters, and DEM05, with a resolution of 5x5 meters, both generated by the National Geographic Institute of Spain. DEM02 is considered the reference and DEM05 the product to be evaluated. In addition, the derived slope and aspect models have been calculated by GIS operations on the two DEM datasets. Through sample simulation processes, the adequate behavior of the Kolmogorov-Smirnov statistical test has been verified when the null hypothesis is true, which allows calibrating the value of the statistic for the desired significance level (e.g. 5%).
Once the process has been calibrated, it can be applied to compare the similarity of different DEM datasets (e.g. DEM05 versus DEM02). In summary, an innovative alternative for the comparison of DEM datasets, based on a multivariate non-parametric perspective, has been proposed by means of a single Kolmogorov-Smirnov test. This new approach could be extended to other DEM features of interest (e.g. curvature) and to more than three variables.
Keywords: data quality, DEM, Kolmogorov-Smirnov test, multivariate DEM comparison
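As an illustration of the single-test idea described in this abstract, the following sketch (a hypothetical reconstruction, not the authors' code) computes a two-sample Kolmogorov-Smirnov statistic and applies it jointly to several standardized DEM variables via their sum, whose distribution is the convolution of the individual distributions:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the two empirical CDFs."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return np.max(np.abs(cdf_a - cdf_b))

def joint_ks(product_vars, reference_vars):
    """Single joint KS test on the sum of standardized variables
    (elevations, slopes, aspects, ...). The distribution of the sum
    corresponds to the convolution of the individual distributions,
    mirroring the paper's idea of one test instead of several
    Bonferroni-corrected ones."""
    def summed(variables):
        z = [(v - v.mean()) / v.std() for v in variables]
        return np.sum(z, axis=0)
    return ks_statistic(summed(product_vars), summed(reference_vars))
```

In practice the critical value for the chosen significance level (e.g. 5%) would be calibrated by simulation under the null hypothesis, as the abstract describes.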
Procedia PDF Downloads 118
1402 Engineering a Tumor Extracellular Matrix Towards an in vivo Mimicking 3D Tumor Microenvironment
Authors: Anna Cameron, Chunxia Zhao, Haofei Wang, Yun Liu, Guang Ze Yang
Abstract:
Since the first publication in 1775, cancer research has built a comprehensive understanding of how cellular components of the tumor niche promote disease development. However, only within the last decade has research begun to establish the impact of non-cellular components of the niche, particularly the extracellular matrix (ECM). The ECM, a three-dimensional scaffold that sustains the tumor microenvironment, plays a crucial role in disease progression. Cancer cells actively deregulate and remodel the ECM to establish a tumor-promoting environment. Recent work has highlighted the need to further our understanding of the complexity of this cancer-ECM relationship. In vitro models use hydrogels to mimic the ECM, as hydrogel matrices offer the biological compatibility and stability needed for long-term cell culture. However, natural hydrogels are used in these models as supplied, without tuning their biophysical characteristics to achieve pathophysiological relevance, thus limiting their broad use within cancer research. The biophysical attributes of these gels dictate cancer cell proliferation, invasion, metastasis, and therapeutic response. Evaluating the three most widely used natural hydrogels, Matrigel, collagen, and agarose gel, the permeability, stiffness, and pore size of each gel were measured and compared to the in vivo environment. The pore size of all three gels fell between 0.5-6 µm, which coincides with the 0.1-5 µm in vivo pore size found in the literature. However, the stiffness of hydrogels able to support cell culture ranged between 0.05 and 0.3 kPa, which falls outside the range of 0.3-20,000 kPa reported in the literature for the in vivo ECM. Permeability was ~100x greater than in vivo measurements, due in large part to the lack of cellular components which impede permeation.
These measurements are nonetheless important when assessing therapeutic particle delivery, as ECM permeability decreased with increasing particle size, with 100 nm particles exhibiting a fifth of the permeability of 10 nm particles. This work explores ways of adjusting the biophysical characteristics of hydrogels by changing protein concentration, and the trade-offs which occur due to the interdependence of these factors. The overall aim of this work is to produce a more pathophysiologically relevant model for each tumor type.
Keywords: cancer, extracellular matrix, hydrogel, microfluidic
Procedia PDF Downloads 94
1401 Knowledge Loss Risk Assessment for Departing Employees: An Exploratory Study
Authors: Muhammad Saleem Ullah Khan Sumbal, Eric Tsui, Ricky Cheong, Eric See To
Abstract:
Organizations are exposed to the threat of valuable knowledge loss when employees leave, whether due to retirement, resignation, job change, or death. Due to changing economic conditions, globalization, and an aging workforce, organizations face challenges regarding the retention of valuable knowledge. On the one hand, a large number of employees are going to retire; on the other hand, the younger generation does not want to work in one company for a long time, and there is an increasing trend of frequent job change among the new generation. Because of these factors, organizations need to make sure that they capture an employee's knowledge before (s)he walks out of the door. The first step in this process is to know what type of knowledge the employee possesses and whether this knowledge is important for the organization. The literature reveals that despite the serious consequences of knowledge loss in terms of organizational productivity and competitive advantage, there has not been much work done in the area of knowledge loss assessment of departing employees. An important step in the knowledge retention process is to determine the critical ‘at risk’ knowledge. Thus, knowledge loss risk assessment is a process by which organizations can gauge the importance of the knowledge of a departing employee. The purpose of this study is to explore knowledge loss risk assessment by conducting a qualitative study in the oil and gas sector. By engaging in dialogues with managers and executives of the organizations through in-depth interviews and adopting a grounded theory approach, the research explores: i) Are there any measures adopted by organizations to assess the risk of knowledge loss from departing employees? ii) Which factors are crucial for knowledge loss assessment in the organizations? iii) How can employees be prioritized for knowledge retention according to their criticality?
A grounded theory approach is used when little knowledge is available in the area under research; new knowledge about the topic is generated through in-depth exploration using methods such as interviews and a systematic approach to analyzing the data. The outcome of the study will be a model of knowledge loss risk based on factors such as the likelihood of knowledge loss, the consequence/impact of knowledge loss, and the quality of the knowledge lost with departing employees. Initial results show that knowledge loss assessment is quite crucial for organizations and helps in determining what types of knowledge employees possess, e.g. organizational knowledge, subject matter expertise, or relationship knowledge. Based on that, it can be assessed which employees are more important for the organization and how to prioritize the knowledge retention process for departing employees.
Keywords: knowledge loss, risk assessment, departing employees, Hong Kong organizations
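The risk model sketched in this abstract (likelihood, consequence/impact, and quality of knowledge loss) could be operationalized as a simple multiplicative score; the 1-5 scales and the multiplicative form below are illustrative assumptions, not the study's validated instrument:

```python
from dataclasses import dataclass

@dataclass
class DepartingEmployee:
    name: str
    likelihood: int  # 1-5: how likely the knowledge is to be lost (e.g. imminent retirement)
    impact: int      # 1-5: consequence for the organization if it is lost
    quality: int     # 1-5: how unique or undocumented the knowledge is

def risk_score(e: DepartingEmployee) -> int:
    # Hypothetical multiplicative score over the three factors named in the abstract.
    return e.likelihood * e.impact * e.quality

def retention_priority(employees):
    """Rank departing employees for knowledge retention, most critical first."""
    return sorted(employees, key=risk_score, reverse=True)
```

A score like this would let an organization triage whose knowledge (organizational, subject matter, or relationship knowledge) to capture first.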
Procedia PDF Downloads 412
1400 Quantitative Seismic Interpretation in the LP3D Concession, Central of the Sirte Basin, Libya
Authors: Tawfig Alghbaili
Abstract:
LP3D Field is located near the center of the Sirte Basin in the Marada Trough, approximately 215 km south of Marsa Al Braga City. The Marada Trough is bounded on the west by a major fault, which forms the edge of the Beda Platform, while on the east, a bounding fault marks the edge of the Zelten Platform. The main reservoir in the LP3D Field is the Upper Paleocene Beda Formation, mainly limestone interbedded with shale, with an average thickness of 117.5 feet. To develop a better understanding of the characterization and distribution of the Beda reservoir, quantitative seismic data interpretation has been done, and well log data were analyzed. Six reflectors corresponding to the tops of the Beda, Hagfa Shale, Gir, Kheir Shale, Khalifa Shale, and Zelten Formations were picked and mapped. Special attention was given to fault interpretation because of the structural complexity of the area. Different attribute analyses were done to build a better understanding of the lateral extension of structures and to obtain a clear image of the fault blocks. Time-to-depth conversion was computed using a velocity model generated from check shot and sonic data. A simplified stratigraphic cross-section was drawn through the wells A1, A2, A3, and A4-LP3D, and the distribution and thickness variations of the Beda reservoir along the study area have been demonstrated. Petrophysical analysis of wireline logs was also done, and cross plots of some petrophysical parameters were generated to evaluate the lithology of the reservoir interval. A structural and stratigraphic framework was designed and run to generate fault, facies, and petrophysical models and to calculate the reservoir volumetrics. This study concluded that the depth structure map of the Beda Formation shows the main structure in the area of study, which is a north-south faulted anticline.
Based on the Beda reservoir models, volumetrics for the base case have been calculated, giving a STOIIP of 41 MMSTB and recoverable oil of 10 MMSTB. Seismic attributes confirm the structural trend and build a better understanding of the fault system in the area.
Keywords: LP3D Field, Beda Formation, reservoir models, seismic attributes
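For readers unfamiliar with the volumetric figures quoted (STOIIP and recoverable oil), the standard volumetric calculation can be sketched as follows; the input values below are illustrative placeholders, not the LP3D Field's actual parameters:

```python
def stoiip_stb(area_acres, net_pay_ft, porosity, water_saturation, bo):
    """Volumetric stock-tank oil initially in place (STB):
    STOIIP = 7758 * A * h * phi * (1 - Sw) / Bo,
    where 7758 converts acre-feet to barrels and Bo is the
    oil formation volume factor (reservoir barrels per STB)."""
    return 7758.0 * area_acres * net_pay_ft * porosity * (1.0 - water_saturation) / bo

def recoverable_stb(stoiip, recovery_factor):
    """Recoverable oil = STOIIP multiplied by the recovery factor."""
    return stoiip * recovery_factor
```

With the abstract's figures (41 MMSTB in place, 10 MMSTB recoverable), the implied recovery factor is roughly 0.24.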
Procedia PDF Downloads 223
1399 A Land Use Decision-Making System to Stop Sprawl and Build Holistic, Organic Communities
Authors: Kirk Wickersham
Abstract:
Introduction: Sprawl has been built for the auto. This project anticipates the adoption of autonomous vehicle technology to both enable and require a new urban form, a modern version of the organic form humans have developed over the millennia. It proposes a radically new land use decision-making system to stop further sprawl and channel growth into these new communities. Methodology: For the past 80 years we have built sprawl and strip commercial development: intense commercial and multifamily development on the periphery, low-density housing in the center, repeated indefinitely across the landscape. Sprawl is designed to accommodate the auto, and we need an auto to live there. That will change. Within a decade, autonomous vehicles (AVs), and especially robotaxis, will replace human-driven vehicles (HDVs). These new vehicles will require a new transportation network that will both enable and require a new urban form. It will resemble the organic urban form developed over millennia: high-intensity uses in the center, surrounded by neighborhoods, with a defined outer boundary, a city limit. The project dubs this new community a HOME Town: Holistic, Organic, Market-driven, and Ergonomic. It will offer a better quality of life at a lower public and private cost. (While designing a transportation system primarily for alternative vehicles is not a requirement for creating a holistic, organic community, it is the main reason for the reduced cost of housing, transportation, and public services.) Sprawl is created by our existing land use decision-making system: local governments approving one incremental project at a time. To create these new communities, we will need a radically different system.
This means regional planning: eliminating development-by-right zoning and incremental development approvals, setting new standards for roadways and parking, selecting a lead developer, designating and master planning a new community site, channeling development into the new community, and providing equity for the landowners who would otherwise be left out of the process. This new process is based on and inspired by state regulation of oilfield development, called “unitization and pooling.” It is designed to fit within standard state land use enabling legislation, although the states vary in statutory language and case law. The specific implementation program will vary from one community to the next depending on opportunities and constraints to development and on legal and political acceptance. Major Findings: The problems of sprawl and strip commercial development are well known. The quality of life and efficiencies of a holistic, organic, ergonomic community have also been well known for centuries. Now, an integrated planning, legal, and regulatory process has been developed to replace sprawl with a 21st-century version of these communities. Conclusion: This project offers the opportunity to transform the urban landscape, and urban life, in the 21st century. The process is ready for implementation, and the author invites inquiries from developers and communities.
Keywords: autonomous vehicles, community, home town, land use decision-making system, quality of life, sprawl, strip commercial
Procedia PDF Downloads 7
1398 Hydrogeomatic System for the Economic Evaluation of Damage by Flooding in Mexico
Authors: Alondra Balbuena Medina, Carlos Diaz Delgado, Aleida Yadira Vilchis Fránces
Abstract:
In Mexico, news is disseminated each year about the ravages of floods: the total loss of housing, damage to fields, increases in food costs derived from harvest losses, coupled with health problems such as skin infections, in addition to social problems such as delinquency, damage to educational institutions, and harm to the population in general. Flooding is a consequence of heavy rains, tropical storms, or hurricanes that generate excess water in drainage systems that exceeds their capacity. In urban areas, heavy rains can be one of the main factors causing flooding, in addition to excessive precipitation, dam breakage, and human activities, for example, excessive garbage in the strainers. In agricultural areas, floods can damage large areas of cultivation. It should be mentioned that for both areas, one of the significant impacts of floods is that they can permanently affect the livelihoods of many families and cause damage, for example, to their workplaces such as farmland, commercial or industrial areas, and places where services are provided. In recent years, Information and Communication Technologies (ICT) have had an accelerated development, reflected in the exponential growth and evolution of innovation, giving as a result the daily generation of new technologies, updates, and applications. Innovation in the development of information technology applications has impacted all areas of human activity. It influences every order of individuals' lives, reconfiguring the way of perceiving and analyzing the world, for instance, in interrelating with people as individuals and as a society, in the economic, political, social, cultural, educational, and environmental spheres.
Therefore, the present work describes the creation of a system for calculating flood costs for housing areas, retail establishments, and agricultural areas of the Mexican Republic, based on the use and application of geoinformatic tools, useful to the public, education, and private sectors. To analyze hydrometeorological damage and make use of the obtained results, the geoinformatic tool was constructed from two different points of view: the geoinformatic (design and development of GIS software) and the methodology of flood damage validation, in order to integrate a tool that provides the user a monetary estimate of the effects caused by floods. With information from the period 2000-2014, the functionality of the application was corroborated. For the years 2000 to 2009, only the analysis of the agricultural and housing areas was carried out, incorporating information on commercial establishments for the period 2010-2014. The method proposed in this research project is a fundamental contribution to society, in addition to the tool itself. In summary, approaching the problems of the physical-geographical environment from the point of view of spatial analysis makes it possible to offer different solution alternatives and also to open up avenues for academia and research.
Keywords: floods, technological innovation, monetary estimation, spatial analysis
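The monetary estimate such a tool produces can be thought of, at its simplest, as a sum of affected area times a unit damage cost per land-use class; the cost figures and class names below are invented placeholders for illustration, not values from the system described:

```python
# Hypothetical unit damage costs (currency units per flooded hectare) per land use.
UNIT_COST = {
    "housing": 250_000.0,
    "commercial": 400_000.0,
    "agricultural": 30_000.0,
}

def flood_damage_estimate(affected_areas):
    """affected_areas: iterable of (land_use, flooded_hectares) pairs,
    e.g. as derived from a GIS overlay of the flood extent and land-use layers.
    Returns the total monetary damage estimate."""
    return sum(UNIT_COST[land_use] * hectares for land_use, hectares in affected_areas)
```

A real system would refine this with depth-damage curves and validated per-sector costs, which is what the flood damage validation methodology in the abstract addresses.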
Procedia PDF Downloads 226
1397 Incidence and Predictors of Mortality Among HIV Positive Children on ART in Public Hospitals of Harer Town, Enrolled From 2011 to 2021
Authors: Getahun Nigusie
Abstract:
Background: Antiretroviral treatment reduces HIV-related morbidity and prolongs the survival of patients; however, there is a lack of up-to-date information concerning the long-term effect of treatment on the survival of HIV-positive children, especially in the study area. Objective: To assess the incidence and predictors of mortality among HIV-positive children on ART in public hospitals of Harer town who were enrolled from 2011 to 2021. Methodology: An institution-based retrospective cohort study was conducted among 429 HIV-positive children enrolled in the ART clinic from January 1st, 2011 to December 30th, 2021. Data were collected from medical cards using a data extraction form. Descriptive analyses were used to summarize the results, and a life table was used to estimate survival probability at specific points in time after the introduction of ART. The Kaplan-Meier survival curve together with the log-rank test was used to compare survival between different categories of covariates, and a multivariable Cox proportional hazards regression model was used to estimate adjusted hazard ratios. Variables with p-values ≤0.25 in the bivariable analysis were candidates for the multivariable analysis. Finally, variables with p-values < 0.05 were considered significant. Results: The study participants were followed for a total of 2549.6 child-years (30596 child-months), with an overall mortality rate of 1.5 (95% CI: 1.1, 2.04) per 100 child-years. Their median survival time was 112 months (95% CI: 101-117). There were 38 children with unknown outcome, 39 deaths, and 55 children transferred out to different facilities. The overall survival at 6, 12, 24, and 48 months was 98%, 96%, 95%, and 94%, respectively.
Being in WHO clinical stage four (AHR=4.55, 95% CI: 1.36, 15.24), having anemia (AHR=2.56, 95% CI: 1.11, 5.93), low baseline absolute CD4 count (AHR=2.95, 95% CI: 1.22, 7.12), stunting (AHR=4.1, 95% CI: 1.11, 15.42), wasting (AHR=4.93, 95% CI: 1.31, 18.76), poor adherence to treatment (AHR=3.37, 95% CI: 1.25, 9.11), having TB infection at enrollment (AHR=3.26, 95% CI: 1.25, 8.49), and no history of regimen change (AHR=7.1, 95% CI: 2.74, 18.24) were independent predictors of death. Conclusion: More than half of deaths occurred within 2 years. Prevalent tuberculosis, anemia, wasting and stunting nutritional status, socioeconomic factors, and baseline opportunistic infection were independent predictors of death. Increased early screening and management of those predictors are required.
Keywords: human immunodeficiency virus-positive children, anti-retroviral therapy, survival, Ethiopia
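The headline figures in this abstract (a mortality rate per 100 child-years and a survival curve) follow from standard survival-analysis arithmetic, sketched below with made-up follow-up data rather than the study's records:

```python
def mortality_rate_per_100py(deaths, person_years):
    """Incidence rate: deaths per 100 person-years of follow-up."""
    return 100.0 * deaths / person_years

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up durations; events: 1 = death, 0 = censored.
    Returns (time, survival probability) pairs at each death time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    for i in order:
        if events[i] == 1:
            survival *= (at_risk - 1) / at_risk  # S(t) drops by d/n at each death
            curve.append((times[i], survival))
        at_risk -= 1  # censored and dead children both leave the risk set
    return curve
```

The abstract's overall rate is reproduced by `mortality_rate_per_100py(39, 2549.6)`, about 1.53 per 100 child-years.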
Procedia PDF Downloads 28
1396 Investigation of Wind Farm Interaction with Ethiopian Electric Power’s Grid: A Case Study at Ashegoda Wind Farm
Authors: Fikremariam Beyene, Getachew Bekele
Abstract:
Ethiopia is currently on the move with various projects to raise the amount of power generated in the country. The progress observed in recent years indicates this fact clearly and indisputably. The rural electrification program, the modernization of the power transmission system, and the development of wind farms are some of the main accomplishments worth mentioning. As is well known, wind power is currently embraced globally as one of the most important sources of energy, mainly for its environmentally friendly characteristics and because, once installed, it is available free of charge. However, integration of a wind power plant with an existing network has many challenges that need to be given serious attention. In Ethiopia, a number of wind farms are either installed or under construction, and a series of wind farms is planned for the near future. The Ashegoda wind farm (13.2°, 39.6°), which is the subject of this study, is the first large-scale wind farm under construction, with a capacity of 120 MW. The first phase (30 MW) of the 120 MW project has been completed and is expected to be connected to the grid soon. This paper is concerned with the investigation of the wind farm's interaction with the national grid under transient operating conditions. The main concern is the fault ride-through (FRT) capability of the system when the grid voltage drops to exceedingly low values because of a short circuit fault, and also the active and reactive power behavior of the wind turbines after the fault is cleared. On the wind turbine side, detailed dynamic modelling of a 1 MW variable-speed wind turbine running with a squirrel cage induction generator and full-scale power electronics converters is done and analyzed using the simulation software DIgSILENT PowerFactory. On the Ethiopian Electric Power Corporation side, after collecting sufficient data for the analysis, the grid network is modeled.
In the model, the fault ride-through (FRT) capability of the plant is studied by applying a 3-phase short circuit on the grid terminal near the wind farm. The results show that the Ashegoda wind farm can ride through a deep voltage dip within a short time, and the active and reactive power performance of the wind farm is also promising.
Keywords: squirrel cage induction generator, active and reactive power, DIgSILENT PowerFactory, fault ride-through capability, 3-phase short circuit
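The fault ride-through requirement being tested can be pictured as a voltage-versus-time envelope that the turbine terminal voltage must stay above during and after the fault; the envelope below is a generic, made-up grid-code shape for illustration, not the Ethiopian grid code or the paper's simulation:

```python
def frt_limit(t_ms):
    """Hypothetical low-voltage ride-through envelope (per-unit voltage
    the turbine must tolerate without tripping), loosely shaped like
    typical grid-code curves: a deep dip allowed briefly, then a linear
    recovery back to near-nominal voltage."""
    if t_ms <= 150:
        return 0.0  # must survive a dip to 0 pu for the first 150 ms
    if t_ms <= 1500:
        # linear recovery from 0 pu at 150 ms to 0.9 pu at 1500 ms
        return 0.9 * (t_ms - 150) / (1500 - 150)
    return 0.9

def rides_through(profile):
    """profile: list of (t_ms, voltage_pu) samples at the turbine terminals.
    The turbine may stay connected only if the voltage never falls below
    the envelope at the sampled instants."""
    return all(v >= frt_limit(t) for t, v in profile)
```

A simulated short-circuit voltage profile from a tool such as DIgSILENT PowerFactory could be checked against such an envelope point by point.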
Procedia PDF Downloads 179
1395 Inhibition of Glutamate Carboxypeptidase Activity Protects Retinal Ganglionic Cell Death Induced by Ischemia-Reperfusion by Reducing the Astroglial Activation in Rat
Authors: Dugeree Otgongerel, Kyong Jin Cho, Yu-Han Kim, Sangmee Ahn Jo
Abstract:
Excessive activation of glutamate receptors is thought to be involved in retinal ganglion cell (RGC) death after ischemia-reperfusion damage. Glutamate carboxypeptidase II (GCPII) is an enzyme responsible for the synthesis of glutamate. Several studies have shown that inhibition of GCPII prevents or reduces cellular damage in brain diseases. Thus, in this study, we examined the expression of GCPII in the rat retina and the role of GCPII in acute high-IOP ischemia-reperfusion damage of the eye by using a GCPII inhibitor, 2-(phosphonomethyl)pentanedioic acid (2-PMPA). The animal model of ischemia-reperfusion was induced by raising the intraocular pressure for 60 min, followed by reperfusion for 3 days. Rats were randomly divided into four groups: intravitreal injection of 2-PMPA (11 or 110 ng per eye) or PBS after ischemia-reperfusion, 2-PMPA treatment without ischemia-reperfusion, and sham-operated normal control. GCPII immunoreactivity in the normal rat retina was detected weakly in the retinal nerve fiber layer (RNFL) and retinal ganglion cell layer (RGL), and strongly in the inner plexiform layer (IPL) and outer plexiform layer (OPL), which are co-stained with an anti-GFAP antibody, suggesting that GCPII is expressed mostly in Muller cells and astrocytes. Immunostaining with an anti-BRN antibody showed that ischemia-reperfusion caused RGC death (31.5%) and decreased retinal thickness in all layers of the damaged retina, but treatment with 2-PMPA twice, at 0 and 48 hours after reperfusion, blocked this retinal damage. The GCPII level in the RNFL was enhanced after ischemia-reperfusion but was blocked by 2-PMPA treatment. This result was confirmed by western blot analysis showing that the level of GCPII protein after ischemia-reperfusion increased 2.2-fold compared to control, but this increase was blocked almost completely by 110 ng 2-PMPA treatment.
Interestingly, GFAP immunoreactivity in the retina after ischemia-reperfusion followed by 2-PMPA treatment showed a pattern similar to GCPII: an increase after ischemia-reperfusion but a reduction to the normal level with 2-PMPA treatment. Our data demonstrate that the increase of GCPII protein level after ischemia-reperfusion injury is likely to cause glial activation and/or retinal cell death mediated by glutamate, and GCPII inhibitors may be useful in the treatment of retinal disorders in which glutamate excitotoxicity is pathogenic.
Keywords: glutamate carboxypeptidase II, glutamate excitotoxicity, ischemia-reperfusion, retinal ganglion cell
Procedia PDF Downloads 344
1394 Sound Source Localisation and Augmented Reality for On-Site Inspection of Prefabricated Building Components
Authors: Jacques Cuenca, Claudio Colangeli, Agnieszka Mroz, Karl Janssens, Gunther Riexinger, Antonio D'Antuono, Giuseppe Pandarese, Milena Martarelli, Gian Marco Revel, Carlos Barcena Martin
Abstract:
This study presents an on-site acoustic inspection methodology for the quality and performance evaluation of building components. The work focuses on global and detailed sound source localisation, by successively performing acoustic beamforming and sound intensity measurements. A portable experimental setup is developed, consisting of an omnidirectional broadband acoustic source, a microphone array, and a sound intensity probe. Three main acoustic indicators are of interest, namely the sound pressure distribution on the surface of components such as walls, windows and junctions, the three-dimensional sound intensity field in the vicinity of junctions, and the sound transmission loss of partitions. The measurement data is post-processed and converted into a three-dimensional numerical model of the acoustic indicators with the help of simultaneously acquired geolocation information. The three-dimensional acoustic indicators are then integrated into an augmented reality platform, superimposing them onto a real-time visualisation of the spatial environment. The methodology thus enables a measurement-supported inspection process of buildings and the correction of errors during construction and refurbishment. Two experimental validation cases are shown. The first consists of a laboratory measurement on a full-scale mockup of a room featuring a prefabricated panel. The panel is installed with controlled defects, such as missing insulation and joint sealing material. It is demonstrated that the combined acoustic and augmented reality tool is capable of identifying acoustic leakages caused by the building defects and of assisting in correcting them. The second validation case is performed on a prefabricated room at a near-completion stage in the factory. With the help of the measurement and visualisation tools, the homogeneity of the partition installation is evaluated and leakages from junctions and doors are identified.
Furthermore, the integration of acoustic indicators together with thermal and geometrical indicators via the augmented reality platform is shown.
Keywords: acoustic inspection, prefabricated building components, augmented reality, sound source localization
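The acoustic beamforming step mentioned in this abstract can be illustrated by a minimal delay-and-sum beamformer (an assumed textbook formulation, not the authors' processing chain; all names and geometry below are hypothetical):

```python
import numpy as np

def delay_and_sum(signals, mic_positions, direction, fs, c=343.0):
    """Steer a microphone array toward a plane wave from unit vector `direction`.
    signals: array of shape (n_mics, n_samples);
    mic_positions: (n_mics, n_dims) in meters; fs: sample rate in Hz;
    c: speed of sound in m/s.
    Each channel is advanced by its relative propagation delay (rounded to
    whole samples), then the aligned channels are averaged: signals arriving
    from `direction` add coherently, others are attenuated."""
    delays = mic_positions @ np.asarray(direction) / c  # seconds, relative
    delays = delays - delays.min()
    n = signals.shape[1]
    out = np.zeros(n)
    for channel, d in zip(signals, delays):
        shift = int(round(d * fs))
        out[: n - shift] += channel[shift:]
    return out / len(signals)
```

Scanning `direction` over a grid and mapping the output power onto the component surface is, in essence, how a beamforming leakage map is produced.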
Procedia PDF Downloads 390
1393 The Regulation of Reputational Information in the Sharing Economy
Authors: Emre Bayamlıoğlu
Abstract:
This paper aims to provide an account of the legal and regulative aspects of algorithmic reputation systems, with a special emphasis on the sharing economy (e.g., Uber, Airbnb, Lyft) business model. The first section starts with an analysis of the legal and commercial nature of the tripartite relationship among the parties, namely, the host platform, the individual sharers/service providers, and the consumers/users. The section further examines to what extent an algorithmic system of reputational information could serve as an alternative to legal regulation. Shortcomings are explained and analyzed with specific examples from the Airbnb platform, a pioneering success in the sharing economy. The following section focuses on the issue of governance and control of reputational information. It first analyzes the legal consequences of algorithmic filtering systems used to detect undesired comments, and how a delicate balance could be struck between competing interests such as freedom of speech, privacy, and the integrity of commercial reputation. The third section deals with the problem of manipulation by users. Indeed, many sharing economy businesses employ techniques of data mining and natural language processing to verify the consistency of feedback. Software agents referred to as "bots" are employed by users to "produce" fake reputation values. Such automated techniques are deceptive, with significant negative effects, undermining the trust upon which the reputational system is built. A further section is devoted to exploring concerns with regard to data mobility, data ownership, and privacy. Reputational information provided by consumers in the form of textual comments may be regarded as writing eligible for copyright protection. Algorithmic reputational systems also contain personal data pertaining to both the individual entrepreneurs and the consumers.
The final section starts with an overview of the notion of reputation as a communitarian and collective form of referential trust, and provides an evaluation of the above legal arguments from the perspective of the public interest in the integrity of reputational information. The paper concludes with guidelines and design principles for algorithmic reputation systems to address the legal implications raised above.
Keywords: sharing economy, design principles of algorithmic regulation, reputational systems, personal data protection, privacy
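One common design principle against the bot-driven manipulation discussed in this abstract is a damped (Bayesian-average) rating; the sketch below is an assumed illustration of that technique, not a recommendation taken from the paper, and the prior values are arbitrary:

```python
def damped_rating(ratings, prior_mean=3.5, prior_weight=10):
    """Pull the displayed score toward a global prior so that a handful of
    fabricated ratings barely move it, while many independent ratings
    eventually dominate the prior."""
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + len(ratings))
```

Two fake five-star ratings lift the score only to 3.75 under these parameters, whereas a hundred genuine five-star ratings push it close to 4.9, which is the manipulation-resistance property such designs aim for.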
Procedia PDF Downloads 468
1392 A Study on the Measurement of Spatial Mismatch and the Influencing Factors of “Job-Housing” in Affordable Housing from the Perspective of Commuting
Authors: Daijun Chen
Abstract:
Affordable housing is subsidized by the government to meet the housing demand of low- and middle-income urban residents in the process of urbanization and to alleviate the housing inequality caused by market-based housing reforms. It is a recognized fact that the living conditions of the beneficiaries have improved through the construction of subsidized housing. However, affordable housing is mostly located in the suburbs, where the surrounding urban functions and infrastructure are incomplete, resulting in a spatial mismatch of “jobs-housing” in affordable housing. The main reason for this problem is that residents of affordable housing are more sensitive to the spatial location of their residence, but their ability to choose and control the housing location is relatively weak, which leads to higher commuting costs; their real cost of living has not been effectively reduced. In this regard, 92 subsidized housing communities in Nanjing, China, are selected as the research sample in this paper. The residents of the affordable housing and their commuting spatio-temporal behavior characteristics are identified based on LBS (location-based service) data. Based on spatial mismatch theory, spatial mismatch indicators such as commuting distance and commuting time are established to measure the degree of spatial mismatch of subsidized housing in different districts of Nanjing. Furthermore, a geographically weighted regression model is used to analyze the factors influencing the spatial mismatch of affordable housing, in terms of the provision of employment opportunities, traffic accessibility, and supporting service facilities, using spatial, functional, and other multi-source spatio-temporal big data. The results show that the spatial mismatch of affordable housing in Nanjing generally presents a “concentric circle” pattern, decreasing from the central urban area to the periphery.
The factors affecting the spatial mismatch of affordable housing differ across spatial zones. High mismatch is driven mainly by the number of enterprises within 1 km of the affordable housing district and the shortest distance to a subway station, while low mismatch is associated with a diversity of services and facilities. Based on this, a spatial optimization strategy for different levels of spatial mismatch in subsidized housing is proposed, along with feasible suggestions for the future site selection of subsidized housing. The aim is to avoid or mitigate the impact of "spatial mismatch," promote the "spatial adaptation" of "jobs-housing," and genuinely improve the overall welfare of affordable housing residents.
Keywords: affordable housing, spatial mismatch, commuting characteristics, spatial adaptation, welfare benefits
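The geographically weighted regression step described in the abstract can be sketched as a locally weighted least-squares fit in which each location receives its own coefficients. This is a minimal, generic GWR sketch, not the study's model: the Gaussian kernel, the fixed bandwidth, and the toy "mismatch vs. nearby-enterprise count" variables are all illustrative assumptions.

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Minimal geographically weighted regression sketch.

    For each location, observations are weighted by a Gaussian kernel
    of distance and a weighted least-squares fit is run, yielding
    location-specific intercepts and slopes.
    """
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])     # add intercept column
    betas = np.empty((n, X1.shape[1]))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian distance kernel
        W = np.diag(w)
        # weighted least squares: (X'WX)^-1 X'Wy, solved per location
        betas[i] = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)
    return betas

# toy example (invented data): mismatch index vs. nearby-enterprise count
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
enterprises = rng.uniform(0, 100, size=50)
mismatch = 5.0 - 0.03 * enterprises + rng.normal(0, 0.1, size=50)
B = gwr_coefficients(coords, enterprises.reshape(-1, 1), mismatch, bandwidth=3.0)
```

Because each row of `B` is a local fit, mapping the slope column reveals how the enterprise effect varies over space, which is exactly the kind of zone-by-zone variation the abstract reports.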
Procedia PDF Downloads 115
1391 PolyScan: Comprehending Human Polymicrobial Infections for Vector-Borne Disease Diagnostic Purposes
Authors: Kunal Garg, Louise Theusen Hermansan, Kanoktip Puttaraska, Oliver Hendricks, Heidi Pirttinen, Leona Gilbert
Abstract:
The Germ Theory (one infectious determinant equals one disease) has unarguably advanced our capability to diagnose and treat infectious diseases over the years. Nevertheless, the advent of technology, climate change, and volatile human behavior has brought about drastic changes in our environment, leading us to question the relevance of the Germ Theory today: will vector-borne disease (VBD) sufferers produce multiple immune responses when tested for multiple microbes? Patients with vector-borne disease producing multiple immune responses to different microbes would clearly suggest human polymicrobial infections (HPI). Current diagnostic tools have not kept pace with the research findings that would aid in diagnosing patients with polymicrobial infections. This shortcoming has caused misdiagnosis at very high rates, consequently diminishing patients' quality of life due to inadequate treatment. Equipped with state-of-the-art scientific knowledge, PolyScan intends to address the pitfalls in current VBD diagnostics. PolyScan is a multiplex, multifunctional enzyme-linked immunosorbent assay (ELISA) platform that can test for numerous VBD microbes and allows simultaneous screening for multiple types of antibodies. To validate PolyScan, Lyme borreliosis (LB) and spondyloarthritis (SpA) patient groups (n = 54 each) were tested for Borrelia burgdorferi, Borrelia burgdorferi round body (RB), Borrelia afzelii, Borrelia garinii, and Ehrlichia chaffeensis against IgM and IgG antibodies. LB serum samples were obtained from Germany and SpA serum samples from Denmark under relevant ethical approvals. The SpA group represented the chronic LB stage, because reactive arthritis (an SpA subtype) in the form of Lyme arthritis is linked to LB. It was hypothesized that patients from both groups would produce multiple immune responses, which would in turn suggest HPI. 
It was also hypothesized that the proportion of multiple immune responses in the SpA patient group would be significantly larger than in the LB patient group for both antibodies. It was observed that 26% of LB patients and 57% of SpA patients produced multiple immune responses, in contrast to 33% of LB patients and 30% of SpA patients who produced solitary immune responses when tested against IgM. Similarly, 52% of LB patients and an astounding 73% of SpA patients produced multiple immune responses, in contrast to 30% of LB patients and 8% of SpA patients who produced solitary immune responses when tested against IgG. Interestingly, IgM immune dysfunction was also recorded in both patient groups. Atypically, 6% of the 18% of LB patients unresponsive with the IgG antibody produced multiple immune responses with the IgM antibody. Similarly, 12% of the 19% of SpA patients unresponsive with the IgG antibody produced multiple immune responses with the IgM antibody. Thus, the results not only supported the hypotheses but also suggested that IgM may atypically persist longer than IgG. The PolyScan concept will aid clinicians in detecting early, persistent, late, polymicrobial, and immune dysfunction conditions linked to different VBDs. PolyScan provides a paradigm shift for the VBD diagnostic industry that will drastically shorten patients' time to receive adequate treatment.
Keywords: diagnostics, immune dysfunction, polymicrobial, TICK-TAG
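The solitary-versus-multiple classification behind the percentages above can be sketched as a simple tabulation over a patients × antigens positivity matrix for one antibody class. The data below are invented for illustration; the function is generic, not PolyScan's actual analysis pipeline.

```python
import numpy as np

def classify_responses(positive):
    """Classify each patient by number of positive antigen reactions.

    `positive` is a boolean (patients x antigens) matrix of ELISA
    results for one antibody class (e.g. IgM or IgG). Returns the
    proportions of unresponsive, solitary, and multiple responders.
    """
    counts = positive.sum(axis=1)
    n = len(counts)
    return {
        "unresponsive": float((counts == 0).sum()) / n,
        "solitary": float((counts == 1).sum()) / n,
        "multiple": float((counts >= 2).sum()) / n,
    }

# toy data: 10 patients x 5 antigens (e.g. the five microbes tested)
pos = np.array([
    [1, 1, 0, 0, 0], [0, 0, 0, 0, 0], [1, 0, 0, 0, 0], [1, 1, 1, 0, 0],
    [0, 1, 0, 0, 0], [0, 0, 0, 0, 0], [1, 0, 1, 0, 1], [0, 0, 1, 0, 0],
    [1, 1, 0, 1, 0], [0, 0, 0, 1, 0],
], dtype=bool)
props = classify_responses(pos)
```

Running the same tabulation separately on the IgM and IgG matrices, and cross-referencing the two, is what surfaces the atypical "IgG-unresponsive but IgM-multiple" cases the abstract reports.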
Procedia PDF Downloads 337
1390 Impact of Climate Change on Crop Production: Climate Resilient Agriculture Is the Need of the Hour
Authors: Deepak Loura
Abstract:
Climate change is considered one of the major environmental problems of the 21st century: a lasting change in the statistical distribution of weather patterns over periods ranging from decades to millions of years. Agriculture and climate change are closely interrelated, and the threat of a varying global climate has drawn great attention from scientists, as these variations negatively affect global crop production and compromise food security worldwide. The fast pace of development and industrialization and the indiscriminate destruction of the natural environment, more so in the last century, have altered the concentration of atmospheric gases that drive global warming. Carbon dioxide (CO₂), methane (CH₄), and nitrous oxide (N₂O) are important biogenic greenhouse gases (GHGs) from the agricultural sector contributing to global warming, and their concentrations are increasing alarmingly. Agricultural productivity can be affected by climate change in two ways: first, directly, through effects on plant growth, development, and yield due to changes in rainfall/precipitation, temperature, and/or CO₂ levels; and second, indirectly, through considerable impacts on agricultural land use due to snow melt, availability of irrigation, frequency and intensity of inter- and intra-seasonal droughts and floods, soil organic matter transformations, soil erosion, distribution and frequency of infestation by insect pests, diseases, or weeds, decline in arable area (due to submergence of coastal lands), and availability of energy. An increase in atmospheric CO₂ promotes the growth and productivity of C3 plants. On the other hand, an increase in temperature can reduce crop duration, increase crop respiration rates, alter the equilibrium between crops and pests, hasten nutrient mineralization in soils, decrease fertilizer-use efficiency, and increase evapotranspiration, among other effects. 
All of these could considerably affect crop yields in the long run. Climate resilient agriculture, consisting of adaptation, mitigation, and other agricultural practices, can potentially enhance the capacity of the system to withstand climate-related disturbances by resisting damage and recovering quickly. Climate resilient agriculture turns climate change threats into new business opportunities for the sector in different regions and therefore provides a triple win: mitigation, adaptation, and economic growth. Improving the soil organic carbon stock is integral to any strategy for adapting to and mitigating abrupt climate change, advancing food security, and improving the environment. Soil carbon sequestration is one of the major mitigation strategies for achieving climate-resilient agriculture. Climate-smart agriculture is the only way to lower the negative impact of climate variations on crop adaptation before it affects global crop production drastically. To cope with these extreme changes, future development needs to make adjustments in technology, management practices, and legislation. Adaptation and mitigation are twin approaches to bringing resilience to climate change in agriculture.
Keywords: climate change, global warming, crop production, climate resilient agriculture
Procedia PDF Downloads 78
1389 Impact of Land-Use and Climate Change on the Population Structure and Distribution Range of the Rare and Endangered Dracaena ombet and Dobera glabra in Northern Ethiopia
Authors: Emiru Birhane, Tesfay Gidey, Haftu Abrha, Abrha Brhan, Amanuel Zenebe, Girmay Gebresamuel, Florent Noulèkoun
Abstract:
Dracaena ombet and Dobera glabra are two of the rarest and most endangered tree species in dryland areas. Unfortunately, their sustainability is being compromised by various anthropogenic and natural factors, yet the impacts of ongoing land-use and climate change on the population structure and distribution of the species remain little explored. This study was carried out in the grazing lands and hillside areas of the Desa'a dry Afromontane forest, northern Ethiopia, to characterize the population structure of the species and predict the impact of climate change on their potential distributions. In each land-use type, abundance, diameter at breast height, and height of the trees were recorded using 70 sampling plots distributed over seven transects spaced one km apart. The geographic coordinates of each individual tree were also recorded. The results showed that the species' populations were characterized by low abundance and an unstable population structure, the latter evinced by a lack of seedlings and mature trees. The study also revealed that total abundance and dendrometric traits differed significantly between the two land uses: the hillside areas had a higher abundance of larger and taller trees than the grazing lands. Climate change predictions using the MaxEnt model highlighted that future temperature increases coupled with reduced precipitation would lead to significant reductions in the suitable habitats of the species in northern Ethiopia. Suitable habitat was predicted to decline by 48–83% for D. ombet and 35–87% for D. glabra. Hence, to sustain the species' populations, different strategies should be adopted, namely the introduction of alternative livelihoods (e.g., gathering non-timber forest products) to reduce overexploitation of the species for subsistence income, and the protection of current habitats that will remain suitable in the future using community-based exclosures. 
Additionally, preserving the species' seeds in gene banks is crucial to ensure their long-term conservation.
Keywords: grazing lands, hillside areas, land-use change, MaxEnt, range limitation, rare and endangered tree species
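The habitat-decline percentages quoted above are typically computed by thresholding MaxEnt-style suitability rasters for the current and future climates and comparing the counts of suitable cells. The sketch below is generic; the 0.5 threshold and the toy rasters are illustrative assumptions, not values from the study.

```python
import numpy as np

def suitable_area_decline(current, future, threshold=0.5):
    """Percent decline in suitable habitat between two suitability
    rasters (values in [0, 1]), using a fixed presence threshold."""
    cur = (current >= threshold).sum()   # suitable cells now
    fut = (future >= threshold).sum()    # suitable cells under future climate
    return 100.0 * (cur - fut) / cur

# toy rasters: a uniform suitability surface and a crudely degraded future
rng = np.random.default_rng(1)
current = rng.uniform(0, 1, size=(100, 100))
future = current * 0.6  # stand-in for warming-driven suitability loss
decline = suitable_area_decline(current, future)
```

With real data, `current` and `future` would be the MaxEnt logistic outputs projected onto present-day and future climate layers, and the threshold would usually come from an occurrence-based rule rather than a fixed 0.5.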
Procedia PDF Downloads 102
1388 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded systems-on-a-chip (SoCs) have come to contain coarse-granularity multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to a heterogeneous architecture coupling these three processors. The CPU and GPU offload some computationally intensive tasks from the FPGA to reduce resource consumption and lower the overall cost of the system. However, in common present-day scenarios, applications utilize only one type of accelerator, because development approaches supporting the collaboration of heterogeneous processors face challenges. What is needed, therefore, is a systematic approach that offers write-once-run-anywhere portability and high execution performance for modules mapped to various architectures, and that facilitates design-space exploration. In this paper, a servant-execution-flow model is proposed to abstract the cooperation of the heterogeneous processors; it supports task partition, communication, and synchronization. At first run, the intermediate language, represented as a data-flow diagram, can generate executable code for the target processor or be converted into high-level programming languages. Instantiation parameters efficiently control the relationship between modules and computational units, including the mapping of two hierarchical processing units and the adjustment of data-level parallelism. An embedded system for a three-dimensional waveform oscilloscope is selected as a case study, and the performance of algorithms such as contrast stretching is analyzed across implementations on various combinations of these processors. 
The experimental results show that the heterogeneous computing system achieves performance similar to a pure-FPGA implementation, and comparable energy efficiency, while using less than 35% of the resources.
Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
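Contrast stretching, one of the benchmark algorithms named in the abstract, linearly rescales pixel intensities so the observed range fills the full output range. This NumPy sketch is the generic min-max formulation, not the paper's FPGA implementation; the tiny 2×2 frame is invented for illustration.

```python
import numpy as np

def contrast_stretch(img, out_min=0, out_max=255):
    """Linearly rescale intensities so the darkest pixel maps to
    out_min and the brightest to out_max."""
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return np.full_like(img, out_min)
    scaled = (img.astype(float) - lo) / (hi - lo)
    return np.rint(scaled * (out_max - out_min) + out_min).astype(np.uint8)

frame = np.array([[50, 100], [150, 200]], dtype=np.uint8)
stretched = contrast_stretch(frame)
```

Its per-pixel independence is what makes it a natural candidate for mapping onto any of the three processors: the same data-parallel loop suits FPGA pipelines, GPU threads, and CPU SIMD alike.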
Procedia PDF Downloads 121
1387 The Role of Accounting and Auditing in Anti-Corruption Strategies: The Case of ECOWAS
Authors: Edna Gnomblerou
Abstract:
Given the current scale of the corruption epidemic in West African economies, governments are seeking immediate and effective measures to reduce its prevalence within the region. Generally, accountants and auditors are expected to help organizations detect illegal practices; however, their role in the fight against corruption is sometimes limited due to the collusive nature of corruption. The Danish anti-corruption model shows that implementing additional controls over public accounts and independent, efficient audits improves transparency and increases the probability of detection. This study reviews the existing anti-corruption policies of the Economic Community of West African States (ECOWAS) to observe the role attributed to accounting, auditing, and other managerial practices in its anti-corruption drive. It further discusses the usefulness of accounting and auditing in helping anti-corruption commissions control misconduct and improve the detection of irregularities within public administration. The purpose of this initiative is to identify and assess the relevance of accounting and auditing in curbing corruption. To meet this purpose, the study was designed to answer two questions: whether accounting and auditing processes were included in the reviewed anti-corruption strategies, and if so, whether they were effective in the detection process. A descriptive research method was adopted to examine the role of accounting and auditing in West African anti-corruption strategies. The analysis reveals that proper recognition of accounting standards and implementation of financial audits are viewed as strategic mechanisms for tackling corruption. Additionally, codes of conduct, whistle-blowing, and disclosure of information to the public are among the managerial practices most commonly used throughout anti-corruption policies to address the problem effectively and efficiently. 
These observations imply that sound anti-corruption strategies cannot ignore the value of including accounting and auditing processes. On one hand, this suggests that governments should employ all possible resources to improve accounting and auditing practices in the management of public sector organizations. On the other hand, governments must ensure that accounting and auditing practices are not limited to the private sector; when properly implemented, they constitute crucial mechanisms to control and reduce corrupt incentives in the public sector.
Keywords: accounting, anti-corruption strategy, auditing, ECOWAS
Procedia PDF Downloads 259
1386 Seek First to Regulate, Then to Understand: The Case for Preemptive Regulation of Robots
Authors: Catherine McWhorter
Abstract:
Robotics is a fast-evolving field lacking comprehensive, harm-mitigating regulation; it also lacks critical data on how human-robot interaction (HRI) may affect human psychology. As most anthropomorphic robots are intended as substitutes for humans, this paper asserts that the commercial robotics industry should be preemptively regulated at the federal level, such that robots capable of embodying a victim role in criminal scenarios (“vicbots”) are prohibited until clinical studies determine their effects on the user and society. The results of these studies should then inform more permanent legislation that strives to mitigate risks of harm without infringing upon fundamental rights or stifling innovation. This paper explores these concepts through the lens of the sex robot industry, which offers some of the most realistic, interactive, and customizable robots for sale today. From approximately 2010 until 2017, some sex robot producers, such as True Companion, actively promoted ‘vicbot’ culture with personalities like “Frigid Farrah” and “Young Yoko” but received significant public backlash for fetishizing rape and pedophilia. Today, “Frigid Farrah” and “Young Yoko” appear to have vanished: sexbot producers have replaced preprogrammed vicbot personalities with one generic, customizable personality. According to the manufacturer ainidoll.com, there is only one thing the user won’t be able to program the sexbot to do – “…give you drama”. The ability to customize vicbot personas thus remains possible with today’s generic-personality sexbots and may undermine the intent of some current legislative efforts. The current debate on the effects of vicbots indicates a lack of consensus: some scholars suggest vicbots may reduce the rate of actual sex crimes, some suggest that vicbots will in fact create sex criminals, and others cite their potential for rehabilitation. 
Vicbots may have value in some instances when prescribed by medical professionals, but the overall uncertainty and lack of data further underscore the need for preemptive regulation and clinical research. Existing literature on exposure to media violence and its effects on prosocial behavior, human aggression, and addiction may serve as a launch point for specific studies into the hyperrealism of vicbots. Of course, the customization, anthropomorphism, and artificial intelligence of sexbots, and therefore of more mainstream robots, will continue to evolve. The existing sexbot industry offers an opportunity to regulate preemptively and to research answers to these and many more questions before this type of technology becomes even more advanced and mainstream. Robots pose complicated moral, ethical, and legal challenges, most of which are beyond the scope of this paper. By examining the possibility of custom vicbots in the sexbot industry and reviewing existing literature on regulation, media violence, and vicbot user effects, this paper strives to underscore the need for preemptive federal regulation prohibiting vicbot capabilities in robots, while advocating for further research into their potential for user and societal harm.
Keywords: human-robot interaction effects, regulation, research, robots
Procedia PDF Downloads 209
1385 The Effects of the GAA15 (Gaelic Athletic Association 15) on Lower Extremity Injury Incidence and Neuromuscular Functional Outcomes in Collegiate Gaelic Games: A 2 Year Prospective Study
Authors: Brenagh E. Schlingermann, Clare Lodge, Paula Rankin
Abstract:
Background: Gaelic football, hurling, and camogie are highly popular field games in Ireland. Research into the epidemiology of injury in Gaelic games has revealed that approximately three quarters of injuries occur in the lower extremity. These injuries can have player, team, and institutional impacts due to multiple factors, including financial burden and time lost from competition. Research has shown it is possible to record injury data consistently within the GAA through a closed online recording system known as the GAA injury surveillance database, and it is well established that determining the incidence of injury is the first step of injury prevention. The goals of this study were to create a dynamic GAA15 injury prevention programme addressing five key components/goals: avoid positions associated with a high risk of injury, enhance flexibility, enhance strength, optimize plyometrics, and address sport-specific agilities. These key components are internationally recognized through the Prevent Injury, Enhance Performance (PEP) programme, which has been shown to reduce ACL injuries by 74%. In national Gaelic games, the programme devised from the principles of the PEP is known as the GAA15. No injury prevention strategies have been published on this cohort in Gaelic games to date. This study investigates the effects of the GAA15 on injury incidence and neuromuscular function in Gaelic games. Methods: A total of 154 players (mean age 20.32 ± 2.84) were recruited from the GAA teams within the Institute of Technology Carlow (ITC). Preseason and post-season testing involved two objective screening tests: the Y balance test and the three hop test. Practical workshops, with ongoing liaison, were provided to the coaches on the implementation of the GAA15. 
The programme was performed before every training session and game, and the existing GAA injury surveillance database was accessed by the college sports rehabilitation athletic therapist to monitor players' injuries. Retrospective analysis of the ITC clinic records was performed in conjunction with the database analysis as a means of tracking injuries that may have been missed. The effects of the programme were analysed by comparing the intervention group's Y balance and three hop test scores to those of an age- and gender-matched control group. Results: Year 1 results revealed significant increases in neuromuscular function as a result of the GAA15. Y balance test scores for the intervention group increased in both the posterolateral (p = .005 and p = .001) and posteromedial reach directions (p = .001 and p = .001). A decrease in performance was determined for the three hop test (p = .039). Overall, twenty-five injuries were reported during the season, yielding an injury rate of 3.00 injuries/1000 hrs of participation: 1.25 injuries/1000 hrs of training and 4.25 injuries/1000 hrs of match play. Non-contact injuries accounted for 40% of the injuries sustained. Year 2 results are pending and expected in April 2016. Conclusion: It is envisaged that implementation of the GAA15 will continue to reduce the risk of injury and improve neuromuscular function in collegiate Gaelic games athletes.
Keywords: GAA15, Gaelic games, injury prevention, neuromuscular training
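The exposure-based injury rates quoted above follow the standard epidemiological formula: injuries per 1000 player-hours = injuries / exposure-hours × 1000. The sketch below reproduces the overall figure; note that the exposure-hour total is back-calculated from the reported rate for illustration, as the abstract does not state it.

```python
def injury_rate(injuries, exposure_hours):
    """Injuries per 1000 hours of athletic exposure."""
    return 1000.0 * injuries / exposure_hours

# 25 injuries over ~8333 exposure hours reproduces the reported
# overall rate of ~3.00 injuries/1000 hrs (the hour total here is
# an assumption consistent with that rate, not a reported value)
overall = injury_rate(25, 8333)
```

The training and match-play rates (1.25 and 4.25 per 1000 hrs) are computed the same way, each against its own exposure denominator, which is why match play can show a far higher rate despite fewer total hours.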
Procedia PDF Downloads 341
1384 Effect of Natural and Urban Environments on the Perception of Thermal Pain – Experimental Research Using Virtual Environments
Authors: Anna Mucha, Ewa Wojtyna, Anita Pollak
Abstract:
The environment in which an individual resides and which they observe may play a meaningful role in well-being and related constructs. Contact with natural environments may have a positive influence on individuals, improving mood and psychophysical sensations such as pain relief. Conversely, urban settings dominated by concrete elements might lead to mood decline and heightened stress levels. The same may hold for the perception of virtual environments; however, this topic requires further exploration, especially in the context of relationships with pain. These matters served as the basis for the experimental research outlined here, situated in environmental psychology and leveraging new technologies, notably virtual reality (VR), which is progressively gaining prominence in the domain of mental health. The primary objective was to investigate the impact of a simulated virtual environment mirroring a natural setting abundant in greenery on the perception of acute pain induced by thermal stimuli (high temperature), encompassing intensity, unpleasantness, and pain tolerance. Comparative analyses were conducted between the virtual natural environment (intentionally constructed in the likeness of a therapeutic garden), a virtual urban environment, and a control group without a virtual projection. Secondary objectives aimed to determine the mutual relationships among variables such as positive and negative emotions, preferences regarding virtual environments, sense of presence, and restorative experience in the context of the perception of the presented virtual environments and the induced thermal pain. The study encompassed 126 physically healthy Polish adults, with 42 individuals in each of the three comparison groups. Oculus Rift VR technology and the TSA-II neurosensory analyzer were used in the experiment. 
Alongside demographic data, participants' subjective experiences of virtual reality and pain were evaluated using the Visual Analogue Scale (VAS), the original Restorative Experience in the Virtual World questionnaire (Doświadczenie Regeneracji w Wirtualnym Świecie), and an adapted Slater-Usoh-Steed (SUS) questionnaire. The results of statistical and psychometric analyses, such as Kruskal-Wallis tests, Wilcoxon tests, and contrast analyses, underscored the positive impact of the virtual natural environment on individual pain perception and mood. The virtual natural environment outperformed the virtual urban environment and the control group without virtual projection, particularly on subjective pain components such as intensity and unpleasantness. Variables such as restorative experience, sense of presence, and virtual environment preference also proved pivotal in pain perception and alterations of the pain tolerance threshold, contingent on specific conditions. This implies considerable application potential for virtual natural environments across diverse realms of psychology and related fields, among others as a supportive analgesic approach and a form of relaxation following psychotherapeutic sessions.
Keywords: environmental psychology, nature, acute pain, emotions, virtual reality, virtual environments
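The Kruskal-Wallis step named above fits this three-group design well: it is a non-parametric omnibus test suited to ordinal VAS ratings, asking whether at least one group's score distribution differs. A minimal sketch with SciPy follows; the group means, spreads, and sample scores are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(2)
# invented VAS pain-intensity scores (0-10 scale), 42 per condition
nature  = rng.normal(3.0, 1.0, 42).clip(0, 10)
urban   = rng.normal(5.0, 1.0, 42).clip(0, 10)
control = rng.normal(5.2, 1.0, 42).clip(0, 10)

# Kruskal-Wallis: rank-based test that at least one group's
# distribution of scores differs from the others
h_stat, p_value = kruskal(nature, urban, control)
```

A significant omnibus result would then motivate the pairwise follow-ups (e.g. the Wilcoxon and contrast analyses mentioned in the abstract) to pin down which conditions differ.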
Procedia PDF Downloads 67
1383 The Influence of Human Movement on the Formation of Adaptive Architecture
Authors: Rania Raouf Sedky
Abstract:
Adaptive architecture refers to buildings specifically designed to adapt to their occupants and their environments. In designing a biologically adaptive system, observing how living creatures in nature constantly adapt to different external and internal stimuli can be a great inspiration. The issue is not just how to create a system that is capable of change, but also how to find the quality of change and determine the incentive to adapt. The research examines the possibilities of transforming spaces using the human body as an active tool. It also aims to design and build an effective dynamic structural system that can be applied at an architectural scale, integrating these elements into a new adaptive system that allows us to conceive a new way to design, build, and experience architecture in a dynamic manner. The main objective was to address the possibility of a reciprocal transformation between the user and the architectural element, so that the architecture can adapt to the user as the user adapts to the architecture. The motivation is the desire to realize the psychological benefits of an environment that can respond, and thus empathize, with human emotions through its ability to adapt to the user. Adaptive applications of kinematic structures have been discussed in architectural research for more than a decade, and they have proven effective in developing kinematic, responsive, and adaptive structures and in contributing to 'smart architecture'. A wide range of strategies have been used in building complex kinetic and robotic mechanisms to achieve convertibility and adaptability in engineering and architecture. One of the main contributions of this research is to explore how the physical environment can change its shape to accommodate different spatial configurations based on the movement of the user's body, with the main focus on the relationship between materials, shape, and interactive control systems. 
The intention is to develop a scenario in which the user can move and the structure interacts without any physical contact. This soft, shifting formal language and interaction-control technology will provide new possibilities for enriching human-environment interactions. Can we imagine a space that perceives and understands its users through physical gestures and visual expressions, and responds accordingly? Can we imagine a space whose interaction depends not only on preprogrammed operations but on real-time feedback from its users? The research also raises important questions for the future: what would be the appropriate structure to exhibit physical interaction with a dynamic world? This study concludes with a strong belief in the future of responsive kinetic structures. We imagine that they will develop beyond current structures and fundamentally change the way spaces are experienced. Such structures have obvious advantages in terms of energy performance and the ability to adapt to the needs of users. The research highlights the interface between remote sensing and a responsive environment to explore the possibility of an interactive architecture that adapts and responds to user movements.
Keywords: adaptive architecture, interactive architecture, responsive architecture, tensegrity
Procedia PDF Downloads 164
1382 Utilization of Silk Waste as Fishmeal Replacement: Growth Performance of Cyprinus carpio Juveniles Fed with Bombyx mori Pupae
Authors: Goksen Capar, Levent Dogankaya
Abstract:
According to the circular economy model, resource productivity should be maximized and waste should be reduced. Since the earth's natural resources are continuously depleted, resource recovery has gained great interest in recent years. As part of our research on the recovery and reuse of silk wastes, this paper focuses on the utilization of silkworm pupae as a fishmeal replacement, which would spare the original fishmeal raw material, namely the fish itself, and in turn contribute to the sustainable management of wild fish resources. Silk fibre is secreted by the silkworm Bombyx mori to construct a 'room' for itself during its transformation from pupa to adult moth. When the cocoons are boiled in hot water, the silk fibre becomes loose, and silk yarn is produced by combining thin silk fibres. The remaining wastes are (1) sericin protein, which is dissolved in the water, and (2) the remaining part of the cocoon, including the dead B. mori pupa. In this study, an eight-week trial was carried out to determine the growth performance of common carp juveniles fed waste silkworm pupae meal (SWPM) as a replacement for fishmeal (FM). Four isonitrogenous diets (40% CP) were prepared, replacing 0%, 33%, 50%, and 100% of the dietary FM with non-defatted silkworm pupae meal as a dietary protein source for C. carpio. Triplicate groups of 20 fish (0.92 ± 0.29 g) were fed one of the four diets twice per day. Over the 8-week period, the results showed that the diet deriving 50% of its protein from SWPM produced significantly higher (p ≤ 0.05) growth rates than the other groups. Further increases in SWPM level resulted in a decrease in growth performance, and significantly lower growth (p ≤ 0.05) was observed with the 100% SWPM diet. The study demonstrates that it is practical to replace 50% of the FM protein with SWPM, with significantly better utilization of the diet, but higher SWPM levels are not recommended for juvenile carp. 
Further experiments are underway to obtain more detailed results on the possible effects of this alternative diet on the growth performance of juvenile carp.
Keywords: Bombyx mori, Cyprinus carpio, fish meal, silk, waste pupae
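The isonitrogenous design above (40% CP with 0/33/50/100% of FM protein replaced by SWPM) amounts to a simple protein-budget calculation: each ingredient's inclusion level is set so it contributes its share of the target crude protein. The sketch below illustrates that arithmetic only; the assumed protein contents (fishmeal ~65% CP, non-defatted SWPM ~55% CP) are illustrative, not values from the study.

```python
def fm_swpm_shares(cp_target, fm_cp, swpm_cp, replacement):
    """Inclusion levels (g per 100 g diet) of fishmeal (FM) and
    silkworm pupae meal (SWPM) so each supplies its share of a
    target crude-protein (CP) level.

    `replacement` is the fraction of dietary FM protein replaced by
    SWPM protein (0.0, 0.33, 0.5, or 1.0 in the trial).
    """
    fm_protein = cp_target * (1 - replacement)   # g CP from FM per 100 g diet
    swpm_protein = cp_target * replacement       # g CP from SWPM per 100 g diet
    return fm_protein / fm_cp * 100, swpm_protein / swpm_cp * 100

# assumed CP contents: fishmeal ~65%, non-defatted SWPM ~55%
fm_g, swpm_g = fm_swpm_shares(cp_target=40, fm_cp=65, swpm_cp=55, replacement=0.5)
```

In an actual formulation the remainder of each diet would be balanced with energy sources and premixes, but the protein-share arithmetic is what keeps the four diets isonitrogenous.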
Procedia PDF Downloads 161
1381 Braille Code Matrix
Authors: Mohammed E. A. Brixi Nigassa, Nassima Labdelli, Ahmed Slami, Arnaud Pothier, Sofiane Soulimane
Abstract:
According to the World Health Organization (WHO), there are almost 285 million people with visual disability, 39 million of whom are blind. Nevertheless, there is a code that makes these people's lives easier and allows them to access information more easily: the Braille code. Several commercial devices allow braille reading; unfortunately, most of these devices are not ergonomic and are too expensive. Moreover, 90% of blind people in the world live in low-income countries. The aim of our contribution is to design an original microactuator for Braille reading that is ergonomic, inexpensive, and has the lowest possible energy consumption. Nowadays, piezoelectric devices give the best actuation at low actuation voltages. In this study, we focus on a piezoelectric (PZT) material that can bring together all these conditions. We propose to use a matrix composed of six actuators to form the 63 basic combinations of the Braille code, which contain the letters, numbers, and special characters in compliance with the standards of the Braille code. In this work, we use a finite element model in the COMSOL Multiphysics software to design and model this type of miniature actuator in order to integrate it into a test device. To define the geometry and design of our actuator, we used the physiological limits of human perception. Our results demonstrate that the piezoelectric actuator can produce a large out-of-plane deflection. We also show that the microactuators can exhibit non-uniform compression. This deformation depends on the thin-film thickness and the design of the membrane arms. The actuator composed of four arms gives the highest deflection and always produces a domed deformation at the center of the device, as required by the Braille system. The maximal deflection is estimated at around ten microns per volt (~10 µm/V).
We observed that the deflection is a linear function of the voltage, and that it depends not only on the voltage but also on the thickness of the film used and the design of the anchoring arms. We were then able to simulate the behavior of the entire matrix and thus display different characters in Braille code. We used these simulation results to build our demonstrator, which is composed of a layer of PDMS on which we place our piezoelectric material, with another layer of PDMS added to isolate the actuator. In this contribution, we compare our results in order to optimize the final demonstrator.
Keywords: Braille code, COMSOL software, microactuators, piezoelectric
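The six-dot matrix described above yields the 63 basic Braille combinations simply because each of the six actuators is either raised or flat; a minimal sketch enumerating them:

```python
from itertools import product

# Each of the six dots in a Braille cell is either raised (1) or
# flat (0); excluding the all-flat cell leaves the 63 combinations.
def braille_patterns():
    return [bits for bits in product((0, 1), repeat=6) if any(bits)]

patterns = braille_patterns()
print(len(patterns))  # 63 = 2**6 - 1
```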
Procedia PDF Downloads 360
1380 Inappropriate Prescribing Defined by START and STOPP Criteria and Its Association with Adverse Drug Events among Older Hospitalized Patients
Authors: Mohd Taufiq bin Azmy, Yahaya Hassan, Shubashini Gnanasan, Loganathan Fahrni
Abstract:
Inappropriate prescribing in older patients has been associated with increased resource utilization and adverse drug events (ADEs) such as hospitalization, morbidity and mortality. Globally, there is a lack of published data on ADEs induced by inappropriate prescribing. Our study is specific to an older population and aims to identify risk factors for ADEs and to develop a model linking ADEs to inappropriate prescribing. The study was prospective in design: computerized medical records of 302 hospitalized elderly patients aged 65 years and above in 3 public hospitals in Malaysia (Hospital Serdang, Hospital Selayang and Hospital Sungai Buloh) were studied over a 7-month period from September 2013 until March 2014. Potentially inappropriate medications and potential prescribing omissions were determined using the published and validated START-STOPP criteria. Patients who had at least one inappropriate medication were included in Phase II of the study, where ADEs were identified by a local expert consensus panel based on the published and validated Naranjo ADR probability scale. The panel also assessed whether the ADEs were causal or contributory to the current hospitalization. The association between inappropriate prescribing and ADEs (hospitalization, mortality and adverse drug reactions) was determined by identifying whether or not the former was causal or contributory to the latter. The rate of ADE avoidability was also determined. Our findings revealed that the prevalence of potentially inappropriate prescribing was 58.6%. ADEs were detected in 31 of 105 patients (29.5%) when the STOPP criteria were used to identify potentially inappropriate medication; all 31 ADEs (100%) were considered causal or contributory to admission. Of the 31 ADEs, 28 (90.3%) were considered avoidable or potentially avoidable.
After adjusting for age, sex, comorbidity, dementia, baseline activities of daily living function, and number of medications, the likelihood of a serious avoidable ADE increased significantly when a potentially inappropriate medication was prescribed (odds ratio, 11.18; 95% confidence interval [CI], 5.014-24.93; p < .001). The medications identified by the STOPP criteria are significantly associated with avoidable ADEs in older people that cause or contribute to urgent hospitalization, but contributed less towards morbidity and mortality. The findings of the study underscore the importance of preventing inappropriate prescribing.
Keywords: adverse drug events, appropriate prescribing, health services research
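For readers unfamiliar with how such estimates are reported, the sketch below shows the standard unadjusted odds-ratio calculation with a normal-approximation confidence interval on a 2×2 table. The counts are hypothetical, and the abstract's own figure (OR 11.18) comes from an adjusted multivariable model, not from this simple calculation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table: a = exposed with event,   b = exposed without,
    #            c = unexposed with event, d = unexposed without.
    # CI from the normal approximation on the log odds ratio.
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only.
print(odds_ratio_ci(31, 74, 10, 187))
```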
Procedia PDF Downloads 403
1379 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry
Authors: Nadia Belu, Laurenţiu Mihai Ionescu, Agnieszka Misztal
Abstract:
The automotive industry is one of the most important industries in the world, concerning not only the economy but also the world culture. In the present financial and economic context, this field faces new challenges posed by the current crisis: companies must maintain product quality and deliver on time and at a competitive price in order to achieve customer satisfaction. Two of the quality management techniques most recommended by the specific standards of the automotive industry for product development are Failure Mode and Effects Analysis (FMEA) and the Control Plan. FMEA is a methodology for risk management and quality improvement aimed at identifying potential causes of failure of products and processes, quantifying them by risk assessment, ranking the identified problems according to their importance, and determining and implementing the related corrective actions. Companies use Control Plans, built from the FMEA results, to evaluate a process or product for strengths and weaknesses and to prevent problems before they occur. Control Plans are written descriptions of the systems used to control and minimize product and process variation. In addition, Control Plans specify the process monitoring and control methods (for example, Special Controls) used to control Special Characteristics. In this paper we propose a computer-aided solution based on Genetic Algorithms that reduces the effort of drafting the FMEA analysis and Control Plan reports required for product launch and improves the knowledge of development teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using Genetic Algorithms to find an optimum between the RPN risk factor and the cost of production. A feature of Genetic Algorithms is that they can be used to find solutions for multi-criteria optimization problems.
In our case, the three specific FMEA risk factors are considered together with the reduction of production cost. The analysis tool generates final reports for all FMEA processes. The data obtained in the FMEA reports are automatically integrated with the other entered parameters in the Control Plan. The solution is implemented as an application running on an intranet on two servers: one containing the analysis and plan generation engine, and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting and bending processes used to manufacture chassis for buses. The advantages of the solution are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The solution we propose is a cheap alternative to other solutions on the market, using Open Source tools in its implementation.
Keywords: automotive industry, FMEA, control plan, automotive technology
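As a hedged sketch of the kind of optimization described above (not the authors' implementation), a minimal genetic algorithm can trade off the RPN (severity × occurrence × detection) of candidate corrective actions against their cost; all failure modes, actions, and numbers below are hypothetical.

```python
import random

ACTIONS = [  # (severity, occurrence, detection, cost) per option
    [(8, 6, 5, 0), (8, 3, 4, 120), (8, 2, 2, 300)],   # failure mode 1
    [(6, 7, 6, 0), (6, 4, 3, 80),  (6, 2, 2, 250)],   # failure mode 2
    [(9, 5, 7, 0), (9, 3, 5, 150), (9, 2, 3, 400)],   # failure mode 3
]

def fitness(individual, cost_weight=0.5):
    # Lower is better: total RPN plus weighted total action cost.
    rpn = sum(s * o * d for (s, o, d, _) in
              (ACTIONS[i][g] for i, g in enumerate(individual)))
    cost = sum(ACTIONS[i][g][3] for i, g in enumerate(individual))
    return rpn + cost_weight * cost

def evolve(pop_size=30, generations=50, seed=0):
    rng = random.Random(seed)
    n = len(ACTIONS)
    pop = [[rng.randrange(len(ACTIONS[i])) for i in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:               # mutation
                i = rng.randrange(n)
                child[i] = rng.randrange(len(ACTIONS[i]))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In a realistic FMEA setting the gene per failure mode would index the chosen corrective action, and the cost weight would reflect the company's tolerance for residual risk versus spending.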
Procedia PDF Downloads 408
1378 Designing Self-Healing Lubricant-Impregnated Surfaces for Corrosion Protection
Authors: Sami Khan, Kripa Varanasi
Abstract:
Corrosion is a widespread problem in several industries, and developing surfaces that resist corrosion has been an area of interest for several decades. Superhydrophobic surfaces, which combine hydrophobic coatings with surface texture, have been shown to improve corrosion resistance by creating air-filled voids that minimize the contact area between the corrosive liquid and the solid surface. However, these air voids can incorporate corrosive liquids over time, and any mechanical faults such as cracks can compromise the coating and provide pathways for corrosion. As such, there is a need for self-healing corrosion-resistant surfaces. In this work, the anti-corrosion properties of textured surfaces impregnated with a lubricant have been systematically studied. Since corrosion resistance depends on the area and the physico-chemical properties of the material exposed to the corrosive medium, the lubricant-impregnated surfaces (LIS) have been designed based on the surface tension, viscosity and chemistry of the lubricant and its spreading coefficient on the solid. All corrosion experiments were performed in a standard three-electrode cell using iron, which readily corrodes in a 3.5% sodium chloride solution. In order to obtain textured iron surfaces, thin films (~500 nm) of iron were sputter-coated onto silicon wafers textured using photolithography, and subsequently impregnated with lubricants. Results show that the corrosion rate on LIS is greatly reduced, offering over a hundred-fold improvement in corrosion protection. Furthermore, the spreading characteristics of the lubricant are found to be significant in ensuring corrosion protection: a spreading lubricant (e.g., Krytox 1506) that covers both the inside of the texture and the texture tops provides a two-fold improvement in corrosion protection compared to a non-spreading lubricant (e.g., silicone oil) that does not cover the texture tops.
To enhance the corrosion protection of surfaces coated with a non-spreading lubricant, pyramid-shaped textures have been developed that minimize exposure to the corrosive solution, and a consequent twenty-fold increase in corrosion protection is observed. Higher lubricant viscosity also correlates with greater corrosion protection. Finally, an equivalent cell-circuit model is developed for the lubricant-impregnated systems using electrochemical impedance spectroscopy. Lubricant-impregnated surfaces find attractive applications in harsh corrosive environments, especially where the ability to self-heal is advantageous.
Keywords: lubricant-impregnated surfaces, self-healing surfaces, wettability, nano-engineered surfaces
Procedia PDF Downloads 138
1377 Early Age Behavior of Wind Turbine Gravity Foundations
Authors: Janet Modu, Jean-Francois Georgin, Laurent Briancon, Eric Antoinet
Abstract:
The current practice during the repowering phase of wind turbines is the deconstruction of the existing foundations and the construction of new foundations to accept larger wind loads, or once the foundations have reached the end of their service lives. The ongoing research project FUI25 FEDRE (Fondations d'Eoliennes Durables et REpowering) therefore aims to propose scalable wind turbine foundation designs that allow reuse of the existing foundations. To undertake this research, numerical models and laboratory-scale models are currently being utilized and implemented in the GEOMAS laboratory at INSA Lyon, following the instrumentation of a reference wind turbine situated in the northern part of France. Sensors placed within both the foundation and the underlying soil monitor the evolution of stresses from the foundation's early age to the stresses during service. The results from the instrumentation form the basis of validation for both the laboratory and numerical work conducted throughout the project. The study currently focuses on the effect of the coupled Thermo-Hydro-Mechanical-Chemical (THMC) mechanisms that induce stress during the early age of the reinforced concrete foundation, and on scale-factor considerations in the replication of the reference wind turbine foundation at laboratory scale. Using THMC 3D models in the COMSOL Multiphysics software, the numerical analysis performed on both the laboratory-scale and the full-scale foundations simulates the thermal deformation, hydration, shrinkage (desiccation and autogenous) and creep, so as to predict the initial damage caused by internal processes during concrete setting and hardening. Results show a prominent effect of early age properties on the damage potential in full-scale wind turbine foundations. However, the damage potential predicted at laboratory scale shows significant differences in early age stresses in comparison to the full-scale model, depending on the spatial position in the foundation.
In addition to the well-known size-effect phenomenon, these differences may contribute to the inaccuracies encountered when predicting the ultimate deformations of the on-site foundation using laboratory-scale models.
Keywords: cement hydration, early age behavior, reinforced concrete, shrinkage, THMC 3D models, wind turbines
Procedia PDF Downloads 179
1376 Observation of Inverse Blech Length Effect during Electromigration of Cu Thin Film
Authors: Nalla Somaiah, Praveen Kumar
Abstract:
The scaling of transistors and, hence, of interconnects is very important for the enhanced performance of microelectronic devices. Scaling of devices creates significant complexity, especially in multilevel interconnect architectures, wherein current crowding occurs at the corners of interconnects. Such current crowding creates hot-spots at the respective corners, resulting in a non-uniform temperature distribution in the interconnect as well. This non-uniform temperature distribution, which is exacerbated by the continued scaling of devices, creates a temperature gradient in the interconnect. In particular, the increased current density at corners and the associated temperature rise due to Joule heating accelerate electromigration-induced failures in interconnects, especially at corners. This has been the classic reliability issue associated with metallic interconnects. Herein, it is generally understood that electromigration-induced damage can be avoided if the length of the interconnect is smaller than a critical length, often termed the Blech length. Interestingly, the effect of the non-negligible temperature gradients generated at these corners, in terms of thermomigration and electromigration-thermomigration coupling, has not attracted enough attention. Accordingly, in this work, the interplay between electromigration and temperature-gradient-induced mass transport was studied using the standard Blech structure. In this sample structure, the majority of the current is forcefully directed into the low-resistivity metallic film from a high-resistivity underlayer film, resulting in current crowding at the edges of the metallic film. In this study, a 150 nm thick Cu metallic film was deposited on a 30 nm thick W underlayer film in the Blech structure configuration. A series of Cu thin strips, with lengths of 10, 20, 50, 100, 150 and 200 μm, was fabricated. A current density of ≈ 4 × 10¹⁰ A/m² was passed through the Cu and W films at a temperature of 250 ºC.
Herein, along with the expected forward migration of Cu atoms from the cathode to the anode at the cathode end of the Cu film, backward migration from the anode towards the center of the Cu film was also observed. Interestingly, the shorter samples consistently showed enhanced migration at the cathode end, indicating the existence of an inverse Blech length effect in the presence of a temperature gradient. A finite-element-based model capturing the interplay between the electromigration and thermomigration driving forces has been developed to explain this observation.
Keywords: Blech structure, electromigration, temperature gradient, thin films
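The Blech criterion referenced above states that electromigration damage occurs only when the current density-length product j·L exceeds a critical value; a minimal sketch, using the current density from the abstract but a hypothetical critical product (the study itself does not report one):

```python
# Blech criterion: damage occurs only if j * L exceeds a critical
# product (j*L)_c, so the critical length is L_c = (j*L)_c / j.
def blech_critical_length(jl_critical, j):
    return jl_critical / j

j = 4e10      # A/m^2, the current density used in the study
jl_c = 2e6    # A/m, hypothetical critical product (not from the study)
L_c = blech_critical_length(jl_c, j)
print(L_c * 1e6, "um")  # ~50 um with this assumed (j*L)_c
```

With this assumed critical product, strips shorter than L_c would be immune to electromigration damage under the classic theory, which is why the enhanced migration observed in the shorter samples is surprising.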
Procedia PDF Downloads 260
1375 Synthesis and Preparation of Carbon Ferromagnetic Nanocontainers for Cancer Therapy
Authors: L. Szymanski, Z. Kolacinski, Z. Kamiński, G. Raniszewski, J. Fraczyk, L. Pietrzak
Abstract:
This article presents the development and demonstration of a method, and of a model device, for the hyperthermic selective destruction of cancer cells. The method is based on the synthesis and functionalization of carbon nanotubes serving as nanocontainers for ferromagnetic material. The methodology for producing the carbon ferromagnetic nanocontainers includes: the synthesis of the carbon nanotubes, their chemical and physical characterization, increasing the content of ferromagnetic material, and biochemical functionalization involving the attachment of addressing (targeting) molecules. The biochemical functionalization of the ferromagnetic nanocontainers is necessary in order to increase their selective binding to receptors presented on the surface of tumour cells. A multi-step modification procedure was finally used to attach folic acid to the surface of the ferromagnetic nanocontainers. Folic acid is a ligand of the folate receptors, which are overexpressed in tumour cells. The presence of the ligand should ensure the specificity of the interaction between the ferromagnetic nanocontainers and the tumour cells. The chemical functionalization comprises several steps: an oxidation reaction, transformation of the carboxyl groups into more reactive ester or amide groups, incorporation of a spacer molecule (linker), and attachment of the folic acid. Activation of the carboxylic groups was performed with a triazine coupling reagent (preparation of a superactive ester attached to the nanocontainers). The spacer molecules were designed and synthesized. In order to ensure the biocompatibility of the linkers, they were built from amino acids or peptides. The spacer molecules were synthesized using the SPPS method, with the synthesis performed on a 2-Chlorotrityl resin. An important feature of the linker is its length; accordingly, the synthesis of peptide linkers containing from 2 to 4 -Ala- residues was carried out. An independent synthesis of the conjugate of folic acid with 6-aminocaproic acid was performed.
The final step of the synthesis was connecting the conjugate with the spacer molecules and attaching it to the surface of the ferromagnetic nanocontainer. This article also contains information about the dedicated CVD and microwave plasma system used to produce the nanotubes and ferromagnetic nanocontainers. The first tests of the device with the hyperthermia RF generator will be presented. The frequency of the RF generator was in the ranges from 10 to 14 MHz and from 265 to 621 kHz.
Keywords: synthesis of carbon nanotubes, hyperthermia, ligands, carbon nanotubes
Procedia PDF Downloads 289
1374 Operation System for Aluminium-Air Cell: A Strategy to Harvest the Energy from Secondary Aluminium
Authors: Binbin Chen, Dennis Y. C. Leung
Abstract:
The aluminium (Al)-air cell holds a high volumetric capacity density of 8.05 Ah cm⁻³, benefiting from the trivalence of Al ions. Additional benefits of the Al-air cell are its low price and environmental friendliness. Furthermore, the Al energy conversion process is, in theory, characterized by 100% recyclability. Along with a large base of raw material reserves, Al thus attracts considerable attention as a promising material to be integrated within the global energy system. However, despite early successful applications in military services, several problems prevent Al-air cells from being widely used in civilian applications. The most serious issue is the parasitic corrosion of Al when it contacts the electrolyte. To overcome this problem, super-pure Al alloyed with various traces of metal elements is used to increase the corrosion resistance. Nevertheless, high-purity Al alloys are costly and require high energy consumption during production. An alternative approach is to add inexpensive inhibitors directly into the electrolyte. However, such additives increase the internal ohmic resistance and hamper the cell performance. So far, these methods have not provided satisfactory solutions for the problems within Al-air cells. The operation of alkaline Al-air cells also faces other, more minor problems. One of them is the formation of aluminium hydroxide in the electrolyte, which decreases the ionic conductivity of the electrolyte. Another is the carbonation process within the gas diffusion layer of the cathode, which blocks the porosity of the gas diffusion. Both of these hinder the performance of the cells. The present work addresses the above problems by building an Al-air cell operation system consisting of four components. A top electrolyte tank containing fresh electrolyte is located at a high level, so that it can drive the electrolyte flow by gravity.
A mechanically rechargeable Al-air cell is fabricated with low-cost materials, including low-grade Al, carbon paper, and PMMA plates. An electrolyte waste tank with an elaborate channel is designed to separate the hydrogen generated by the corrosion, which is collected by a gas collection device. In the first section of the research work, we investigated the performance of the mechanically rechargeable Al-air cell at a constant electrolyte flow rate, to ensure the repeatability of the experiments. Then the whole system was assembled and the feasibility of its operation was demonstrated. During the experiments, pure hydrogen was collected by the collection device, which holds potential for various applications. By collecting this by-product, a high utilization efficiency of the aluminium is achieved: considering both the electricity and the hydrogen generated, an overall utilization efficiency of around 90% or even higher is achieved under different working voltages. The fluidic electrolyte can remove the aluminium hydroxide precipitate and solve the electrolyte deterioration problem. This operation system provides a low-cost strategy for harvesting energy from abundant secondary Al. The system could also be applied to other metal-air cells and is suitable for emergency power supply, power plants and other applications. The low-cost feature implies great potential for commercialization. Further optimization, such as scaling up and refinement of the fabrication, will help to turn the technology into practical market offerings.
Keywords: aluminium-air cell, high efficiency, hydrogen, mechanical recharge
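The volumetric capacity quoted at the start of the abstract can be checked directly from Faraday's law, using the three-electron oxidation of Al together with its molar mass and density:

```python
# Faraday's-law check of the 8.05 Ah/cm^3 volumetric capacity of Al.
F = 96485.0   # C/mol, Faraday constant
n = 3         # electrons per atom: Al -> Al(3+) + 3e-
M = 26.98     # g/mol, molar mass of aluminium
rho = 2.70    # g/cm^3, density of aluminium

q_grav = n * F / M / 3600.0   # gravimetric capacity, Ah/g
q_vol = q_grav * rho          # volumetric capacity, Ah/cm^3
print(round(q_vol, 2))  # 8.05, matching the abstract
```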
Procedia PDF Downloads 286