Search results for: large scale maps
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11866

11506 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework

Authors: Abbas Raza Ali

Abstract:

Big Data technology is gradually becoming a dire need of large enterprises, which generate massive amounts of offline and streaming data, in both structured and unstructured formats, on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets; sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large complex streaming data in binary format. The communication industry is bound by regulators to maintain a history of their subscribers’ call records, where every action of a subscriber generates a record. Managing and analyzing transactional data also allows service providers to better understand their customers’ behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behavior of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the service provider's existing Intelligent Network Mediation and relational Data Warehouse onto Big Data. The service provider has a subscriber base of more than 50 million, with yearly growth of 7-10%. The end-to-end process, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber, takes no more than 10 minutes.
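The decode-stitch-aggregate flow described in the abstract can be sketched in miniature. The fixed-width record layout and field names below are hypothetical stand-ins, not the provider's actual CDR format:

```python
from collections import defaultdict

def decode_record(raw: bytes) -> dict:
    # Hypothetical fixed-width ASCII layout: call_id (8), subscriber (10), duration (6).
    text = raw.decode("ascii")
    return {
        "call_id": text[0:8].strip(),
        "subscriber": text[8:18].strip(),
        "duration": int(text[18:24]),
    }

def stitch_and_aggregate(raw_records):
    # Stitch: combine all interrogations (partial records) belonging to one call.
    calls = defaultdict(int)
    owner = {}
    for raw in raw_records:
        rec = decode_record(raw)
        calls[rec["call_id"]] += rec["duration"]
        owner[rec["call_id"]] = rec["subscriber"]
    # Aggregate: total usage per subscriber across all stitched calls.
    per_subscriber = defaultdict(int)
    for call_id, total in calls.items():
        per_subscriber[owner[call_id]] += total
    return dict(per_subscriber)
```

In the real framework each of these stages runs distributed on the Big Data platform; the sketch only shows the logical transformation order.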

Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation

Procedia PDF Downloads 152
11505 Transforming the Hazelnut Supply Chain: Opportunities and Challenges for Ontario Agri-Businesses

Authors: Kalinga Jagoda

Abstract:

With changing population demographics and consumer preferences, specialty crops present significant opportunities for Ontario agri-businesses to develop niche markets. However, the greater rewards offered by such opportunities come with comparable challenges that are driven by specific product-market attributes, as well as supply- and demand-side factors, including certain risks. Thus, initiatives to promote and support such sectors need to be informed by an understanding of the impact of these product-market and industry-specific factors on supply chain development. To this end, this project proposes to map selected specialty crop supply chains, using a suite of tested methodological approaches to evaluate their market potential, considering total supply chain costs, lead times, and responsiveness. The project will deliver comprehensive supply chain maps identifying the points of value addition and value capture that are of benefit to key stakeholders for the purposes of developing policy interventions, conducting market appraisals, and identifying industry best practices.

Keywords: supply chain management, hazelnut industry, supply chain maps, market opportunity

Procedia PDF Downloads 24
11504 The Challenges of Teaching First Year Accounting with a Lecturer-Student Ratio of 1:1248

Authors: Hanli Joubert

Abstract:

In South Africa, teaching large classes is a reality that lecturers face in most higher education institutions. When discussing the teaching of a large group, the literature normally refers to groups of about 50 to 500 students. At the University of the Free State, the first-year accounting group comprises around 1300 students. Apart from the extremely large class, the problem is exacerbated by the diversity of students’ previous schooling in accounting as well as their socio-economic backgrounds. The university scenario is further complicated by a lack of venues, compressed timetables, and a lack of resources. This study aims to investigate the challenges and effectiveness of teaching a large and diverse group of first-year accounting students by drawing on personal experience, a literature study, and interviews with other lecturers as well as students registered for first-year accounting. The results reveal that teaching first-year accounting students in a large group is not the ideal situation, but that it can be effective if it is managed correctly.

Keywords: diverse backgrounds, large groups, limited resources, first-year accounting students

Procedia PDF Downloads 32
11503 Socio-Economic Effects of Micro-Credit on Small-Scale Poultry Farmers’ Livelihood in Ado Odo-Ota Local Government Area of Ogun State, Nigeria

Authors: E. O. Fakoya, B. G. Abiona, W. O. Oyediran, A. M. Omoare

Abstract:

This study examined the socio-economic effects of micro-credit on small-scale poultry farmers’ livelihoods in the Ado Odo-Ota Local Government Area of Ogun State. A purposive sampling method was used to select eighty (80) small-scale poultry farmers that benefited from micro-credit. An interview guide was used to obtain information on the respondents’ socio-economic characteristics, sources of micro-credit, and the effects of micro-credit on their livelihoods. The results revealed that most of the respondents (77.50%) were males, while 40.00% of the respondents were between the ages of 31 and 40 years. A high proportion (72.50%) of the respondents had formal education. The major sources of micro-credit for small-scale poultry farmers were cooperative societies (47.50%) and personal savings (20.00%). The findings also revealed that micro-credit had a positive effect on the assets and livelihoods of small-scale poultry farmers. Results of a t-test analysis showed a significant difference between the effects before and after micro-credit on small-scale poultry farmers’ livelihoods at p < 0.05. The study recommends that formal lending institutions be given the necessary support by government to enable poultry farmers to have access to credit facilities in the study area.
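A before-and-after comparison like the one above is typically tested with a paired t-test. A minimal sketch of the statistic, using made-up livelihood scores rather than the study's data:

```python
from math import sqrt

def paired_t(before, after):
    # Paired t statistic: mean of the per-subject differences
    # divided by the standard error of that mean.
    d = [a - b for a, b in zip(after, before)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / sqrt(var_d / n)
```

The resulting t value is then compared with the critical value for n-1 degrees of freedom at the chosen significance level (here p < 0.05).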

Keywords: micro-credit, effects, livelihood, poultry farmers, socio-economic, small scale

Procedia PDF Downloads 413
11502 Production and Distribution Network Planning Optimization: A Case Study of Large Cement Company

Authors: Lokendra Kumar Devangan, Ajay Mishra

Abstract:

This paper describes the implementation of a large-scale SAS/OR model with significant pre-processing, scenario analysis, and post-processing work done using SAS. A large cement manufacturer with ten geographically distributed manufacturing plants for two variants of cement, around 400 warehouses serving as transshipment points, and several thousand distributor locations generating demand needed to optimize this multi-echelon, multi-modal transport supply chain separately for planning and allocation purposes. For monthly planning as well as daily allocation, the demand is deterministic. Rail and road networks connect any two points in this supply chain, creating tens of thousands of such connections. Constraints include the plants’ production capacities, transportation capacity, and rail wagon batch size constraints. Each demand point has a minimum and maximum for shipments received. Price varies at demand locations due to local factors. A large mixed integer programming model built using PROC OPTMODEL decides production at plants, demand fulfilled at each location, and the shipment route to demand locations to maximize the profit contribution. Using Base SAS, we did significant pre-processing of data and created inputs for the optimization. Using outputs generated by OPTMODEL and other processing completed using Base SAS, we generated several reports that went into their enterprise system and created tables for easy consumption of the optimization results by operations.
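The full model runs in SAS PROC OPTMODEL; as a language-neutral illustration of the same kind of decision (how much to ship on each plant-to-destination route to maximize profit contribution), the toy sketch below brute-forces a two-plant, two-destination instance. All capacities, prices, and freight costs are invented:

```python
from itertools import product

# Toy instance (all numbers invented): two plants, two demand points.
plants = {"P1": 100, "P2": 80}             # production capacity, tonnes
demand = {"D1": 90, "D2": 70}              # maximum demand, tonnes
price = {"D1": 52.0, "D2": 55.0}           # local selling price per tonne
freight = {("P1", "D1"): 4.0, ("P1", "D2"): 7.0,
           ("P2", "D1"): 6.0, ("P2", "D2"): 3.0}

def best_plan(step=10):
    """Brute-force the most profitable shipment plan in step-tonne lots
    (a crude stand-in for rail wagon batch sizes)."""
    routes = list(freight)
    choices = [range(0, min(plants[p], demand[d]) + 1, step) for p, d in routes]
    best_profit, best = float("-inf"), None
    for qty in product(*choices):
        ship = dict(zip(routes, qty))
        if any(sum(q for (p, _), q in ship.items() if p == pl) > plants[pl]
               for pl in plants):
            continue  # plant capacity exceeded
        if any(sum(q for (_, d), q in ship.items() if d == de) > demand[de]
               for de in demand):
            continue  # demand point over-supplied
        profit = sum(q * (price[d] - freight[(p, d)]) for (p, d), q in ship.items())
        if profit > best_profit:
            best_profit, best = profit, ship
    return best_profit, best
```

A real MIP solver replaces this enumeration with branch-and-bound, which is what makes the tens of thousands of routes in the actual problem tractable.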

Keywords: production planning, mixed integer optimization, network model, network optimization

Procedia PDF Downloads 40
11501 Large-scale GWAS Investigating Genetic Contributions to Queerness Will Decrease Stigma Against LGBTQ+ Communities

Authors: Paul J. McKay

Abstract:

Large-scale genome-wide association studies (GWAS) investigating genetic contributions to sexual orientation and gender identity are largely lacking and may reduce stigma experienced in the LGBTQ+ community by providing an underlying biological explanation for queerness. While there is a growing consensus within the scientific community that genetic makeup contributes – at least in part – to sexual orientation and gender identity, there is a marked lack of genomics research exploring polygenic contributions to queerness. Based on recent (2019) findings from a large-scale GWAS investigating the genetic architecture of same-sex sexual behavior, and various additional peer-reviewed publications detailing novel insights into the molecular mechanisms of sexual orientation and gender identity, we hypothesize that sexual orientation and gender identity are complex, multifactorial, and polygenic; meaning that many genetic factors contribute to these phenomena, and environmental factors play a possible role through epigenetic modulation. In recent years, large-scale GWAS studies have been paramount to our modern understanding of many other complex human traits, such as in the case of autism spectrum disorder (ASD). Despite possible benefits of such research, including reduced stigma towards queer people, improved outcomes for LGBTQ+ in familial, socio-cultural, and political contexts, and improved access to healthcare (particularly for trans populations); important risks and considerations remain surrounding this type of research. To mitigate possibilities such as invalidation of the queer identities of existing LGBTQ+ individuals, genetic discrimination, or the possibility of euthanasia of embryos with a genetic predisposition to queerness (through reproductive technologies like IVF and/or gene-editing in utero), we propose a community-engaged research (CER) framework which emphasizes the privacy and confidentiality of research participants. 
Importantly, the historical legacy of scientific research attempting to pathologize queerness (in particular, falsely equating gender variance to mental illness) must be acknowledged to ensure any future research conducted in this realm does not propagate notions of homophobia, transphobia or stigma against queer people. Ultimately, in a world where same-sex sexual activity is criminalized in 69 UN member states, with 67 of these states imposing imprisonment, 8 imposing public flogging, 6 (Brunei, Iran, Mauritania, Nigeria, Saudi Arabia, Yemen) invoking the death penalty, and another 5 (Afghanistan, Pakistan, Qatar, Somalia, United Arab Emirates) possibly invoking the death penalty, the importance of this research cannot be overstated, as finding a biological basis for queerness would directly oppose the harmful rhetoric that “being LGBTQ+ is a choice.” Anti-trans legislation is similarly widespread: in the United States in 2022 alone (as of Oct. 13), 155 anti-trans bills have been introduced preventing trans girls and women from playing on female sports teams, barring trans youth from using bathrooms and locker rooms that align with their gender identity, banning access to gender-affirming medical care (e.g., hormone-replacement therapy, gender-affirming surgeries), and imposing legal restrictions on name changes. Understanding that a general lack of knowledge about the biological basis of queerness may be a contributing factor to the societal stigma faced by gender and sexual orientation minorities, we propose the initiation of large-scale GWAS studies investigating the genetic basis of gender identity and sexual orientation.

Keywords: genome-wide association studies (GWAS), sexual and gender minorities (SGM), polygenicity, community-engaged research (CER)

Procedia PDF Downloads 50
11500 Scale-Up Process for Phyllanthus niruri Enriched Extract by Supercritical Fluid Extraction

Authors: Norsyamimi Hassim, Masturah Markom

Abstract:

Supercritical fluid extraction (SFE) has been known as a sustainable and safe extraction technique for plant extraction due to its minimal usage of organic solvents. In this study, a scale-up process for the selected herbal plant (Phyllanthus niruri) was investigated using supercritical carbon dioxide (SC-CO2) with a food-grade (ethanol-water) cosolvent. The excess ethanol content in the final dry extracts was quantified to determine the safety of the enriched extracts. The extraction yields obtained by the scale-up SFE unit were close to the predicted extraction yields, with an error of 2.92%. For component contents, the scale-up extracts showed quality comparable to the laboratory-scale experiments. The final dry extract showed an excess ethanol content of 1.56% g/g extract. The fish embryo toxicity test (FETT) on zebrafish embryos showed no toxic effects of the extract, with an LD50 value of 505.71 µg/mL. Thus, it has been proven that SFE with a food-grade cosolvent is a safe extraction technique for the production of bioactive compounds from P. niruri.

Keywords: scale-up, supercritical fluid extraction, enriched extract, toxicity, ethanol content

Procedia PDF Downloads 109
11499 Green Organic Chemistry, a New Paradigm in Pharmaceutical Sciences

Authors: Pesaru Vigneshwar Reddy, Parvathaneni Pavan

Abstract:

Green organic chemistry, one of the most actively researched topics today, has been in demand since the 1990s. Organic chemicals are among the most important starting materials for a great number of major chemical industries: the production of organic chemicals as raw materials or reagents for other applications is a major manufacturing sector, covering polymers, pharmaceuticals, pesticides, paints, artificial fibers, food additives, etc. Organic synthesis on a large scale, compared to the laboratory scale, involves the use of energy, basic chemical ingredients from the petrochemical sector, and catalysts; after the end of the reaction come separation, purification, storage, packing, distribution, etc. These processes raise many health and safety problems for workers, in addition to the environmental problems caused by the use of chemicals and their disposal as waste. Green chemistry, with its 12 principles, seeks to change the conventional ways that were used for decades to make synthetic organic chemicals and to promote the use of less toxic starting materials. It aims to increase the efficiency of synthetic methods, to use less toxic solvents, to reduce the number of stages in synthetic routes, and to minimize waste as far as practically possible. In this way, organic synthesis becomes part of the effort toward sustainable development. Green chemistry also encourages research and innovation on many practical aspects of organic synthesis in university and institutional research laboratories. By changing the methodologies of organic synthesis, health and safety will be improved not only at the small-scale laboratory level but also in industrial large-scale production through new techniques.
The three key developments in green chemistry include the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide as an oxidizing agent, and the use of hydrogen in asymmetric synthesis. Green chemistry also focuses on replacing traditional heating methods with modern ones such as microwave heating, so that the carbon footprint is reduced as far as possible. A further benefit is reduced environmental pollution through the use of less toxic reagents, the minimization of waste, and more biodegradable by-products. This paper considers some of the basic principles, approaches, and early achievements of green chemistry as a branch of chemistry, together with a summary of the green chemistry principles. A discussion of the E-factor, the old and new syntheses of ibuprofen, microwave techniques, and some recent advancements is also included.
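The E-factor mentioned above is Sheldon's simple waste metric: kilograms of waste generated per kilogram of product. A minimal illustration with invented batch masses:

```python
def e_factor(total_input_kg: float, product_kg: float) -> float:
    # Sheldon's E-factor: kilograms of waste generated per kilogram of product.
    return (total_input_kg - product_kg) / product_kg

# Hypothetical batch: 120 kg of raw materials and solvents yield 20 kg of
# product, leaving 100 kg of waste, i.e. an E-factor of 5.
```

Lower E-factors indicate greener processes; the green-chemistry literature reports that pharmaceutical syntheses historically have far higher E-factors than bulk chemicals, which is one motivation for redesigned routes such as the newer ibuprofen synthesis.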

Keywords: energy, e-factor, carbon foot print, micro-wave, sono-chemistry, advancement

Procedia PDF Downloads 267
11498 Base Deficit Profiling in Patients with Isolated Blunt Traumatic Brain Injury – Correlation with Severity and Outcomes

Authors: Shahan Waheed, Muhammad Waqas, Asher Feroz

Abstract:

Objectives: To determine the utility of base deficit in traumatic brain injury in assessing severity, and to correlate it with the conventional computed tomography scales for grading the severity of head injury. Methodology: An observational cross-sectional study conducted in a tertiary care facility from 1st January 2010 to 31st December 2012. All patients with isolated traumatic brain injury presenting to the emergency department within 24 hours of the injury were included in the study. Initial Glasgow Coma Scale (GCS) and base deficit values were taken at presentation; the patients were followed during their hospital stay, and CT brain findings were recorded and graded as per the Rotterdam scale, with the findings cross-checked by a radiologist. The Glasgow Outcome Scale (GOS) was taken at the last follow-up. Outcomes were dichotomized into favorable and unfavorable. Continuous variables with normal and non-normal distributions are reported as mean ± SD. Categorical variables are presented as frequencies and percentages. The relationship of the base deficit with GCS, GOS, CT brain findings, and length of stay was calculated using Spearman's correlation. Results: 154 patients were enrolled in the study. The mean age of the patients was 30 years, and 137 were males. As per the GCS, 34 of the brain injuries were moderate and 109 severe. 34 percent of the total had an unfavorable outcome, with a mean of 18±14. The correlation between GCS on presentation and the base deficit was significant at the 0.01 level (0.004). The correlation between the base deficit and the Rotterdam CT brain findings or length of stay was not significant. Conclusion: The base deficit was found to be a good predictor of the severity of brain injury. There was no association between the severity of injuries on CT brain as per the Rotterdam scale and the base deficit. Further studies with large sample sizes are needed to further evaluate these associations.
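Spearman's correlation, used above, is Pearson's correlation computed on ranks. A self-contained sketch (illustrative data, not the study's):

```python
def rank(xs):
    # Assign 1-based ranks; tied values share the mean of their rank positions.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation of the rank vectors.
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A value near +1 or -1 indicates a strong monotone relationship, which is the appropriate measure here since GCS, GOS, and the Rotterdam score are ordinal scales.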

Keywords: base deficit, traumatic brain injury, Rotterdam, GCS

Procedia PDF Downloads 419
11497 Simulation of Flood Inundation in Kedukan River Using HEC-RAS and GIS

Authors: Reini S. Ilmiaty, Muhammad B. Al Amin, Sarino, Muzamil Jariski

Abstract:

Kedukan River is an artificial river that serves as a drainage channel of the Boang watershed in Palembang. Its upstream and downstream ends are connected to the Musi River, and it often overflows and floods because of huge runoff discharges and the high tide water level of the Musi River. This study aimed to analyze the flood water surface profile of the Kedukan River, followed by a flood inundation simulation to determine flood-prone areas in the research area. The analysis starts from peak runoff discharge calculations using the rational method, followed by a water surface profile analysis using the HEC-RAS program, checked against manual calculations using the standard step method. The analysis is followed by a flood inundation simulation using the ArcGIS program integrated with HEC-GeoRAS. The flood inundation simulation of the Kedukan River produces inundation characteristic maps with the depth, area, and perimeter of inundation as parameters. The inundation maps are very useful in providing an overview of flood-prone areas along the Kedukan River.
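The rational method mentioned above estimates the peak runoff discharge as Q = C·i·A/360, giving Q in m³/s when the runoff coefficient C is dimensionless, the rainfall intensity i is in mm/h, and the catchment area A is in hectares. A one-line sketch with invented inputs:

```python
def rational_peak_discharge(c: float, intensity_mm_hr: float, area_ha: float) -> float:
    # Rational method: Q = C * i * A / 360, with i in mm/h and A in hectares,
    # giving the peak discharge Q in cubic metres per second.
    return c * intensity_mm_hr * area_ha / 360.0

# Invented catchment: C = 0.7, i = 60 mm/h, A = 120 ha gives Q = 14 m3/s.
```

The computed peak discharge then serves as the upstream boundary condition for the HEC-RAS water surface profile run.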

Keywords: flood modelling, HEC-GeoRAS, HEC-RAS, inundation map

Procedia PDF Downloads 487
11496 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe

Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis

Abstract:

The increasing availability of information about earth surface elevation (Digital Elevation Models, DEM) generated from different sources (remote sensing, aerial images, Lidar) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to the high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Here comes the need to merge the large-scale height maps that are typically made available for free at the worldwide level with very specific high-resolution datasets. On the other hand, the third dimension improves the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. The open-source 3D virtual globes, which are trending topics in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, 3D virtual globes such as CesiumJS do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
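One common way to resolve heterogeneous resolutions when merging terrain datasets is to sample the finest dataset that covers a query point and fall back to coarser, wider-coverage data elsewhere. The sketch below illustrates that priority rule with hypothetical datasets; the actual Terrain Builder works on tiled grids served to CesiumJS, not point samplers:

```python
# Hypothetical datasets: a coarse worldwide DEM and a fine local survey.
# "sample" is a stand-in for a real grid interpolator.
datasets = [
    {"resolution_m": 90.0, "bbox": (-180.0, -90.0, 180.0, 90.0),
     "sample": lambda lon, lat: 100.0},
    {"resolution_m": 1.0, "bbox": (11.0, 46.0, 11.5, 46.5),
     "sample": lambda lon, lat: 250.0},
]

def make_sampler(datasets):
    # Try the finest (smallest cell size) dataset first; fall back to coarser ones.
    ordered = sorted(datasets, key=lambda d: d["resolution_m"])
    def elevation(lon, lat):
        for d in ordered:
            west, south, east, north = d["bbox"]
            if west <= lon <= east and south <= lat <= north:
                return d["sample"](lon, lat)
        return None  # point outside every dataset
    return elevation
```

Tiling the merged surface into a quadtree of height-map tiles, as the paper describes, amounts to evaluating such a sampler over each tile's regular grid at the appropriate zoom level.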

Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM

Procedia PDF Downloads 403
11495 Reducing the Risk of Alcohol Relapse after Liver-Transplantation

Authors: Rebeca V. Tholen, Elaine Bundy

Abstract:

Background: Liver transplantation (LT) is considered the only curative treatment for end-stage liver disease (ESLD). The effects of alcoholism can cause irreversible liver damage, cirrhosis, and subsequent liver failure. Alcohol relapse after transplant occurs in 20-50% of patients and increases the risk for recurrent cirrhosis, organ rejection, and graft failure. Alcohol relapse after transplant has been identified as a problem among liver transplant recipients at a large urban academic transplant center in the United States. Transplantation will reverse the complications of ESLD, but it does not treat underlying alcoholism or reduce the risk of relapse after transplant. The purpose of this quality improvement project is to implement and evaluate the effectiveness of a High-Risk Alcoholism Relapse (HRAR) Scale to screen and identify patients at high risk for alcohol relapse after receiving an LT. Methods: The HRAR Scale is a predictive tool designed to determine the severity of alcoholism and the risk of relapse after transplant. The scale consists of the three variables identified as having the highest predictive power for early relapse: daily number of drinks, history of previous inpatient treatment for alcoholism, and number of years of heavy drinking. All adult liver transplant recipients at a large urban transplant center were screened with the HRAR Scale prior to hospital discharge. A zero-to-two ordinal score is assigned for each variable, and the total score ranges from zero to six. High-risk scores are between three and six. Results: Descriptive statistics revealed 25 patients were newly transplanted and discharged from the hospital during an 8-week period. 40% of patients (n=10) were identified as being at high risk for relapse and 60% at low risk (n=15). The daily number of drinks was determined by alcohol content (1 drink = 15 g of ethanol) and number of drinks per day. 60% of patients reported drinking 9-17 drinks per day, and 40% reported ≤ 9 drinks. 50% of high-risk patients reported drinking ≥ 25 years, 40% for 11-25 years, and 10% ≤ 11 years. For number of inpatient treatments for alcoholism, 50% received inpatient treatment one time, 20% ≥ 1, and 30% reported never receiving inpatient treatment. The findings reveal the importance and value of a validated screening tool as a more efficient method than other screening methods alone. Integration of a structured clinical tool will help guide the drinking history portion of the psychosocial assessment. Targeted interventions can be implemented for all high-risk patients. Conclusions: Our findings validate the effectiveness of utilizing the HRAR Scale to screen and identify patients who are at high risk for alcohol relapse post-LT. Recommendations to help maintain post-transplant sobriety include starting a transplant support group within the organization for all high-risk patients.
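The scoring rule described in the Methods (three variables, each ranked zero to two, total zero to six, high risk at three or above) can be sketched as follows. The per-variable cut-offs are illustrative readings of the bands reported in the Results, not the published instrument's exact criteria:

```python
def hrar_score(daily_drinks: int, prior_inpatient_treatments: int,
               years_heavy_drinking: float) -> int:
    # Each variable contributes an ordinal 0-2 score; the total runs 0-6.
    # Cut-offs are illustrative, inferred from the bands in the Results.
    drinks = 0 if daily_drinks < 9 else (1 if daily_drinks <= 17 else 2)
    treatments = 0 if prior_inpatient_treatments == 0 else (
        1 if prior_inpatient_treatments == 1 else 2)
    years = 0 if years_heavy_drinking < 11 else (
        1 if years_heavy_drinking <= 25 else 2)
    return drinks + treatments + years

def is_high_risk(score: int) -> bool:
    # High-risk scores are between three and six.
    return score >= 3
```

For example, a patient drinking 12 drinks per day, with one prior inpatient treatment and 30 years of heavy drinking, would score 1 + 1 + 2 = 4 and be flagged as high risk under these illustrative cut-offs.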

Keywords: alcoholism, liver transplant, quality improvement, substance abuse

Procedia PDF Downloads 96
11494 Verification & Validation of Map Reduce Program Model for Parallel K-Mediod Algorithm on Hadoop Cluster

Authors: Trapti Sharma, Devesh Kumar Srivastava

Abstract:

This paper presents an analysis of MapReduce implementations and verifies and validates the MapReduce solution model for the parallel K-Mediod algorithm on a Hadoop cluster. MapReduce is a programming model that enables the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to constant or moderately changing datasets, since the implementation effort for a deployment is usually high. MapReduce has slowly become the framework of choice for “big data”. The MapReduce model allows for systematic and rapid processing of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications such as wordcount, grep, terasort, and the parallel K-Mediod clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
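The wordcount application verified above is the canonical MapReduce example. A single-process simulation of its map, shuffle, and reduce phases (in Hadoop the phases run distributed across the cluster):

```python
from collections import defaultdict
from itertools import chain

def map_phase(chunk: str):
    # Mapper: emit a (word, 1) pair for every word in one input split.
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key, then sum each group's values.
    groups = defaultdict(int)
    for word, count in pairs:
        groups[word] += count
    return dict(groups)

def wordcount(chunks):
    # Each chunk stands in for one input split handled by one mapper.
    return reduce_phase(chain.from_iterable(map_phase(c) for c in chunks))
```

Because the mappers are independent, adding nodes shortens the map phase roughly proportionally, which matches the observed decrease in completion time as the cluster grows.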

Keywords: hadoop, mapreduce, k-mediod, validation, verification

Procedia PDF Downloads 345
11493 Vulnerability Assessment for Protection of Ghardaia City to the Inundation of M’zabWadi

Authors: Mustapha Kamel Mihoubi, Reda Madi

Abstract:

The problem of natural disasters in general, and flooding in particular, is a topic that has left a memorable mark in the world, specifically in cities and large urban areas. Torrential floods and fast flows pose a major problem in urban areas. Indeed, better management of flood risks becomes a growing necessity that must mobilize technical and scientific means to curb the adverse consequences of this phenomenon, especially in Saharan cities with an arid climate. The aim of this study is to deploy a basic calculation approach based on hydrologic and hydraulic quantification for locating the black spots in urban areas generated by flooding and for locating the areas that are vulnerable to flooding. The method is applied to the city of Ghardaia to identify areas vulnerable to inundation and to establish management and prevention maps against the risks of flooding.

Keywords: Alea, Beni Mzab, cartography, HEC-RAS, inundation, torrential, vulnerability, wadi

Procedia PDF Downloads 282
11492 Seismic Microzonation of El-Fayoum New City, Egypt

Authors: Suzan Salem, Heba Moustafa, Abd El-Aziz Abd El-Aal

Abstract:

Seismic micro hazard zonation for urban areas is the first step towards a seismic risk analysis and mitigation strategy. Essential here is to obtain a proper understanding of the local subsurface conditions and to evaluate ground-shaking effects. In the present study, an attempt has been made to evaluate the seismic hazard considering local site effects by carrying out detailed geotechnical and geophysical site characterization in El-Fayoum New City. Seismic hazard analysis and microzonation of El-Fayoum New City are addressed in three parts: in the first part, estimation of seismic hazard is done using seismotectonic and geological information. The second part deals with site characterization using geotechnical and shallow geophysical techniques. In the last part, local site effects are assessed by carrying out one-dimensional (1-D) ground response analysis using the equivalent linear method by program SHAKE 2000. Finally, microzonation maps have been prepared. The detailed methodology, along with experimental details, collected data, results and maps are presented in this paper.

Keywords: El-Fayoum, microzonation, seismotectonic, Egypt

Procedia PDF Downloads 360
11491 Pakis and Whites: A Critical View of Nadeem Aslam’s Treatment of Racism in “Maps for Lost Lovers”

Authors: Humaira Tariq

Abstract:

An issue faced by a majority of immigrants, especially those coming from third world countries, is that of racism. The natives find it very hard to accept people of another race, origin and background amongst them. History is replete with incidents where immigrants have paid a heavy price for being the odd ones out. Being an integral part of the immigrant experience, this issue of racism is an important theme in most diaspora-related fiction. The present paper endeavors to expose and explore Nadeem Aslam's handling of this theme in his novel 'Maps for Lost Lovers'. The researcher finds Aslam to take an objective stance on the issue: he shows that while the West is unwilling to accept the immigrants in its midst, the majority of the immigrants are also responsible for alienating themselves in the new environment. He shows a kind of persecution mania haunting the immigrants from third world countries, who perceive their condition to be much worse than it actually is. The paper presents a critical view of the handling of racism in Aslam's novel, where he is found to criticize not only the English for their mistreatment of Pakistani immigrants but also the judgmental attitude of the immigrants themselves.

Keywords: english, immigrants, natives, pakistani, racism

Procedia PDF Downloads 386
11490 Development and Evaluation of a Psychological Adjustment and Adaptation Status Scale for Breast Cancer Survivors

Authors: Jing Chen, Jun-E Liu, Peng Yue

Abstract:

Objective: The objective of this study was to develop a psychological adjustment and adaptation status scale for breast cancer survivors, and to examine the reliability and validity of the scale. Method: 37 breast cancer survivors were recruited for the qualitative phase; a five-dimension theoretical framework and an item pool of 150 items were derived from the interview data. In order to evaluate and select items and establish preliminary validity and reliability for the original scale, the suggestions of study group members, experts and breast cancer survivors were taken, and statistical methods were applied step by step in a sample of 457 breast cancer survivors. Results: An original 24-item scale was developed. The five dimensions, "domestic affections", "interpersonal relationship", "attitude of life", "health awareness" and "self-control/self-efficacy", explained 58.053% of the total variance. The content validity was assessed by experts; the CVI was 0.92. The construct validity was examined in a sample of 264 breast cancer survivors. The fit indexes of confirmatory factor analysis (CFA) showed good fit of the five-dimension model. The criterion-related validity of the total scale with the PTGI was satisfactory (r=0.564, p<0.001). The internal consistency reliability and test-retest reliability were tested: Cronbach's alpha (0.911) showed good internal consistency, and the intraclass correlation coefficient (ICC=0.925, p<0.001) showed satisfactory test-retest reliability. Conclusions: The scale is brief, easy to understand, and suitable for breast cancer patients whose physical strength and energy are limited.
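
The internal consistency statistic reported above can be reproduced with a short sketch; the response matrix below is hypothetical illustrative data, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (6 respondents x 4 items)
responses = np.array([[4, 5, 4, 4],
                      [3, 3, 3, 4],
                      [5, 5, 4, 5],
                      [2, 2, 3, 2],
                      [4, 4, 4, 4],
                      [3, 4, 3, 3]])
print(f"alpha = {cronbach_alpha(responses):.3f}")
```

Values above roughly 0.9, as reported for the 24-item scale, indicate good internal consistency.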

Keywords: breast cancer survivors, rehabilitation, psychological adaption and adjustment, development of scale

Procedia PDF Downloads 495
11489 Tactile Sensory Digit Feedback for Cochlear Implant Electrode Insertion

Authors: Yusuf Bulale, Mark Prince, Geoff Tansley, Peter Brett

Abstract:

A cochlear implant (CI), whose implantation has become a routine procedure over the last decades, is an electronic device that provides a sense of sound for patients who are severely or profoundly deaf. Today, cochlear implantation technology uses an electrode array (EA) implanted manually into the cochlea. The success of this implantation depends on the electrode technology and on deep insertion techniques. However, this manual insertion procedure may cause mechanical trauma which can lead to severe destruction of the delicate intracochlear structure. Accordingly, future improvement of cochlear electrode insertion requires a reduction of the excessive force applied during implantation, which causes tissue damage and trauma. This study examined the tool-tissue interaction of a large-scale prototype digit, embedded with a distributive tactile sensor and based upon a cochlear electrode, inside a large-scale prototype cochlea phantom simulating the human cochlea, which could inform small-scale digit requirements. The digit, with distributive tactile sensors embedded in a silicon substrate, was inserted into the cochlea phantom to measure digit/phantom interaction and the position of the digit, in order to minimize tissue damage and trauma during electrode insertion. The digit provided tactile information from the digit-phantom interaction such as contact status, tip penetration, obstacles, relative shape and location, contact orientation and multiple contacts. The tests demonstrated that even devices of such relatively simple design and low cost have the potential to improve cochlear implant surgery and other lumen mapping applications by providing tactile sensory feedback and thus controlling the insertion through sensing and control of the tip of the implant.
With this approach, the surgeon could minimize the tissue damage and the potential damage to the delicate structures within the cochlea caused by current manual electrode insertion. The approach can also be applied to other minimally invasive surgery applications as well as diagnosis and path navigation procedures.
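
As an illustration of how a distributive tactile sensor can yield contact-status information, the following sketch thresholds an array of sensor readings against a baseline; the element layout, readings, and threshold are all hypothetical, not the study's sensor design.

```python
import numpy as np

def contact_status(readings, baseline, threshold=0.05):
    """Flag which tactile elements along the digit are in contact: an element
    registers contact when its reading exceeds baseline + threshold."""
    readings = np.asarray(readings, dtype=float)
    in_contact = readings > (baseline + threshold)
    tip_contact = bool(in_contact[-1])   # last element taken as the digit tip
    return in_contact, tip_contact

# Hypothetical normalized readings from 8 elements along the digit
baseline = 0.10
readings = [0.11, 0.10, 0.12, 0.30, 0.28, 0.11, 0.10, 0.22]
mask, tip = contact_status(readings, baseline)
print(mask.sum(), "elements in contact; tip contact:", tip)
```

A real controller would act on this per-element map, for example slowing or redirecting the insertion when tip contact is detected.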

Keywords: cochlear electrode insertion, distributive tactile sensory feedback information, flexible digit, minimally invasive surgery, tool/tissue interaction

Procedia PDF Downloads 368
11488 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis

Authors: Gon Park

Abstract:

Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses that incorporate land-cover information into the many structural classes of green infrastructure. This study maps the green infrastructure networks of Seoul to complement these green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of the green infrastructure map, applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
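
The hub-and-link ranking step can be sketched with a standard graph library; the graph below is a hypothetical toy network, not Seoul's parcel data, and betweenness centrality stands in for the paper's combined density/distance/centrality metrics.

```python
import networkx as nx

# Hypothetical green-infrastructure graph: hubs (green areas) as nodes,
# links (corridors) carrying a distance attribute between them.
G = nx.Graph()
edges = [("park_A", "park_B", 1.2), ("park_B", "stream_C", 0.8),
         ("stream_C", "park_D", 1.5), ("park_B", "park_D", 0.6),
         ("park_D", "hill_E", 2.0)]
G.add_weighted_edges_from(edges, weight="distance")

# Rank hubs by degree and by betweenness centrality (distance as the path cost).
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="distance")
ranking = sorted(G.nodes, key=lambda n: betweenness[n], reverse=True)
print("top hub:", ranking[0])
```

Here park_B scores highest because most shortest corridors pass through it, which is exactly the kind of hub a green-link plan would prioritize.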

Keywords: cadastral data, green Infrastructure, network analysis, parcel data

Procedia PDF Downloads 176
11487 Techniques to Characterize Subpopulations among Hearing Impaired Patients and Its Impact for Hearing Aid Fitting

Authors: Vijaya K. Narne, Gerard Loquet, Tobias Piechowiak, Dorte Hammershoi, Jesper H. Schmidt

Abstract:

BEAR, which stands for Better Hearing Rehabilitation, is a large-scale project in Denmark designed and executed by three national universities, three hospitals, and the hearing aid industry with the aim of improving hearing aid fitting. A total of 1963 hearing-impaired people were included and segmented into subgroups based on hearing loss, demographics, and audiological and questionnaire data (i.e., the Speech, Spatial and Qualities of Hearing Scale [SSQ-12] and the International Outcome Inventory for Hearing Aids [IOI-HA]). With the aim of providing a better hearing-aid fit to individual patients, we applied modern machine learning techniques alongside traditional audiogram-based rule systems. Results show that age, speech discrimination scores, and audiogram configurations emerged as important parameters for characterizing sub-populations in the data-set. The attempt to characterize sub-populations reveals a clearer picture of the individual hearing difficulties encountered and the benefits derived from more individualized hearing aids.
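
A minimal sketch of the segmentation idea, clustering patients on age, speech discrimination score, and an audiogram summary, might look like the following; the features and subgroup parameters are synthetic, and k-means stands in for whichever technique the project actually used.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical features per patient: age (yr), speech discrimination score (%),
# and mean audiogram threshold (dB HL) -- three synthetic subgroups.
mild   = rng.normal([55, 90, 30], [8, 4, 5], size=(50, 3))
middle = rng.normal([68, 75, 55], [8, 6, 6], size=(50, 3))
severe = rng.normal([75, 50, 80], [7, 8, 6], size=(50, 3))
X = np.vstack([mild, middle, severe])

# Standardize so ages, scores and dB thresholds contribute comparably.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xs)
print("cluster sizes:", np.bincount(labels))
```

The recovered clusters can then be inspected against the questionnaire outcomes (SSQ-12, IOI-HA) to see which subgroup benefits from which fitting strategy.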

Keywords: hearing loss, audiological data, machine learning, hearing aids

Procedia PDF Downloads 134
11486 Economic Evaluation of an Advanced Bioethanol Manufacturing Technology Using Maize as a Feedstock in South Africa

Authors: Ayanda Ndokwana, Stanley Fore

Abstract:

Industrial prosperity and the rapid expansion of the human population in South Africa over the past two decades have increased the use of conventional fossil fuels such as crude oil, coal and natural gas to meet the country's energy demands. However, the inevitable depletion of fossil fuel reserves, volatile global oil prices and a large carbon footprint are some of the crucial reasons the South African Government needs to make a considerable investment in the development of the biofuel industry. In South Africa, this industry is still at the introductory stage, with no large-scale manufacturing plant commissioned yet. Bioethanol is a potential replacement for gasoline, the fossil fuel used in motor vehicles. Using bioethanol as a transport fuel will help the Government save the heavy foreign exchange incurred in importing oil and create many job opportunities in rural farming. In 2007, the South African Government developed the National Biofuels Industrial Strategy in an effort to provide support and attract investment in bioethanol production. However, capital investment in large-scale bioethanol production depends on a sound economic assessment of the available manufacturing technologies. The aim of this study is to evaluate the profitability of an advanced bioethanol manufacturing technology which uses maize as a feedstock in South Africa. Fiber (bran) fractionation gives this technology a number of merits, such as energy efficiency, low capital expenditure, and profitability, compared to a conventional dry-mill bioethanol technology. Quantitative techniques will be used to collect and analyze numerical data from suitable organisations in South Africa. The dependence of three profitability indicators, the Discounted Payback Period (DPP), Net Present Value (NPV) and Return on Investment (ROI), on plant capacity will be evaluated.
Profitability analysis will be done for the following plant capacities: 100,000 ton/year, 150,000 ton/year and 200,000 ton/year. The plant capacity with the shortest Discounted Payback Period, a positive Net Present Value and the highest Return on Investment warrants further consideration for capital investment.
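
The three profitability indicators can be computed as in the following sketch; the capacity, capital cost, annual cashflow and discount rate are illustrative assumptions, not figures from the study.

```python
def npv(rate, cashflows):
    """Net present value of cashflows c_0, c_1, ..., c_n (c_0 = initial outlay, negative)."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows))

def discounted_payback(rate, cashflows):
    """First year in which the cumulative discounted cashflow turns non-negative (None if never)."""
    cum = 0.0
    for t, c in enumerate(cashflows):
        cum += c / (1 + rate) ** t
        if cum >= 0:
            return t
    return None

# Hypothetical 100,000 ton/year plant: R120m capex, R28m net annual cashflow,
# 10-year horizon, 10% discount rate
flows = [-120e6] + [28e6] * 10
rate = 0.10
value = npv(rate, flows)
payback = discounted_payback(rate, flows)
roi = sum(flows[1:]) / -flows[0]   # simple (undiscounted) return on investment
print(f"NPV = R{value/1e6:.1f}m, discounted payback = {payback} yr, ROI = {roi:.2f}x")
```

Running the same calculation with capacity-dependent capex and cashflows for the 150,000 and 200,000 ton/year cases would reproduce the comparison the study proposes.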

Keywords: bioethanol, economic evaluation, maize, profitability indicators

Procedia PDF Downloads 206
11485 Enhancing the Bionic Eye: A Real-time Image Optimization Framework to Encode Color and Spatial Information Into Retinal Prostheses

Authors: William Huang

Abstract:

Retinal prostheses are currently limited to low-resolution grayscale images that lack color and spatial information. This study develops a novel real-time image optimization framework and tools to encode maximum information into prostheses that are constrained by the number of electrodes. One key idea is to localize the main objects in images while reducing unnecessary background noise through region-contrast saliency maps. A novel color depth mapping technique was developed using MiniBatchKMeans clustering and color space selection. The resulting image was downsampled using bicubic interpolation to reduce image size while preserving color quality. In comparison to current schemes, the proposed framework demonstrated better visual quality on the tested images. The use of the region-contrast saliency map showed improvements in efficacy of up to 30%. Finally, the computational speed of this algorithm is less than 380 ms on the tested cases, making real-time retinal prostheses feasible.
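
The color depth mapping step can be sketched with MiniBatchKMeans as described above; the image here is synthetic and the cluster count of eight is an assumption, standing in for whatever palette size the electrode budget allows.

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def quantize_colors(image, n_colors=8, seed=0):
    """Reduce an RGB image to n_colors by clustering its pixels and replacing
    each pixel with its cluster centroid (color depth mapping)."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(np.float32)
    km = MiniBatchKMeans(n_clusters=n_colors, n_init=3, random_state=seed).fit(pixels)
    quantized = km.cluster_centers_[km.labels_].reshape(h, w, 3)
    return quantized.astype(np.uint8)

# Synthetic 64x64 test image with random colors
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
out = quantize_colors(img, n_colors=8)
print("distinct colors:", len(np.unique(out.reshape(-1, 3), axis=0)))
```

In the full pipeline this quantized image would then be masked by the saliency map and downsampled to the electrode grid.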

Keywords: retinal implants, virtual processing unit, computer vision, saliency maps, color quantization

Procedia PDF Downloads 124
11484 Simultaneous Optimization of Design and Maintenance through a Hybrid Process Using Genetic Algorithms

Authors: O. Adjoul, A. Feugier, K. Benfriha, A. Aoussat

Abstract:

In general, issues related to design and maintenance are considered independently, yet the decisions made in these two areas influence each other. Design for maintenance is considered an opportunity to optimize the life cycle cost of a product, particularly in the nuclear or aeronautical field, where maintenance expenses represent more than 60% of life cycle costs. The design of large-scale systems starts with the product architecture: a choice of components in terms of cost, reliability, weight and other attributes, corresponding to the specifications. On the other hand, the design must take maintenance into account, in particular by improving real-time monitoring of equipment through the integration of new technologies such as connected sensors and intelligent actuators. We noticed that the different approaches used in Design for Maintenance (DFM) methods are limited to the simultaneous characterization of the reliability and maintainability of a multi-component system. This article proposes a DFM method that helps designers propose dynamic maintenance for multi-component industrial systems. The term "dynamic" refers to the ability to integrate available monitoring data to adapt the maintenance decision in real time. The goal is to maximize the availability of the system at a given life cycle cost. This paper presents an approach for the simultaneous optimization of the design and maintenance of multi-component systems. Here the design is characterized by four decision variables for each component (reliability level, maintainability level, redundancy level, and level of monitoring data). The maintenance is characterized by two decision variables (the dates of the maintenance stops and the maintenance operations to be performed on the system during these stops). The DFM model helps designers choose technical solutions for large-scale industrial products.
Large-scale refers to complex multi-component industrial systems with long life cycles, such as trains and aircraft. The method is based on a two-level hybrid algorithm for the simultaneous optimization of design and maintenance, using genetic algorithms. The first level selects a design solution for a given system, considering the life cycle cost and the reliability. The second level determines a dynamic and optimal maintenance plan to be deployed for that design solution. This level is based on the Maintenance Free Operating Period (MFOP) concept, which takes into account decision criteria such as total reliability, maintenance cost and maintenance time. Depending on the life cycle duration, the desired availability, and the desired business model (sales or rental), this tool provides visibility of overall costs and optimal product architecture.

Keywords: availability, design for maintenance (DFM), dynamic maintenance, life cycle cost (LCC), maintenance free operating period (MFOP), simultaneous optimization

Procedia PDF Downloads 93
11483 Tapered Double Cantilever Beam: Evaluation of the Test Set-up for Self-Healing Polymers

Authors: Eleni Tsangouri, Xander Hillewaere, David Garoz Gómez, Dimitrios Aggelis, Filip Du Prez, Danny Van Hemelrijck

Abstract:

The Tapered Double Cantilever Beam (TDCB) is the most commonly used test set-up to evaluate the self-healing feature of thermoset polymers autonomously activated in the presence of a crack. The TDCB is a modification of the established Double Cantilever Beam fracture mechanics set-up and is designed to provide a constant strain energy release rate with crack length under stable load evolution (mode I). In this study, the damage of virgin and autonomously healed TDCB polymer samples is evaluated considering the load-crack opening diagram, the strain maps provided by the Digital Image Correlation technique and the fractography maps given by optical microscopy. It is shown that the pre-crack introduced prior to testing (razor blade tapping), the loading rate and the length of the side groove are the features that dominate crack propagation and lead to an inconstant fracture energy release rate.
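
The constant-G property of the TDCB follows from the commonly quoted beam-theory relation G_I = 4P²m/(Eb²), where the taper holds the geometry factor m = 3a²/h³ + 1/h constant, so G_I depends on the load only. A sketch with assumed (not the study's) specimen values:

```python
def g_tdcb(load, m, modulus, width):
    """Mode-I strain energy release rate for a TDCB specimen (beam theory):
    G_I = 4 P^2 m / (E b^2), with geometry factor m = 3a^2/h^3 + 1/h held
    constant by the taper, so G_I is independent of crack length a."""
    return 4 * load ** 2 * m / (modulus * width ** 2)

# Hypothetical epoxy TDCB: E = 3.4 GPa, width b = 25 mm, taper m = 2000 m^-1
E, b, m = 3.4e9, 0.025, 2000.0     # SI units: Pa, m, 1/m
for P in (100.0, 150.0, 200.0):    # load in N; G_I scales with P^2
    print(f"P = {P:.0f} N -> G_I = {g_tdcb(P, m, E, b):.1f} J/m^2")
```

Because G_I scales with P² alone, the deviations reported above (from pre-cracking, loading rate and side-groove length) show up directly as an inconstant apparent fracture energy.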

Keywords: polymers, autonomous healing, fracture, tapered double cantilever beam

Procedia PDF Downloads 334
11482 Micro-Scale Digital Image Correlation-Driven Finite Element Simulations of Deformation and Damage Initiation in Advanced High Strength Steels

Authors: Asim Alsharif, Christophe Pinna, Hassan Ghadbeigi

Abstract:

The development of next-generation advanced high strength steels (AHSS) used in the automotive industry requires a better understanding of local deformation and damage development at the scale of their microstructures. This work is focused on dual-phase DP1000 steels and involves micro-mechanical tensile testing inside a scanning electron microscope (SEM), combined with digital image correlation (DIC), to quantify the heterogeneity of deformation in both ferrite and martensite and its evolution up to fracture. Natural features of the microstructure are used for the correlation, carried out using the Davis LaVision software. Strain localization is observed in both phases, with tensile strain values of up to 130% and 110% recorded in ferrite and martensite respectively just before final fracture. Damage initiation sites have been observed during deformation in martensite but could not be correlated with local strain values. A finite element (FE) model of the microstructure has then been developed using Abaqus to map stress distributions over representative areas of the microstructure, by forcing the model to deform as in the experiment using DIC-measured displacement maps as boundary conditions. A MATLAB code has been developed to automatically mesh the microstructure from SEM images and to map displacement vectors from the DIC onto the FE mesh. Results show a correlation of damage initiation at the interface between ferrite and martensite with local principal stress values of about 1700 MPa in the martensite phase. Damage in ferrite is now being investigated, and the results are expected to bring new insight into damage development in DP steels.
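
The mapping of DIC displacement vectors onto FE nodes is essentially scattered-data interpolation; a sketch of that step follows, noting that the displacement field and node positions are synthetic and the original work uses MATLAB rather than Python.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical DIC output: displacement vectors measured at scattered
# correlation points (here a synthetic linear field u = 0.01*x, v = 0.005*y).
rng = np.random.default_rng(1)
dic_points = rng.uniform(0, 1, size=(200, 2))
dic_u = 0.01 * dic_points[:, 0]
dic_v = 0.005 * dic_points[:, 1]

# FE boundary nodes where displacements are imposed as boundary conditions.
nodes = np.array([[0.2, 0.2], [0.8, 0.2], [0.8, 0.8], [0.2, 0.8], [0.5, 0.5]])

u_bc = griddata(dic_points, dic_u, nodes, method="linear")
v_bc = griddata(dic_points, dic_v, nodes, method="linear")
print("u at (0.5, 0.5):", round(float(u_bc[4]), 4))
```

Linear interpolation reproduces a linear displacement field exactly, which makes a synthetic field like this a convenient sanity check for the mapping code before applying it to real DIC data.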

Keywords: advanced high strength steels, digital image correlation, finite element modelling, micro-mechanical testing

Procedia PDF Downloads 124
11481 An Analysis of Economical Drivers and Technical Challenges for Large-Scale Biohydrogen Deployment

Authors: Rouzbeh Jafari, Joe Nava

Abstract:

This study reports learnings from engineering practice normally performed on large-scale biohydrogen processes. If scale-up is done properly, biohydrogen can be a reliable pathway for biowaste valorization. Most studies on biohydrogen process development have used model feedstocks to investigate process key performance indicators (KPIs). This study does not intend to compare different technologies on model feedstocks; rather, it reports the economic drivers and technical challenges, which helps in developing a road map for expanding biohydrogen economy deployment in Canada. BBA is a consulting firm responsible for the design of hydrogen production projects. Through executing these projects, work has been performed to identify, register and mitigate the technical drawbacks of large-scale hydrogen production. In this study, those learnings have been applied to the biohydrogen process. Using data collected through a comprehensive literature review, a base case was taken as a reference and several case studies were performed. Critical parameters of the process were identified, and through common engineering practice (process design, simulation, cost estimation, and life cycle assessment) the impact of these parameters on the commercialization risk matrix and class 5 cost estimates was reported. The process considered in this study is dark fermentation of food waste and woody biomass. To propose a reliable road map for developing a sustainable biohydrogen production process, the impact of critical parameters was studied on the end-to-end process. These parameters were 1) feedstock composition, 2) feedstock pre-treatment, 3) unit operation selection, and 4) the multi-product concept. A couple of emerging technologies were also assessed, such as photo-fermentation, integrated dark fermentation, and the use of ultrasound and microwaves to break down the feedstock's complex matrix and increase the overall hydrogen yield.
To properly report the impact of each parameter, the KPIs were identified as 1) hydrogen yield, 2) energy consumption, 3) secondary waste generated, 4) CO2 footprint, 5) product profile, 6) $/kg-H2 and 7) environmental impact. The feedstock is the main parameter defining the economic viability of biohydrogen production. Through parametric studies, it was found that biohydrogen production favors feedstocks richer in carbohydrates. The feedstock composition was varied by increasing one critical element (such as carbohydrate) and monitoring the evolution of the KPIs. Different cases were studied with diverse feedstocks, such as energy crops, wastewater sludge, and lignocellulosic waste. The base case process was applied to obtain reference KPI values, and modifications such as pre-treatment and feedstock mix-and-match were implemented to investigate KPI changes. The complexity of the feedstock is the main bottleneck in the successful commercial deployment of the biohydrogen process as a reliable pathway for waste valorization. Hydrogen yield, reaction kinetics, and the performance of key unit operations are highly impacted as feedstock composition fluctuates during the lifetime of the process or from one case to another. In this case, the multi-product concept becomes more relevant: the process is not designed to produce only one target product, such as biohydrogen, but two or more products (biohydrogen and biomethane, or biochemicals). This new approach is being investigated by the BBA team, and the results will be shared in another scientific contribution.

Keywords: biohydrogen, process scale-up, economic evaluation, commercialization uncertainties, hydrogen economy

Procedia PDF Downloads 78
11480 The Impact of Electronic Marketing on the Quality Banking Services

Authors: Ahmed Ghalem

Abstract:

This research gathers information about several public and private economic institutions. The information highlights the large and useful role of adopting electronic marketing, which is widespread and easy to use among community members at the local and international levels, generates large sums of money with little effort and in little time, and also satisfies customers. Despite these advantages, such practices nevertheless run the risk of losing large amounts of money in a moment or in a short time.

Keywords: economic, finance, bank, development, marketing

Procedia PDF Downloads 67
11479 Practical Experiences in the Development of a Lab-Scale Process for the Production and Recovery of Fucoxanthin

Authors: Alma Gómez-Loredo, José González-Valdez, Jorge Benavides, Marco Rito-Palomares

Abstract:

Fucoxanthin is a carotenoid that exerts multiple beneficial effects on human health, including antioxidant, anti-cancer, anti-diabetic and anti-obesity activity, making the development of a whole process for its production and recovery an important contribution. In this work, the lab-scale production and purification of fucoxanthin from Isochrysis galbana have been studied. In batch cultures, low light intensities (13.5 μmol/m2s) and bubble agitation were the best conditions for production of the carotenoid, with product yields of up to 0.143 mg/g. After ethanolic extraction of fucoxanthin from the biomass and hexane partitioning, further recovery and purification of the carotenoid were accomplished by means of an alcohol-salt Aqueous Two-Phase System (ATPS) extraction followed by an ultrafiltration (UF) step. An ATPS comprising ethanol and potassium phosphate (Volume Ratio (VR)=3; Tie-Line Length (TLL) 60% w/w) presented a fucoxanthin recovery yield of 76.24 ± 1.60%, the highest among the studied systems, and was able to remove 64.89 ± 2.64% of the carotenoid and chlorophyll contaminants. For UF, the addition of ethanol to the recovered ATPS stream, to a final proportion of 74.15% (w/w), reduced the protein content by approximately 16%, increasing product purity with a recovery yield of about 63% of the compound in the permeate stream. Considering the production, extraction and primary recovery (ATPS and UF) steps, around a 45% global fucoxanthin recovery should be expected. Although other purification technologies, such as Centrifugal Partition Chromatography, are able to reach fucoxanthin recoveries of up to 83%, the process developed in the present work does not require large volumes of solvents or expensive equipment. Moreover, it has the potential to be scaled up to commercial scale and represents a cost-effective strategy when compared to traditional separation techniques like chromatography.
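
The global recovery figure follows from multiplying the stepwise yields. In this sketch the extraction/partition yield is an assumed illustrative number chosen so that the cumulative value lands near the reported ~45%, while the ATPS and UF yields are those given above.

```python
# Stepwise recovery yields (fractions) along the fucoxanthin process train.
steps = {
    "ethanol extraction + hexane partition": 0.93,   # assumed illustrative value
    "ATPS (ethanol/phosphate, VR=3)":        0.7624, # reported above
    "ultrafiltration":                       0.63,   # reported above
}

overall = 1.0
for name, y in steps.items():
    overall *= y
    print(f"{name}: step {y:.1%}, cumulative {overall:.1%}")
```

Chaining yields this way makes explicit where the largest losses occur, here in the UF step, which is where further optimization would pay off most.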

Keywords: aqueous two-phase systems, fucoxanthin, Isochrysis galbana, microalgae, ultrafiltration

Procedia PDF Downloads 399
11478 Assessment of E-Learning Facilities in Open and Distance Learning and Information Need by Students

Authors: Sabo Elizabeth

Abstract:

Electronic learning is an increasingly popular learning approach in higher educational institutions due to the vast growth of internet technology, and it is important in human capital development. An investigation of open and distance learning (ODL) and e-learning facilities and the information needed by ODL students was carried out in Jalingo, Nigeria. Structured questionnaires were administered to 70 registered ODL students of the National Open University of Nigeria (NOUN). Information sourced from the respondents covered demographic, economic and institutional variables. Data collected for the demographic variables were computed as frequency counts and percentages. The effectiveness of the ODL facilities and the information needs of the students were assessed on three- or four-point Likert rating scales. Findings indicated that there are more men than women, a large proportion of the respondents are married, and there are more mature students in ODL than youths. A high proportion of the ODL students hold qualifications higher than the secondary school certificate. The proportion of computer-literate ODL students was high, yet a large number of the students do not own a laptop computer. Inadequate e-books, hard-copy books and reference materials, and internet gadgets are factors that limit the utilization of e-learning facilities in the study area. Inadequate computer facilities and power back-up caused inconvenience and delays in administering and using the e-learning facilities. To a high extent, the open and distance learning students needed information on the university timetable and schedule of activities and on availability of and access to books (hard copies and e-books) and reference materials. The respondents emphasized that contact with course coordinators via the internet would deliver better learning and academic performance.
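
Likert-scale assessments of this kind are typically summarized by item means against a cut-off; a sketch with hypothetical responses and a commonly used decision rule (the responses and the 2.50 cut-off are assumptions, not the survey's data):

```python
import statistics

# Hypothetical responses on a 4-point Likert scale (1 = "not at all",
# 4 = "to a very high extent") for information-need items.
responses = {
    "university timetable and schedule": [4, 4, 3, 4, 3, 4, 4, 3],
    "access to books and e-books":       [4, 3, 4, 4, 4, 3, 4, 4],
    "contact with course coordinators":  [3, 4, 4, 3, 4, 4, 3, 4],
}

# A common decision rule: mean >= 2.50 on a 4-point scale -> "needed to a high extent".
cutoff = 2.50
for item, scores in responses.items():
    mean = statistics.mean(scores)
    verdict = "high extent" if mean >= cutoff else "low extent"
    print(f"{item}: mean = {mean:.2f} ({verdict})")
```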

Keywords: open and distance learning, information required, electronic books, internet gadgets, Likert scale test

Procedia PDF Downloads 308
11477 Multiple Fusion Based Single Image Dehazing

Authors: Joe Amalraj, M. Arunkumar

Abstract:

Haze is an atmospheric phenomenon that significantly degrades the visibility of outdoor scenes. This is mainly due to the atmospheric particles that absorb and scatter the light. This paper introduces a novel single-image approach that enhances the visibility of such degraded images. The method is a fusion-based strategy that derives two inputs from the original hazy image by applying a white balance and a contrast-enhancing procedure. To blend the information of the derived inputs effectively and preserve the regions with good visibility, we filter their important features by computing three measures (weight maps): luminance, chromaticity, and saliency. To minimize the artifacts introduced by the weight maps, our approach is designed in a multiscale fashion, using a Laplacian pyramid representation. This paper demonstrates the utility and effectiveness of a fusion-based technique for dehazing based on a single degraded image. The method performs in a per-pixel fashion and is straightforward to implement. The experimental results demonstrate that the method yields results comparable to, and even better than, more complex state-of-the-art techniques, while having the advantage of being appropriate for real-time applications.
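
The weighted Laplacian-pyramid blending can be sketched in a few lines; this minimal version uses nearest-neighbor resampling instead of proper Gaussian filtering, and the inputs and weight maps are synthetic stand-ins for the white-balanced and contrast-enhanced derivations.

```python
import numpy as np

def downsample(img):
    return img[::2, ::2]

def upsample(img, shape):
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return up[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    pyr, cur = [], img.astype(float)
    for _ in range(levels - 1):
        small = downsample(cur)
        pyr.append(cur - upsample(small, cur.shape))
        cur = small
    pyr.append(cur)                       # coarsest residual
    return pyr

def gaussian_pyramid(img, levels):
    pyr, cur = [], img.astype(float)
    for _ in range(levels - 1):
        pyr.append(cur)
        cur = downsample(cur)
    pyr.append(cur)
    return pyr

def fuse(inputs, weights, levels=3):
    """Multiscale fusion: blend the Laplacian pyramids of the inputs with the
    Gaussian pyramids of their (normalized) weight maps, then collapse."""
    total = sum(weights)
    weights = [w / (total + 1e-12) for w in weights]
    fused = None
    for img, w in zip(inputs, weights):
        lp = laplacian_pyramid(img, levels)
        gp = gaussian_pyramid(w, levels)
        blend = [l * g for l, g in zip(lp, gp)]
        fused = blend if fused is None else [f + b for f, b in zip(fused, blend)]
    out = fused[-1]
    for lap in reversed(fused[:-1]):      # collapse the fused pyramid
        out = upsample(out, lap.shape) + lap
    return out

# Two synthetic "derived inputs" and complementary weight maps
a = np.full((8, 8), 0.2); b = np.full((8, 8), 0.8)
wa = np.zeros((8, 8)); wa[:, :4] = 1.0
wb = 1.0 - wa
out = fuse([a, b], [wa, wb])
print(out[0, 0], out[0, 7])
```

With complementary weights, the left half of the output follows input a and the right half input b, which is exactly the artifact-reducing blending behavior the pyramid scheme is used for.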

Keywords: single image de-hazing, outdoor images, enhancing, DSP

Procedia PDF Downloads 386