Search results for: large scale maps
11799 Simulation of Flood Inundation in Kedukan River Using HEC-RAS and GIS
Authors: Reini S. Ilmiaty, Muhammad B. Al Amin, Sarino, Muzamil Jariski
Abstract:
Kedukan River is an artificial river that serves as a drainage channel for the Boang Watershed in Palembang. Its upstream and downstream ends are both connected to the Musi River, and it often overflows and floods because of large runoff discharges and the high tide water level of the Musi River. This study aimed to analyze the flood water surface profile of Kedukan River, followed by a flood inundation simulation to determine flood-prone areas in the research area. The analysis starts from peak runoff discharge calculations using the rational method, followed by water surface profile analysis using the HEC-RAS program, checked against manual calculations using the standard step method. The analysis continues with a flood inundation simulation using the ArcGIS program integrated with HEC-GeoRAS. The flood inundation simulation of Kedukan River produces inundation characteristic maps with inundation depth, area, and perimeter as parameters. The inundation maps are very useful in providing an overview of flood-prone areas along Kedukan River.
Keywords: flood modelling, HEC-GeoRAS, HEC-RAS, inundation map
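The rational method mentioned above estimates peak discharge as Q = C·i·A. A minimal Python sketch; the runoff coefficient, rainfall intensity, and catchment area below are illustrative assumptions, not the study's data:

```python
def rational_peak_discharge(c, i_mm_per_hr, area_km2):
    """Rational method: Q = 0.278 * C * i * A (Q in m^3/s, i in mm/h, A in km^2)."""
    return 0.278 * c * i_mm_per_hr * area_km2

# Illustrative values only -- not the Kedukan River parameters.
q_peak = rational_peak_discharge(c=0.7, i_mm_per_hr=120.0, area_km2=5.0)
print(f"Peak discharge: {q_peak:.1f} m^3/s")
```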
Procedia PDF Downloads 512
11798 Interacting with Multi-Scale Structures of Online Political Debates by Visualizing Phylomemies
Authors: Quentin Lobbe, David Chavalarias, Alexandre Delanoe
Abstract:
The ICT revolution has given birth to an unprecedented world of digital traces and has impacted a wide range of knowledge-driven domains such as science, education, and policy making. Nowadays, we are daily fueled by unlimited flows of articles, blogs, messages, tweets, etc. The internet itself can thus be considered an unsteady hypertextual environment where websites emerge and expand every day. But there are structures inside knowledge. A given text can always be studied in relation to others or in light of a specific socio-cultural context. By way of their textual traces, human beings are calling each other out: hypertext citations, retweets, vocabulary similarity, etc. We are in fact the architects of a giant web of elements of knowledge whose structures and shapes convey their own information. The global shapes of these digital traces represent a source of collective knowledge, and the question of their visualization remains an open challenge. How can we explore, browse, and interact with such shapes? In order to navigate across these growing constellations of words and texts, interdisciplinary innovations are emerging at the crossroads of the social and computational sciences. In particular, complex systems approaches now make it possible to reconstruct the hidden structures of textual knowledge by means of multi-scale objects of research such as semantic maps and phylomemies. Phylomemy reconstruction is a generic method related to the co-word analysis framework. Phylomemies aim to reveal the temporal dynamics of large corpora of textual contents by performing inter-temporal matching on extracted knowledge domains in order to identify their conceptual lineages. This study addresses the question of visualizing the global shapes of online political discussions related to the French presidential and legislative elections of 2017. We aim to build phylomemies on top of a dedicated collection of thousands of French political tweets enriched with archived contemporary news web articles. Our goal is to reconstruct the temporal evolution of online debates fueled by each political community during the elections. To that end, we introduce an iterative data exploration methodology implemented and tested within the free software Gargantext. There, we combine synchronic and diachronic axes of visualization to reveal the dynamics of our corpora of tweets and web pages as well as their inner syntagmatic and paradigmatic relationships. In doing so, we aim to provide researchers with innovative methodological means to explore online semantic landscapes in a collaborative and reflective way.
Keywords: online political debate, French election, hyper-text, phylomemy
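Phylomemy reconstruction links knowledge domains (clusters of co-occurring terms) across consecutive time slices. A minimal sketch of the inter-temporal matching step using Jaccard similarity; the threshold, the clusters, and the similarity measure are illustrative assumptions, not Gargantext's actual pipeline:

```python
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

def match_domains(period_t, period_t1, threshold=0.3):
    """Link term clusters of period t to those of period t+1 (conceptual lineages)."""
    links = []
    for i, dom_a in enumerate(period_t):
        for j, dom_b in enumerate(period_t1):
            s = jaccard(dom_a, dom_b)
            if s >= threshold:
                links.append((i, j, round(s, 2)))
    return links

# Illustrative clusters of co-occurring terms from two time slices.
t0 = [{"election", "debate", "candidate"}, {"economy", "tax", "budget"}]
t1 = [{"election", "vote", "candidate"}, {"economy", "budget", "deficit"}]
print(match_domains(t0, t1))
```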
Procedia PDF Downloads 186
11797 Optimal Control of Generators and Series Compensators within Multi-Space-Time Frame
Authors: Qian Chen, Lin Xu, Ping Ju, Zhuoran Li, Yiping Yu, Yuqing Jin
Abstract:
The operation of the power grid is becoming more and more complex and difficult due to its rapid development towards high voltage, long distance, and large capacity. For instance, many large-scale wind farms have been connected to the power grid, whose fluctuation and randomness are very likely to affect the stability and safety of the grid. Fortunately, many new types of equipment based on power electronics have been applied to the power grid, such as the UPFC (Unified Power Flow Controller), TCSC (Thyristor Controlled Series Compensation), STATCOM (Static Synchronous Compensator), and so on, which can help to deal with the problem above. Compared with traditional equipment such as generators, the new controllable devices, represented by FACTS (Flexible AC Transmission System) devices, have more accurate control ability and respond faster, but they are too expensive to use widely. Therefore, on the basis of a comparison and analysis of the control characteristics of traditional equipment and new controllable equipment on both time and space scales, a coordinated optimizing control method within a multi-space-time frame is proposed in this paper to bring both kinds of advantages into play, improving both control ability and economic efficiency. Firstly, the coordination of different spatial scales of the grid is studied, focusing on the fluctuation caused by large-scale wind farms connected to the power grid. With generators, FSC (Fixed Series Compensation), and TCSC, the coordination between a two-layer regional power grid and its subgrid is studied in detail. The coordination control model is built, the corresponding scheme is proposed, and the conclusion is verified by simulation. By analysis, the interface power flow can be controlled by the generators, and the specific line power flow between the two-layer regions can be adjusted by FSC and TCSC. The smaller the interface power flow adjusted by the generators, the bigger the control margin of the TCSC; on the other hand, the total consumption of the generators is much higher. Secondly, the coordination of different time scales is studied to trade off the total consumption of the generators against the control margin of the TCSC, so that the minimum control cost can be acquired. The coordination between two-layer ultra-short-term correction and AGC (Automatic Generation Control) is studied with generators, FSC, and TCSC. The optimal control model is formulated, a genetic algorithm is selected to solve the problem, and the conclusion is verified by simulation. Finally, the aforementioned method within the multi-time-space scale is analyzed with practical cases and simulated on the PSASP (Power System Analysis Software Package) platform. The correctness and effectiveness are verified by the simulation results. Moreover, this coordinated optimizing control method can contribute to the decrease of control cost and will provide a reference for following studies in this field.
Keywords: FACTS, multi-space-time frame, optimal control, TCSC
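The paper solves its coordination model with a genetic algorithm. A generic minimal GA sketch in Python; the two decision variables (a generator adjustment and a TCSC setting) and the quadratic cost function are illustrative stand-ins, not the paper's actual model:

```python
import random

def control_cost(x):
    """Illustrative stand-in objective: generator adjustment cost plus a
    penalty for eating into the TCSC control margin."""
    gen_adj, tcsc_set = x
    return 2.0 * gen_adj**2 + 0.5 * (1.0 - tcsc_set)**2

def ga_minimize(f, bounds, pop=40, gens=100, mut=0.1):
    P = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=f)
        elite = P[: pop // 2]                                 # selection
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]       # crossover
            child = [min(max(v + random.gauss(0, mut), lo), hi)
                     for v, (lo, hi) in zip(child, bounds)]   # mutation
            children.append(child)
        P = elite + children
    return min(P, key=f)

best = ga_minimize(control_cost, bounds=[(0, 1), (0, 1)])
print(best, control_cost(best))
```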
Procedia PDF Downloads 267
11796 Building Information Modelling Based Value for Money Assessment in Public-Private Partnership
Authors: Guoqian Ren, Haijiang Li, Jisong Zhang
Abstract:
Over the past 40 years, urban development has undergone large-scale, high-speed expansion, beyond what was previously considered normal and in a manner not proportionally related to population growth or physical considerations. With more scientific and refined decision-making in the urban construction process, new urbanization approaches, aligned with public-private partnerships (PPPs), which evolved in the early 1990s, have become acceptable and, in some situations, even better solutions to outstanding urban municipal construction projects, especially in developing countries. However, as the main driving force for delivering urban public services, PPPs remain problematic regarding the value for money (VFM) process in most large-scale construction projects. This paper therefore reviews recent PPP articles in popular project management journals and relevant toolkits, published in the last 10 years, to identify the indicators that influence VFM within PPPs across regions. With increasing concerns about profitability and environmental and social impacts, the current PPP structure requires a more integrated platform to manage multi-performance project life cycles. Building information modelling (BIM), a popular approach to the procurement process in AEC sectors, provides the potential to ensure VFM while also working in tandem with the semantic approach to holistically measure life cycle costs (LCC) and achieve better sustainability. This paper suggests that BIM applied to the entire PPP life cycle could support holistic decision-making regarding VFM processes and thus meet service targets.
Keywords: public-private partnership, value for money, building information modelling, semantic approach
Procedia PDF Downloads 209
11795 Development of Algorithms for Solving and Analyzing Special Problems Transports Type
Authors: Dmitri Terzi
Abstract:
The article presents the results of an algorithmic study of a special optimization problem of the transport type (the traveling salesman problem): 1) To solve the problem, a new natural algorithm has been developed based on the decomposition of the initial data into convex hulls; it has a number of advantages: it is applicable to fairly large dimensions, does not require a large amount of memory, and has fairly good performance. The relevance of the algorithm lies in the fact that, in practice, programs for problems with no more than twenty traversal points are widely used, while for large-scale problems, algorithms and programs of this kind are hard to come by. The proposed algorithm is natural because the optimal solution found by an exact algorithm is not always feasible due to the presence of many other factors that may require some additional restrictions. 2) Another, inverse, problem solved here is to describe a class of traveling salesman problems that have a predetermined optimal solution. The constructed Algorithm 2 allows us to characterize the structure of traveling salesman problems, as well as to construct test problems for evaluating the effectiveness of algorithms, among other purposes. 3) The appendix presents a software implementation of Algorithm 1 (in MATLAB), which can be used to solve practical problems, as well as in the educational process on operations research and optimization methods.
Keywords: traveling salesman problem, solution construction algorithm, convex hulls, optimality verification
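The abstract does not spell out how the convex hulls are recombined into a tour, so the sketch below is only one plausible reading: peel the point set into nested hulls, then splice interior points into the tour by nearest insertion. Python with SciPy, rather than the paper's MATLAB:

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_layers(points):
    """Peel the point set into nested convex hulls (outermost first)."""
    idx = np.arange(len(points))
    layers = []
    while len(idx) >= 3:
        hull = ConvexHull(points[idx])
        layers.append(idx[hull.vertices].tolist())
        idx = np.delete(idx, hull.vertices)
    if len(idx):
        layers.append(idx.tolist())  # 1-2 leftover interior points
    return layers

def hull_tour(points):
    """Start from the outer hull, then splice each inner point in after
    its nearest already-toured point (a simple nearest-insertion rule)."""
    layers = hull_layers(points)
    tour = layers[0][:]
    for layer in layers[1:]:
        for p in layer:
            j = min(range(len(tour)),
                    key=lambda k: np.linalg.norm(points[tour[k]] - points[p]))
            tour.insert(j + 1, p)
    return tour

rng = np.random.default_rng(0)
cities = rng.random((30, 2))
print(hull_tour(cities))
```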
Procedia PDF Downloads 73
11794 Assessing an Instrument Usability: Response Interpolation and Scale Sensitivity
Authors: Betsy Ng, Seng Chee Tan, Choon Lang Quek, Peter Looker, Jaime Koh
Abstract:
The purpose of the present study was to determine the particular scale rating that stands out for an instrument. The instrument was designed to assess student perceptions of various learning environments, namely face-to-face, online, and blended. The original instrument had 5-point Likert items (1 = strongly disagree and 5 = strongly agree). Alternate versions were modified with a 6-point Likert scale and a bar scale rating. Participants were undergraduates at a local university who were involved in the usability testing of the instrument in an electronic setting. They were presented with the 5-point, 6-point, and percentage-bar (100-point) scale ratings, in response to their perceptions of learning environments. The 5-point and 6-point Likert scales were presented in the form of radio button controls for each number, while the percentage-bar scale was presented with a sliding selection. Among these responses, the 6-point Likert scale emerged as the best overall. When participants were confronted with the 5-point items, they either chose 3 or 4, suggesting that data loss could occur due to the insensitivity of the instrument. The insensitivity of the instrument could be due to the discrete options, as evidenced by response interpolation. To avoid the constraint of discrete options, the percentage-bar scale rating was tested, but the participant responses were not well interpolated. The bar scale might have provided a variety of responses without the constraint of a set of categorical options, but it seemed to reflect a lack of perceived and objective accuracy. The 6-point Likert scale was more likely to reflect a respondent’s perceived and objective accuracy as well as higher sensitivity. This finding supported the conclusion that 6-point Likert items provided a more accurate measure of the participant’s evaluation. The 5-point and bar scale ratings might not be accurately measuring the participants’ responses. This study highlighted the importance of the respondent’s perception of accuracy, the respondent’s true evaluation, and the scale’s ease of use. Implications and limitations of this study were also discussed.
Keywords: usability, interpolation, sensitivity, Likert scales, accuracy
Procedia PDF Downloads 406
11793 Seismic Microzonation of El-Fayoum New City, Egypt
Authors: Suzan Salem, Heba Moustafa, Abd El-Aziz Abd El-Aal
Abstract:
Seismic microhazard zonation for urban areas is the first step towards a seismic risk analysis and mitigation strategy. Essential here is to obtain a proper understanding of the local subsurface conditions and to evaluate ground-shaking effects. In the present study, an attempt has been made to evaluate the seismic hazard considering local site effects by carrying out detailed geotechnical and geophysical site characterization in El-Fayoum New City. Seismic hazard analysis and microzonation of El-Fayoum New City are addressed in three parts: in the first part, estimation of seismic hazard is done using seismotectonic and geological information. The second part deals with site characterization using geotechnical and shallow geophysical techniques. In the last part, local site effects are assessed by carrying out one-dimensional (1-D) ground response analysis using the equivalent linear method with the program SHAKE2000. Finally, microzonation maps have been prepared. The detailed methodology, along with experimental details, collected data, results, and maps, is presented in this paper.
Keywords: El-Fayoum, microzonation, seismotectonic, Egypt
Procedia PDF Downloads 381
11792 Effect of Residential Block Scale Envelope in Buildings Energy Consumption: A Vernacular Case Study in an Iranian Urban Context
Authors: M. Panahian
Abstract:
A global challenge of paramount significance today is the issue of devising innovative solutions to tackle environmental problems, as well as more intelligent and far-sighted consumption and management of natural resources. Changes in the global climate resulting from the burning of fossil fuels and the rise in the level of energy consumption are a few examples of environmental issues detrimental to any form of life on earth, and they are aggravated year by year. Overall, energy-efficient designs and construction strategies can be studied at three scales: building, block, and city. Nevertheless, as the available literature suggests, the greatest emphasis has been on the building and city scales, and little has been done on energy-efficient design at the block scale. Therefore, the aim of the current research is to investigate the influence of the residential block scale envelope on energy consumption in buildings. To this end, a residential block case study has been selected in the city of Isfahan, Iran, situated in a hot and dry climate with cold winters. Eventually, the most effective variables in energy consumption, concerning the block scale envelope, will be identified.
Keywords: sustainability, passive energy saving solutions, residential block scale, energy efficiency
Procedia PDF Downloads 241
11791 Pakis and Whites: A Critical View of Nadeem Aslam’s Treatment of Racism in “Maps for Lost Lovers”
Authors: Humaira Tariq
Abstract:
An issue faced by a majority of immigrants, especially those coming from third world countries, is that of racism. The natives find it very hard to accept people of another race, origin, and background amongst them. History is replete with incidents where immigrants have paid a heavy price for being the odd ones out. Being an integral part of the immigrant experience, this issue of racism is an important theme in most diaspora-related fiction. The present paper will endeavor to expose and explore Nadeem Aslam’s handling of this theme in his novel, 'Maps for Lost Lovers'. The researcher has found Aslam to take an objective stance on this issue, as he shows that where the West is unwilling to accept the immigrants in its midst, the majority of the immigrants are also responsible for alienating themselves in the new environment. He shows a kind of persecution mania haunting the immigrants from third world countries, who feel their condition to be much worse than it actually is. The paper presents a critical view of the handling of racism in Aslam’s novel, where he is found to criticize not only the English for their mistreatment of Pakistani immigrants but also to disapprove of the judgmental attitude of the immigrants.
Keywords: English, immigrants, natives, Pakistani, racism
Procedia PDF Downloads 409
11790 Soil Improvement through Utilization of Calcifying Bhargavaea cecembensis N1 in an Affordable Whey Culture Medium
Authors: Fatemeh Elmi, Zahra Etemadifar
Abstract:
Improvement of soil mechanical properties is crucial before its use in construction, as the low mechanical strength and unstable structure of soils in many parts of the world can lead to the destruction of engineering infrastructure, resulting in financial and human losses. Although conventional methods, such as chemical injection, are often utilized to enhance soil strength and stiffness, they are generally expensive, require heavy machinery, cause significant environmental effects due to chemical usage, and disrupt urban infrastructure. Moreover, they are not suitable for treating large volumes of soil. Recently, an alternative method to improve various soil properties, including strength, hardness, and permeability, has received much attention: the application of biological methods. One of the most widely used is biocementation, which is based on the microbial precipitation of calcium carbonate crystals by ureolytic bacteria. However, there are still limitations to its large-scale use that need to be resolved before it can be commercialized, and these issues have not received enough attention in prior research. One limitation of MICP (microbially induced calcium carbonate precipitation) is that microorganisms cannot operate effectively in harsh and variable environments, unlike the controlled conditions of a laboratory. Another limitation of applying this technique on a large scale is the high cost of producing the substantial amount of bacterial culture and reagents required for soil treatment. Therefore, the purpose of the present study was to investigate soil improvement using the biocementation activity of the poly-extremophile, calcium carbonate crystal-producing bacterial strain Bhargavaea cecembensis N1 in whey as an inexpensive medium. This strain was isolated and molecularly identified from sandy soils in our previous research, and its 16S rRNA gene sequence was deposited in the NCBI GenBank under accession number MK420385. The strain exhibited a high level of urease activity (8.16 U/ml) and produced a large amount of calcium carbonate (4.1 mg/ml). It was able to improve the soil by increasing the compressive strength up to 205 kPa and reducing permeability by 36%, with 20% of the improvement attributable to calcium carbonate production. This was achieved using the strain in a whey culture medium. The strain can be an eco-friendly and economical alternative to conventional methods in soil stabilization and other MICP-related applications.
Keywords: biocementation, Bhargavaea cecembensis, soil improvement, whey culture medium
Procedia PDF Downloads 54
11789 Study on Shape Coefficient of Large Statue Building Based on CFD
Authors: Wang Guangda, Ma Jun, Zhao Caiqi, Pan Rui
Abstract:
Wind load is the main controlling load for large statue structures. Due to their irregular plans and elevations and uneven outer contours, the shape coefficients of statues cannot be picked up from the current load code. The common practice at present is to rely on wind tunnel tests, but this method is time-consuming and costly. In this paper, based on the fundamental theory of CFD and using the fluid dynamics software Fluent 15.0, several large statue structures 40 to 70 m high located in China, including large fairy statues and large Buddha statues, are analyzed by numerical wind tunnel. The results are compared with the values recommended in the load code and with wind tunnel test results, respectively. The results show that shape coefficients obtained by the numerical wind tunnel method are reliable for this kind of building. This provides a useful reference for the wind load values of large statue structures.
Keywords: large statue structure, shape coefficient, irregular structure, wind tunnel test, numerical wind tunnel simulation
Procedia PDF Downloads 375
11788 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis
Authors: Gon Park
Abstract:
Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses of green infrastructure that incorporate land-cover information into the many structural classes. This study maps green infrastructure networks of Seoul to complement these green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, have been identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of the green infrastructure map, applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can be used as the framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
Keywords: cadastral data, green infrastructure, network analysis, parcel data
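A minimal sketch of the ranking step with NetworkX; the toy graph, node names, and the equal weighting of the three centrality metrics are illustrative assumptions, not the study's actual data or scoring rule:

```python
import networkx as nx

# Illustrative toy graph: nodes are candidate green hubs (parcels/parks),
# edges are potential links weighted by distance.
G = nx.Graph()
G.add_weighted_edges_from([
    ("park_a", "park_b", 1.2), ("park_b", "stream_c", 0.8),
    ("stream_c", "park_d", 2.1), ("park_a", "stream_c", 1.5),
    ("park_d", "plaza_e", 0.9),
])

# Rank hubs with the kinds of metrics the study names: density, distance, centrality.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="weight")
closeness = nx.closeness_centrality(G, distance="weight")

for node in G:
    score = degree[node] + betweenness[node] + closeness[node]  # naive combined rank
    print(node, round(score, 3))
```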
Procedia PDF Downloads 206
11787 Vulnerability Assessment for Protection of Ghardaia City to the Inundation of M’zab Wadi
Authors: Mustapha Kamel Mihoubi, Reda Madi
Abstract:
The problem of natural disasters in general, and flooding in particular, is a topic that has left a deep mark around the world, specifically in cities and large urban areas. Torrential floods and fast flows pose a major problem in urban areas. Indeed, better management of flood risks is becoming a growing necessity that must mobilize technical and scientific means to curb the adverse consequences of this phenomenon, especially in Saharan cities with arid climates. The aim of this study is to deploy a basic calculation approach based on hydrologic and hydraulic quantification to locate the black spots generated by flooding in urban areas and to identify the areas vulnerable to flooding. The flooding method is applied to the city of Ghardaia to identify areas vulnerable to inundation and to establish management and prevention maps against flood risks.
Keywords: Alea, Beni Mzab, cartography, HEC-RAS, inundation, torrential, vulnerability, wadi
Procedia PDF Downloads 311
11786 A Study on Accident Result Contribution of Individual Major Variables Using Multi-Body System of Accident Reconstruction Program
Authors: Donghun Jeong, Somyoung Shin, Yeoil Yun
Abstract:
A large-scale traffic accident refers to an accident in which more than three people die or more than thirty people are killed or injured. In order to prevent a large-scale traffic accident from causing a big loss of lives, or to establish effective improvement measures, it is important to analyze accident situations in depth and understand the effects of major accident variables on an accident. This study aims to analyze the contribution of individual accident variables to accident results, based on the accurate reconstruction of traffic accidents using PC-Crash’s Multi-Body, which is an accident reconstruction program, and the simulation of each scenario. The Multi-Body system of the PC-Crash accident reconstruction program is used for multi-body accident reconstruction that shows motions in diverse directions that could not be approached previously; the MB system designs and reproduces a body form that shows realistic motions using several linked bodies. Targeting the ‘freight truck cargo drop accident around the Changwon Tunnel’ that happened in November 2017, this study conducted a simulation of the freight truck cargo drop accident and analyzed the contribution of individual major variables. Then, on the basis of the driving speed, cargo load, and stacking method, six scenarios were devised. The simulation analysis showed that the freight truck was driven at a speed of 118 km/h (speed limit: 70 km/h) right before the accident, carried 196 oil containers with a weight of 7,880 kg (maximum load: 4,600 kg), and was not fully equipped with anchoring equipment that could prevent a drop of cargo. The vehicle speed, cargo load, and cargo anchoring equipment were the major accident variables, and the accident contribution analysis results for the individual variables are as follows. When the freight truck only obeyed the speed limit, the scattering distance of the oil containers decreased by 15%, and the number of dropped oil containers decreased by 39%. When the freight truck only obeyed the cargo load, the scattering distance of the oil containers decreased by 5%, and the number of dropped oil containers decreased by 34%. When the freight truck obeyed both the speed limit and the cargo load, the scattering distance of the oil containers fell by 38%, and the number of dropped oil containers fell by 64%. The analysis of each scenario revealed that the overspeed and excessive cargo load of the freight truck contributed to the dispersion of accident damage; for a truck that did not allow a fall of cargo, there was a different type of accident when driven too fast with an excessive cargo load; and when the freight truck obeyed the speed limit and cargo load, there was the lowest possibility of causing an accident.
Keywords: accident reconstruction, large-scale traffic accident, PC-Crash, MB system
Procedia PDF Downloads 200
11785 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe
Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis
Abstract:
The increasing availability of information about earth surface elevation (Digital Elevation Models, DEMs) generated from different sources (remote sensing, aerial images, Lidar) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Here comes the need to merge large-scale height maps, which are typically made available for free at the worldwide level, with very specific high-resolution datasets. On the other hand, the third dimension increases the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. The open-source 3D virtual globes, which are trending topics in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM
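A minimal sketch of the core merge idea: overlay a high-resolution DEM patch onto a coarse worldwide height map wherever the fine source has valid data. The toy grids, and the assumption that both rasters are already resampled to a common grid, are illustrative; a real builder works tile by tile per TMS zoom level:

```python
import numpy as np

def merge_dems(base, detail, detail_mask):
    """Overlay a high-resolution DEM patch onto a coarse worldwide height map.

    base, detail: 2D arrays resampled onto the same grid; detail_mask marks
    cells where the high-resolution source has valid data.
    """
    merged = base.copy()
    merged[detail_mask] = detail[detail_mask]
    return merged

# Illustrative toy grids (real pipelines resample tiles per TMS zoom level).
base = np.zeros((4, 4))                      # coarse, e.g. a worldwide height map
detail = np.full((4, 4), 5.0)                # fine, e.g. Lidar-derived
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                        # high-resolution coverage footprint
print(merge_dems(base, detail, mask))
```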
Procedia PDF Downloads 426
11784 Real-Time Big-Data Warehouse a Next-Generation Enterprise Data Warehouse and Analysis Framework
Authors: Abbas Raza Ali
Abstract:
Big Data technology is gradually becoming a dire need of large enterprises. These enterprises generate massively large amounts of off-line and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets, and sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large complex streaming data in binary format. The communication industry is bound by regulators to maintain a history of their subscribers’ call records, where every action of a subscriber generates a record. Also, managing and analyzing transactional data allows service providers to better understand their customers’ behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the existing Intelligent Network Mediation and relational Data Warehouse of the service provider onto Big Data. The service provider has a subscriber base of 50+ million with yearly growth of 7-10%. The end-to-end process takes no more than 10 minutes and involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber.
Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation
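A minimal Python sketch of the decode-stitch-aggregate flow described above. The record layout and field names are hypothetical; real call detail records are binary and are decoded to text before this stage:

```python
from collections import defaultdict

# Hypothetical decoded call-detail records: (call_id, subscriber, leg, duration_s).
records = [
    ("c1", "sub42", "setup", 0), ("c1", "sub42", "answer", 180),
    ("c2", "sub42", "setup", 0), ("c2", "sub42", "answer", 60),
    ("c3", "sub07", "answer", 300),
]

# Stitch: group all interrogations belonging to one call into a single event.
calls = defaultdict(list)
for call_id, sub, leg, dur in records:
    calls[call_id].append((sub, leg, dur))

stitched = {cid: (legs[0][0], sum(d for _, _, d in legs)) for cid, legs in calls.items()}

# Aggregate: roll stitched calls up to the subscriber level.
usage = defaultdict(int)
for sub, total in stitched.values():
    usage[sub] += total
print(dict(usage))  # {'sub42': 240, 'sub07': 300}
```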
Procedia PDF Downloads 175
11783 Socio-Economic Effects of Micro-Credit on Small-Scale Poultry Farmers’ Livelihood in Ado Odo-Ota Local Government Area of Ogun State, Nigeria
Authors: E. O. Fakoya, B. G. Abiona, W. O. Oyediran, A. M. Omoare
Abstract:
This study examined the socio-economic effects of micro-credit on small-scale poultry farmers’ livelihood in the Ado Odo-Ota Local Government area of Ogun State. A purposive sampling method was used to select eighty (80) small-scale poultry farmers that benefited from micro-credit. An interview guide was used to obtain information on the respondents’ socio-economic characteristics, their sources of micro-credit, and the effects of micro-credit on their livelihood. The results revealed that most of the respondents (77.50%) were males, while 40.00% of the respondents were between the ages of 31 and 40 years. A high proportion (72.50%) of the respondents had formal education. The major sources of micro-credit for small-scale poultry farmers were cooperative societies (47.50%) and personal savings (20.00%). The findings also revealed that micro-credit had a positive effect on the assets and livelihoods of small-scale poultry farmers. The results of a t-test analysis showed a significant difference between the effects before and after micro-credit on small-scale poultry farmers’ livelihood at p < 0.05. The study recommends that formal lending institutions should be given the necessary support by government to enable poultry farmers to have access to credit facilities in the study area.
Keywords: micro-credit, effects, livelihood, poultry farmers, socio-economic, small scale
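The before/after comparison reported above is a paired-sample t-test. A minimal sketch with SciPy; the livelihood scores below are illustrative, not the study's data:

```python
from scipy import stats

# Hypothetical before/after livelihood scores for the same farmers
# (illustrative numbers, not the study's data).
before = [52, 48, 60, 55, 47, 58, 50, 53]
after = [61, 57, 66, 63, 55, 64, 58, 60]

t_stat, p_value = stats.ttest_rel(before, after)  # paired-sample t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant difference before vs. after micro-credit at p < 0.05")
```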
Procedia PDF Downloads 442
11782 The Challenges of Teaching First Year Accounting with a Lecturer-Student Ratio of 1:1248
Authors: Hanli Joubert
Abstract:
In South Africa, teaching large classes is a reality that lecturers face in most higher education institutions. When discussing the teaching of large groups, the literature normally refers to groups of about 50 to 500 students. At the University of the Free State, the first-year accounting group comprises around 1300 students. Apart from extremely large classes, the problem is exacerbated by the diversity of students’ previous schooling in accounting as well as their socio-economic backgrounds. The university scenario is further complicated by a lack of venues, compressed timetables, and a lack of resources. This study aims to investigate the challenges and effectiveness of teaching a large and diverse group of first-year accounting students by drawing from personal experience, a literature study, and interviews with other lecturers as well as students registered for first-year accounting. The results reveal that teaching first-year accounting students in a large group is not the ideal situation, but that it can be effective if it is managed correctly.
Keywords: diverse backgrounds, large groups, limited resources, first-year accounting students
Procedia PDF Downloads 54
11781 Enhancing the Bionic Eye: A Real-time Image Optimization Framework to Encode Color and Spatial Information Into Retinal Prostheses
Authors: William Huang
Abstract:
Retinal prostheses are currently limited to low-resolution grayscale images that lack color and spatial information. This study develops a novel real-time image optimization framework and tools to encode maximum information into the prostheses, which are constrained by the number of electrodes. One key idea is to localize the main objects in images while reducing unnecessary background noise through region-contrast saliency maps. A novel color depth mapping technique was developed through MiniBatchKMeans clustering and color space selection. The resulting image was downsampled using bicubic interpolation to reduce image size while preserving color quality. In comparison to current schemes, the proposed framework demonstrated better visual quality in tested images. The use of the region-contrast saliency map showed improvements in efficacy of up to 30%. Finally, the computation time of this algorithm is less than 380 ms on tested cases, making real-time retinal prostheses feasible.
Keywords: retinal implants, virtual processing unit, computer vision, saliency maps, color quantization
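A minimal sketch of the color-quantization and downsampling stages with scikit-learn and OpenCV. The LAB color space, cluster count, output size, and file names are illustrative assumptions; the paper's saliency weighting and color-space selection steps are omitted:

```python
import numpy as np
import cv2
from sklearn.cluster import MiniBatchKMeans

def quantize_and_downsample(img_bgr, n_colors=8, out_size=(32, 32)):
    """Color quantization via MiniBatchKMeans plus bicubic downsampling.
    A simplified sketch of two stages of the framework."""
    h, w = img_bgr.shape[:2]
    lab = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB)   # perceptual color space
    km = MiniBatchKMeans(n_clusters=n_colors, n_init=3, random_state=0)
    labels = km.fit_predict(lab.reshape(-1, 3).astype(np.float32))
    quant = km.cluster_centers_[labels].reshape(h, w, 3).astype(np.uint8)
    quant = cv2.cvtColor(quant, cv2.COLOR_LAB2BGR)
    return cv2.resize(quant, out_size, interpolation=cv2.INTER_CUBIC)

img = cv2.imread("scene.jpg")                        # any test image
low_res = quantize_and_downsample(img)
cv2.imwrite("prosthesis_input.png", low_res)
```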
Procedia PDF Downloads 152
11780 Large-scale GWAS Investigating Genetic Contributions to Queerness Will Decrease Stigma Against LGBTQ+ Communities
Authors: Paul J. McKay
Abstract:
Large-scale genome-wide association studies (GWAS) investigating genetic contributions to sexual orientation and gender identity are largely lacking and may reduce stigma experienced in the LGBTQ+ community by providing an underlying biological explanation for queerness. While there is a growing consensus within the scientific community that genetic makeup contributes – at least in part – to sexual orientation and gender identity, there is a marked lack of genomics research exploring polygenic contributions to queerness. Based on recent (2019) findings from a large-scale GWAS investigating the genetic architecture of same-sex sexual behavior, and various additional peer-reviewed publications detailing novel insights into the molecular mechanisms of sexual orientation and gender identity, we hypothesize that sexual orientation and gender identity are complex, multifactorial, and polygenic; meaning that many genetic factors contribute to these phenomena, and environmental factors play a possible role through epigenetic modulation. In recent years, large-scale GWAS studies have been paramount to our modern understanding of many other complex human traits, such as in the case of autism spectrum disorder (ASD). Despite possible benefits of such research, including reduced stigma towards queer people, improved outcomes for LGBTQ+ in familial, socio-cultural, and political contexts, and improved access to healthcare (particularly for trans populations); important risks and considerations remain surrounding this type of research. To mitigate possibilities such as invalidation of the queer identities of existing LGBTQ+ individuals, genetic discrimination, or the possibility of euthanasia of embryos with a genetic predisposition to queerness (through reproductive technologies like IVF and/or gene-editing in utero), we propose a community-engaged research (CER) framework which emphasizes the privacy and confidentiality of research participants. Importantly, the historical legacy of scientific research attempting to pathologize queerness (in particular, falsely equating gender variance to mental illness) must be acknowledged to ensure any future research conducted in this realm does not propagate notions of homophobia, transphobia or stigma against queer people. Ultimately, in a world where same-sex sexual activity is criminalized in 69 UN member states, with 67 of these states imposing imprisonment, 8 imposing public flogging, 6 (Brunei, Iran, Mauritania, Nigeria, Saudi Arabia, Yemen) invoking the death penalty, and another 5 (Afghanistan, Pakistan, Qatar, Somalia, United Arab Emirates) possibly invoking the death penalty, the importance of this research cannot be understated, as finding a biological basis for queerness would directly oppose the harmful rhetoric that “being LGBTQ+ is a choice.” Anti-trans legislation is similarly widespread: In the United States in 2022 alone (as of Oct. 13), 155 anti-trans bills have been introduced preventing trans girls and women from playing on female sports teams, barring trans youth from using bathrooms and locker rooms that align with their gender identity, banning access to gender affirming medical care (e.g., hormone-replacement therapy, gender-affirming surgeries), and imposing legal restrictions on name changes. 
Understanding that a general lack of knowledge about the biological basis of queerness may be a contributing factor to the societal stigma faced by gender and sexual orientation minorities, we propose the initiation of large-scale GWAS studies investigating the genetic basis of gender identity and sexual orientation.
Keywords: genome-wide association studies (GWAS), sexual and gender minorities (SGM), polygenicity, community-engaged research (CER)
Procedia PDF Downloads 69
11779 Production and Distribution Network Planning Optimization: A Case Study of Large Cement Company
Authors: Lokendra Kumar Devangan, Ajay Mishra
Abstract:
This paper describes the implementation of a large-scale SAS/OR model with significant pre-processing, scenario analysis, and post-processing work done using SAS. A large cement manufacturer, with ten geographically distributed manufacturing plants for two variants of cement, around 400 warehouses serving as transshipment points, and several thousand distributor locations generating demand, needed to optimize this multi-echelon, multi-modal transport supply chain separately for planning and allocation purposes. For monthly planning as well as daily allocation, the demand is deterministic. Rail and road networks connect any two points in this supply chain, creating tens of thousands of such connections. Constraints include plant production capacity, transportation capacity, and rail wagon batch sizes. Each demand point has a minimum and a maximum for shipments received. Price varies at demand locations due to local factors. A large mixed integer programming model built using PROC OPTMODEL decides production at plants, demand fulfilled at each location, and the shipment routes to demand locations to maximize the profit contribution. Using base SAS, we did significant pre-processing of the data and created inputs for the optimization. Using outputs generated by OPTMODEL and other processing completed using base SAS, we generated several reports that went into their enterprise system and created tables for easy consumption of the optimization results by operations.
Keywords: production planning, mixed integer optimization, network model, network optimization
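The paper builds its model in SAS PROC OPTMODEL; as an open-source illustration of the same plant-warehouse-demand flow structure, here is a tiny linear program in Python with PuLP. All sizes, costs, and prices are invented, and the integer rail-batch constraints are omitted:

```python
import pulp

# Tiny illustrative instance: 2 plants, 2 warehouses, 3 demand points.
plants, whs, dps = ["P1", "P2"], ["W1", "W2"], ["D1", "D2", "D3"]
cap = {"P1": 100, "P2": 80}
demand = {"D1": 40, "D2": 60, "D3": 50}
c1 = {(p, w): 2.0 for p in plants for w in whs}        # plant -> warehouse cost
c2 = {(w, d): 1.0 for w in whs for d in dps}           # warehouse -> demand cost
price = {d: 10.0 for d in dps}

m = pulp.LpProblem("cement_network", pulp.LpMaximize)
x = pulp.LpVariable.dicts("pw", c1, lowBound=0)
y = pulp.LpVariable.dicts("wd", c2, lowBound=0)

m += (pulp.lpSum(price[d] * y[w, d] for w, d in c2)
      - pulp.lpSum(c1[k] * x[k] for k in c1)
      - pulp.lpSum(c2[k] * y[k] for k in c2))          # profit contribution
for p in plants:
    m += pulp.lpSum(x[p, w] for w in whs) <= cap[p]    # production capacity
for w in whs:
    m += (pulp.lpSum(x[p, w] for p in plants)
          == pulp.lpSum(y[w, d] for d in dps))         # transshipment balance
for d in dps:
    m += pulp.lpSum(y[w, d] for w in whs) <= demand[d] # demand ceiling

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(m.objective))
```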
Procedia PDF Downloads 66
11778 Scale-Up Process for Phyllanthus niruri Enriched Extract by Supercritical Fluid Extraction
Authors: Norsyamimi Hassim, Masturah Markom
Abstract:
Supercritical fluid extraction (SFE) has been known as a sustainable and safe technique for plant extraction due to its minimal usage of organic solvents. In this study, a scale-up process for the selected herbal plant (Phyllanthus niruri) was investigated using supercritical carbon dioxide (SC-CO2) with a food-grade (ethanol-water) cosolvent. The quantification of excess ethanol content in the final dry extracts was conducted to determine the safety of the enriched extracts. The extraction yields obtained by the scale-up SFE unit were not much different from the predicted extraction yields, with an error of 2.92%. For component contents, the scale-up extracts showed quality comparable to the laboratory-scale experiments. The final dry extract showed an excess ethanol content of 1.56% g/g extract. The fish embryo toxicity test (FETT) on zebrafish embryos showed no toxic effects of the extract, with an LD50 value of 505.71 µg/mL. Thus, it has been proven that SFE with a food-grade cosolvent is a safe extraction technique for the production of bioactive compounds from P. niruri.
Keywords: scale-up, supercritical fluid extraction, enriched extract, toxicity, ethanol content
Procedia PDF Downloads 132
11777 Tapered Double Cantilever Beam: Evaluation of the Test Set-up for Self-Healing Polymers
Authors: Eleni Tsangouri, Xander Hillewaere, David Garoz Gómez, Dimitrios Aggelis, Filip Du Prez, Danny Van Hemelrijck
Abstract:
The Tapered Double Cantilever Beam (TDCB) is the most commonly used test set-up to evaluate the self-healing feature of thermoset polymers autonomously activated in the presence of a crack. The TDCB is a modification of the established Double Cantilever Beam fracture mechanics set-up and is designed to provide a constant strain energy release rate with crack length under stable load evolution (mode I). In this study, the damage of virgin and autonomously healed TDCB polymer samples is evaluated considering the load-crack opening diagram, the strain maps provided by the Digital Image Correlation technique, and the fractography maps given by optical microscopy. It is shown that the pre-crack introduced prior to testing (razor blade tapping), the loading rate, and the length of the side groove are the features that dominate the crack propagation and lead to an inconstant fracture energy release rate.
Keywords: polymers, autonomous healing, fracture, tapered double cantilever beam
Procedia PDF Downloads 351
11776 Base Deficit Profiling in Patients with Isolated Blunt Traumatic Brain Injury – Correlation with Severity and Outcomes
Authors: Shahan Waheed, Muhammad Waqas, Asher Feroz
Abstract:
Objectives: To determine the utility of base deficit in traumatic brain injury for assessing severity, and to correlate it with the conventional computed tomography scales for grading the severity of head injury. Methodology: An observational cross-sectional study conducted in a tertiary care facility from 1st January 2010 to 31st December 2012. All patients with isolated traumatic brain injury presenting to the emergency department within 24 hours of injury were included in the study. Initial Glasgow Coma Scale (GCS) and base deficit values were taken at presentation; the patients were followed during their hospital stay; CT brain findings were recorded, graded as per the Rotterdam scale, and cross-checked by a radiologist; and the Glasgow Outcome Scale (GOS) was taken at the last follow-up. Outcomes were dichotomized into favorable and unfavorable. Continuous variables with normal and non-normal distributions are reported as mean ± SD. Categorical variables are presented as frequencies and percentages. The relationship of base deficit with GCS, GOS, CT brain findings, and length of stay was calculated using Spearman’s correlation. Results: 154 patients were enrolled in the study. The mean age of the patients was 30 years, and 137 were males. The severity of the brain injuries as per the GCS was moderate in 34 patients and severe in 109. 34 percent of the total had an unfavorable outcome, with a mean of 18±14. The correlation between GCS on presentation and base deficit was significant at the 0.01 level (p = 0.004). The correlations of the Rotterdam CT brain findings and length of stay with base deficit were not significant. Conclusion: Base deficit was found to be a good predictor of the severity of brain injury. There was no association between the severity of injuries on CT brain as per the Rotterdam scale and base deficit. Further studies with large sample sizes are needed to further evaluate these associations.
Keywords: base deficit, traumatic brain injury, Rotterdam, GCS
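The severity association reported above uses Spearman's rank correlation. A minimal sketch with SciPy; the paired values are illustrative, not the study's data:

```python
from scipy import stats

# Hypothetical paired observations (illustrative, not the study's data):
base_deficit = [2, 4, 6, 8, 5, 10, 3, 7]
gcs = [14, 12, 9, 6, 10, 4, 13, 8]  # lower GCS = more severe injury

rho, p = stats.spearmanr(base_deficit, gcs)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # expect a negative correlation
```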
Procedia PDF Downloads 443
11775 Green Organic Chemistry, a New Paradigm in Pharmaceutical Sciences
Authors: Pesaru Vigneshwar Reddy, Parvathaneni Pavan
Abstract:
Green organic chemistry, the latest and one of the most researched topics nowadays, has been in demand since the 1990s. Organic chemicals are among the important starting materials for a great number of major chemical industries. The production of organic chemicals as raw materials or reagents for other applications is a major sector, covering the manufacture of polymers, pharmaceuticals, pesticides, paints, artificial fibers, food additives, etc. Organic synthesis on a large scale, compared to the laboratory scale, involves the use of energy, basic chemical ingredients from the petrochemical sector, catalysts, and, after the end of the reaction, separation, purification, storage, packing, distribution, etc. During these processes, there are many health and safety problems for workers, in addition to the environmental problems caused by the use of chemicals and their deposition as waste. Green chemistry, with its 12 principles, would like to see changes in the conventional ways that were used for decades to make synthetic organic chemicals, and the use of less toxic starting materials. Green chemistry would like to increase the efficiency of synthetic methods, to use less toxic solvents, to reduce the stages of synthetic routes, and to minimize waste as far as practically possible. In this way, organic synthesis will be part of the effort for sustainable development. Green chemistry is also interested in research and innovative alternatives for many practical aspects of organic synthesis in university and research laboratories. By changing the methodologies of organic synthesis, health and safety will be advanced at the small-scale laboratory level, and this will also be extended to large-scale industrial production processes through new techniques. The three key developments in green chemistry include the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide as an oxidising agent, and the use of hydrogen in asymmetric synthesis. Green chemistry also focuses on replacing traditional methods of heating with modern methods such as microwave heating, so that the carbon footprint is reduced as far as possible. Another benefit of green chemistry is that it will reduce environmental pollution through the use of less toxic reagents, the minimization of waste, and more biodegradable byproducts. In the present paper, some of the basic principles, approaches, and early achievements of green chemistry are considered, with a summarization of the green chemistry principles. A discussion of the E-factor, the old and new syntheses of ibuprofen, microwave techniques, and some recent advancements is also included.
Keywords: energy, e-factor, carbon footprint, micro-wave, sono-chemistry, advancement
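The E-factor mentioned above is Sheldon's waste metric: kilograms of waste per kilogram of product. A minimal sketch with illustrative numbers:

```python
def e_factor(total_input_kg, product_kg):
    """E-factor = kg of waste per kg of product (Sheldon's metric)."""
    waste_kg = total_input_kg - product_kg
    return waste_kg / product_kg

# Illustrative numbers only: a process consuming 120 kg of inputs
# to make 20 kg of product.
print(f"E-factor: {e_factor(120, 20):.1f}")  # 5.0 kg waste / kg product
```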
Procedia PDF Downloads 306
11774 Multiple Fusion Based Single Image Dehazing
Authors: Joe Amalraj, M. Arunkumar
Abstract:
Haze is an atmospheric phenomenon that significantly degrades the visibility of outdoor scenes, mainly due to atmospheric particles that absorb and scatter the light. This paper introduces a novel single-image approach that enhances the visibility of such degraded images. The method is a fusion-based strategy that derives two inputs from the original hazy image by applying a white balance and a contrast-enhancing procedure. To blend the information of the derived inputs effectively and preserve the regions with good visibility, we filter their important features by computing three measures (weight maps): luminance, chromaticity, and saliency. To minimize artifacts introduced by the weight maps, our approach is designed in a multiscale fashion, using a Laplacian pyramid representation. This paper demonstrates the utility and effectiveness of a fusion-based technique for dehazing based on a single degraded image. The method performs in a per-pixel fashion, which is straightforward to implement. The experimental results demonstrate that the method yields results comparable to and even better than the more complex state-of-the-art techniques, with the advantage of being appropriate for real-time applications.
Keywords: single image de-hazing, outdoor images, enhancing, DSP
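A simplified single-scale sketch of the fusion idea with OpenCV and NumPy: derive the two inputs, weight them, and blend. The gray-world white balance, global contrast stretch, and luminance-only weight map are stand-ins for the paper's procedures, which also use chromaticity and saliency weights and blend per level of a Laplacian pyramid:

```python
import cv2
import numpy as np

def gray_world(img):
    """Simple gray-world white balance (stand-in for the paper's procedure)."""
    f = img.astype(np.float32)
    f *= f.mean() / (f.reshape(-1, 3).mean(axis=0) + 1e-6)
    return np.clip(f, 0, 255).astype(np.uint8)

def contrast_enhance(img):
    """Global contrast stretch (stand-in for the paper's procedure)."""
    f = img.astype(np.float32)
    f = (f - f.min()) / (f.max() - f.min() + 1e-6) * 255
    return f.astype(np.uint8)

def luminance_weight(img):
    return cv2.cvtColor(img, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255 + 1e-3

hazy = cv2.imread("hazy.jpg")
inputs = [gray_world(hazy), contrast_enhance(hazy)]

# Single-scale weighted blend; the paper does this per level of a
# Laplacian pyramid to avoid halo artifacts.
weights = [luminance_weight(i) for i in inputs]
norm = sum(weights)
out = sum(i.astype(np.float32) * (w / norm)[..., None]
          for i, w in zip(inputs, weights))
cv2.imwrite("dehazed.png", out.astype(np.uint8))
```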
Procedia PDF Downloads 410
11773 Reducing the Risk of Alcohol Relapse after Liver-Transplantation
Authors: Rebeca V. Tholen, Elaine Bundy
Abstract:
Background: Liver transplantation (LT) is considered the only curative treatment for end-stage liver disease (ESLD). The effects of alcoholism can cause irreversible liver damage, cirrhosis, and subsequent liver failure. Alcohol relapse after transplant occurs in 20-50% of patients and increases the risk for recurrent cirrhosis, organ rejection, and graft failure. Alcohol relapse after transplant has been identified as a problem among liver transplant recipients at a large urban academic transplant center in the United States. Transplantation will reverse the complications of ESLD, but it does not treat underlying alcoholism or reduce the risk of relapse after transplant. The purpose of this quality improvement project is to implement and evaluate the effectiveness of a High-Risk Alcoholism Relapse (HRAR) Scale to screen and identify patients at high risk for alcohol relapse after receiving an LT. Methods: The HRAR Scale is a predictive tool designed to determine the severity of alcoholism and the risk of relapse after transplant. The scale consists of three variables identified as having the highest predictive power for early relapse: the daily number of drinks, history of previous inpatient treatment for alcoholism, and the number of years of heavy drinking. All adult liver transplant recipients at a large urban transplant center were screened with the HRAR Scale prior to hospital discharge. A zero-to-two ordinal score is ranked for each variable, and the total score ranges from zero to six. High-risk scores are between three and six. Results: Descriptive statistics revealed that 25 patients were newly transplanted and discharged from the hospital during an 8-week period. 40% of patients (n=10) were identified as being at high risk for relapse and 60% (n=15) at low risk. The daily number of drinks was determined by alcohol content (1 drink = 15 g of ethanol) and the number of drinks per day. 60% of patients reported drinking 9-17 drinks per day, and 40% reported ≤ 9 drinks. 50% of high-risk patients reported drinking for ≥ 25 years, 40% for 11-25 years, and 10% for ≤ 11 years. For the number of inpatient treatments for alcoholism, 50% received inpatient treatment one time, 20% ≥ 1, and 30% reported never receiving inpatient treatment. The findings reveal the importance and value of a validated screening tool as a more efficient method than other screening methods alone. Integration of a structured clinical tool will help guide the drinking-history portion of the psychosocial assessment. Targeted interventions can be implemented for all high-risk patients. Conclusions: Our findings validate the effectiveness of utilizing the HRAR Scale to screen and identify patients who are at high risk for alcohol relapse post-LT. Recommendations to help maintain post-transplant sobriety include starting a transplant support group within the organization for all high-risk patients.
Keywords: alcoholism, liver transplant, quality improvement, substance abuse
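A minimal sketch of HRAR-style scoring in Python. The cut-points follow the ranges quoted in the abstract (9-17 drinks/day, 11-25 years, one inpatient treatment) but are simplified assumptions where the abstract is not explicit; the published HRAR instrument defines its own exact bands:

```python
def hrar_score(drinks_per_day, years_heavy_drinking, inpatient_treatments):
    """Score the three HRAR variables 0-2 each; total 0-6, high risk = 3-6.
    Cut-points are simplified assumptions based on the abstract's ranges."""
    s = 0
    s += 0 if drinks_per_day < 9 else (1 if drinks_per_day <= 17 else 2)
    s += 0 if years_heavy_drinking < 11 else (1 if years_heavy_drinking <= 25 else 2)
    s += 0 if inpatient_treatments == 0 else (1 if inpatient_treatments == 1 else 2)
    return s

score = hrar_score(drinks_per_day=12, years_heavy_drinking=20, inpatient_treatments=1)
print(score, "high risk" if score >= 3 else "low risk")
```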
Procedia PDF Downloads 116
11772 Verification & Validation of Map Reduce Program Model for Parallel K-Mediod Algorithm on Hadoop Cluster
Authors: Trapti Sharma, Devesh Kumar Srivastava
Abstract:
This paper is an analysis study of the above MapReduce implementation, undertaken to verify and validate the MapReduce solution model for the parallel K-Medoid algorithm on a Hadoop cluster. MapReduce is a programming model which enables the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to constant or moderately changing sets of data. MapReduce has slowly become the framework of choice for “big data”. The MapReduce model allows systematic and rapid organization of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications, including wordcount, grep, terasort, and the parallel K-Medoid clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
Keywords: hadoop, mapreduce, k-medoid, validation, verification
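A minimal plain-Python sketch of one MapReduce round of the parallel K-Medoid algorithm: the map phase assigns each point to its nearest medoid, the shuffle groups points by medoid, and the reduce phase recomputes each cluster's medoid. The data, medoid seeds, and number of rounds are illustrative:

```python
from collections import defaultdict

def mapper(point, medoids):
    """Map: emit (nearest-medoid-id, point)."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    mid = min(range(len(medoids)), key=lambda i: dist(point, medoids[i]))
    return mid, point

def reducer(mid, points):
    """Reduce: the new medoid is the member minimizing total distance to its cluster."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return min(points, key=lambda c: sum(dist(c, p) for p in points))

data = [(1, 1), (2, 1), (9, 8), (8, 9), (1, 2)]
medoids = [(1, 1), (9, 8)]
for _ in range(3):                           # a few map-reduce rounds
    groups = defaultdict(list)
    for p in data:
        mid, point = mapper(p, medoids)
        groups[mid].append(point)            # shuffle/sort by key
    medoids = [reducer(mid, pts) for mid, pts in sorted(groups.items())]
print(medoids)
```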
Procedia PDF Downloads 369
11771 Development and Evaluation of a Psychological Adjustment and Adaptation Status Scale for Breast Cancer Survivors
Authors: Jing Chen, Jun-E Liu, Peng Yue
Abstract:
Objective: The objective of this study was to develop a psychological adjustment and adaptation status scale for breast cancer survivors, and to examine the reliability and validity of the scale. Method: 37 breast cancer survivors were recruited for the qualitative research; a five-dimension theoretical framework and an item pool of 150 items for the scale were derived from the interview data. In order to evaluate and select items and establish preliminary validity and reliability for the original scale, suggestions from study group members, experts, and breast cancer survivors were taken, and statistical methods were applied step by step in a sample of 457 breast cancer survivors. Results: An original 24-item scale was developed. The five dimensions, “domestic affections”, “interpersonal relationship”, “attitude of life”, “health awareness”, and “self-control/self-efficacy”, explained 58.053% of the total variance. Content validity was assessed by experts; the CVI was 0.92. Construct validity was examined in a sample of 264 breast cancer survivors. The fit indexes of the confirmatory factor analysis (CFA) showed a good fit of the five-dimension model. The criterion-related validity of the total scale with the PTGI was satisfactory (r=0.564, p<0.001). The internal consistency reliability and test-retest reliability were examined: Cronbach’s alpha (0.911) showed good internal consistency, and the intraclass correlation coefficient (ICC=0.925, p<0.001) showed satisfactory test-retest reliability. Conclusions: The scale is brief and easy to understand, and is suitable for breast cancer patients whose physical strength and energy are limited.
Keywords: breast cancer survivors, rehabilitation, psychological adaption and adjustment, development of scale
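Cronbach's alpha, used above for internal consistency, is computed as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal NumPy sketch; the response matrix is illustrative, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Illustrative responses from 6 respondents on 4 items (not the study's data).
resp = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
        [2, 3, 2, 3], [4, 4, 5, 4], [3, 4, 3, 3]]
print(f"alpha = {cronbach_alpha(resp):.3f}")
```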
Procedia PDF Downloads 513
11770 Comparison of FASTMAP and B0 Field Map Shimming for 4T MRI
Authors: Mohan L. Jayatiake, Judd Storrs, Jing-Huei Lee
Abstract:
Optimal MRI resolution relies on a homogeneous magnetic field. However, local susceptibility variations can lead to field inhomogeneities that cause artifacts such as image distortion and signal loss. The effects of local susceptibility variation notoriously increase with magnetic field strength. Active shimming improves homogeneity by applying corrective fields generated from shim coils, but it requires calculation of the optimal current for each shim coil. FASTMAP (fast automatic shimming technique by mapping along projections) is an effective technique for finding optimal currents that works well at high field, but it is restricted to shimming spherical regions of interest. The 3D gradient-echo pulse sequence was modified to reduce sensitivity to eddy currents and used to obtain susceptibility field maps at 4T. Measured fields were projected onto first- and second-order spherical harmonic functions corresponding to the shim hardware. A spherical phantom was used to calibrate the shim currents. Susceptibility maps of a volunteer’s brain with and without FASTMAP shimming were obtained. Simulations indicate that optimal shim currents derived from the field map may provide better overall shimming of the human brain.
Keywords: shimming, high-field, active, passive
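The projection step described above is a least-squares fit of the measured field map to first- and second-order spherical-harmonic shim terms. A minimal NumPy sketch; the basis normalization, the synthetic field, and the mapping from coefficients to coil currents are illustrative assumptions:

```python
import numpy as np

def shim_fit(coords, field):
    """Least-squares projection of a measured B0 field map onto first- and
    second-order spherical-harmonic shim terms (real Cartesian forms)."""
    x, y, z = coords.T
    basis = np.column_stack([
        np.ones_like(x),            # B0 offset
        x, y, z,                    # first order
        z**2 - (x**2 + y**2) / 2,   # second order
        x * z, y * z,
        x**2 - y**2, x * y,
    ])
    coeffs, *_ = np.linalg.lstsq(basis, field, rcond=None)
    return coeffs  # mapped to shim-coil currents via a calibration matrix

# Illustrative synthetic field map over random voxel positions.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, size=(500, 3))
b0 = 3.0 * pts[:, 0] + 1.5 * pts[:, 0] * pts[:, 2] + rng.normal(0, 0.1, 500)
print(np.round(shim_fit(pts, b0), 2))
```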
Procedia PDF Downloads 509