Search results for: generating subsystem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1113


663 Challenge of Baseline Hydrology Estimation at Large-Scale Watersheds

Authors: Can Liu, Graham Markowitz, John Balay, Ben Pratt

Abstract:

Baseline or natural hydrology is commonly employed for hydrologic modeling and quantification of hydrologic alteration due to manmade activities. It can inform planning and policy-related efforts by various state and federal water resource agencies to restore natural streamflow regimes. A common challenge faced by hydrologists is how to replicate unaltered streamflow conditions, particularly in large watershed settings prone to development and regulation. Three different methods were employed to estimate baseline streamflow conditions for 6 major subbasins of the Susquehanna River Basin: 1) incorporation of consumptive water use and reservoir operations back into regulated gaged records; 2) use of a map correlation method and flow duration (exceedance probability) regression equations; 3) extension of the pre-regulation streamflow records based on the relationship between concurrent streamflows at unregulated and regulated gage locations. Parallel analyses were performed among the three methods, and the limitations associated with each are presented. Results from these analyses indicate that generating baseline streamflow records at large-scale watersheds remains challenging, even with long-term continuous stream gage records available.
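As a rough illustration of the record-extension idea in method 3, the sketch below fits a log-space regression between concurrent flows at an unregulated and a regulated gage and uses it to predict baseline flows at the regulated site; the gage values are invented toy numbers, and the authors' actual procedure and calibrated relationships may differ.

```python
# Hedged sketch of the record-extension idea (method 3): fit a log-space
# regression between concurrent flows at an unregulated and a regulated gage,
# then predict "baseline" flows at the regulated site. Gage values are toy
# numbers, not data from the Susquehanna study.
import numpy as np

def extend_record(unreg_concurrent, reg_concurrent, unreg_later):
    """Fit log10(Q_reg) = a + b*log10(Q_unreg) on the concurrent period and
    apply it to later flows observed at the unregulated gage."""
    b, a = np.polyfit(np.log10(unreg_concurrent), np.log10(reg_concurrent), 1)
    return 10.0 ** (a + b * np.log10(unreg_later))

unreg_concurrent = np.array([120.0, 340.0, 95.0, 410.0, 260.0])     # cfs, toy
reg_pre_regulation = np.array([480.0, 1300.0, 390.0, 1550.0, 1000.0])
print(extend_record(unreg_concurrent, reg_pre_regulation, np.array([200.0, 500.0])))
```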

Keywords: baseline hydrology, streamflow gage, subbasin, regression

Procedia PDF Downloads 324
662 Analog Input Output Buffer Information Specification Modelling Techniques for Single Ended Inter-Integrated Circuit and Differential Low Voltage Differential Signaling I/O Interfaces

Authors: Monika Rawat, Rahul Kumar

Abstract:

Input/Output Buffer Information Specification (IBIS) models are used for describing the analog behavior of the input/output (I/O) buffers of a digital device. They are widely used to perform signal integrity analysis. Advantages of using IBIS models include simple structure, IP protection and fast simulation time with reasonable accuracy. As the design complexity of drivers and receivers increases, capturing exact behavior from the transistor-level model into the IBIS model becomes an essential task to achieve better accuracy. In this paper, an improvement in the existing methodology of generating IBIS models for complex I/O interfaces such as Inter-Integrated Circuit (I2C) and Low Voltage Differential Signaling (LVDS) is proposed. Furthermore, the accuracy and computational performance of the standard method and the proposed approach with respect to SPICE are presented. The investigations will be useful to further improve the accuracy of IBIS models and to enhance their wider acceptance.

Keywords: IBIS, signal integrity, open-drain buffer, low voltage differential signaling, behavior modelling, transient simulation

Procedia PDF Downloads 196
661 The Conservatoire Crisis: An Exploration into the Lived Experiences of Conservatoire Graduates

Authors: Scott Caizley

Abstract:

Widening participation amongst state-schooled and Black and Minority Ethnic (BME) students in UK conservatoires has remained at an all-time low throughout the past years despite major efforts to increase access for those from underrepresented backgrounds. In the academic year 2017/18, two of the UK’s leading music conservatoires recruited fewer state school students than Oxbridge. Whilst conservatoires face further public stigmatisation and heavy financial penalties for failing to meet government benchmarks, there appears to be a more costly outcome to this crisis. This, of course, is the lack of sociocultural diversity, which is perpetuated both within the conservatoire sector and the classical music industry. This research investigates the lived experiences of former state-schooled students who attended a UK music conservatoire. Given the participants’ underrepresented status, the research seeks to answer whether or not the students are fitting in or standing out within the conservatoire environment. The research will explore the findings through a Bourdieusian contextual framework in the hope of generating a wealth of new practices for the field of Higher Music Education. Illuminating the underrepresented voices within these elite spaces could aid future research and policy to help tackle the diversity dilemma and give classical music the social and cultural renewal it so desperately needs.

Keywords: classical music, lived experiences, higher music education, Bourdieusian

Procedia PDF Downloads 133
660 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases

Authors: Daniel C. Bonzo

Abstract:

Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think about extrapolation of data as extending information and conclusions from one estimand to another estimand. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is being accepted increasingly by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under an extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equally weighted and with weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
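A minimal sketch of the cluster-weighting idea, assuming simulated hierarchical data with unequal cluster sizes; it contrasts equal cluster weights with size-proportional weights for a simple point estimate and stands in for, rather than reproduces, the mixed-model estimators evaluated in the paper.

```python
# Hedged sketch: cluster-level point estimation with two weighting schemes
# (equal weights vs. weights proportional to cluster size) on simulated
# hierarchical data; a simplified stand-in for the mixed-model estimators
# discussed in the abstract, not the authors' method.
import numpy as np

rng = np.random.default_rng(1)
# 8 subjects ("clusters") with 1-6 episodes each: subject effect + noise
clusters = [rng.normal(loc=2.0 + rng.normal(0.0, 0.5), scale=1.0,
                       size=int(rng.integers(1, 7))) for _ in range(8)]

cluster_means = np.array([c.mean() for c in clusters])
sizes = np.array([len(c) for c in clusters])

est_equal = cluster_means.mean()                        # equal cluster weights
est_sized = np.average(cluster_means, weights=sizes)    # weights proportional to size
print(f"equal-weight estimate: {est_equal:.3f}, size-weighted: {est_sized:.3f}")
```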

Keywords: clustered data, estimand, extrapolation, mixed model

Procedia PDF Downloads 136
659 The Impact of the General Data Protection Regulation on Human Resources Management in Schools

Authors: Alexandra Aslanidou

Abstract:

The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and on the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were being treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes in both the public sector and business. A social practice that is considerably influenced in its day-to-day operations is Human Resource (HR) management, for which the importance of the GDPR cannot be underestimated. That is because HR processes personal data coming in all shapes and sizes from many different systems and sources. The proper functioning of an HR department, specifically in human-centered, service-oriented environments such as the education field, is decisive, because HR operations in schools, conducted effectively, determine the quality of the provided services and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments that operate in schools; in order to practically evaluate the aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis in five highly dynamic schools, across three EU Member States, was attempted.

Keywords: general data protection regulation, human resource management, educational system

Procedia PDF Downloads 100
658 Exploring the Dualistic Nature of Design: Integrative Perspectives and Methodological Approaches in Design Research

Authors: Joni Agung Sudarmanto

Abstract:

The concept of design has historically been elusive and characterized by its fluidity, leading to divergent viewpoints on its fundamental nature. Guy Julier views design as inherent in material culture, while Sanders sees it as a collective endeavor focusing on the outcome. Design's dualistic nature, procedural and outcome-oriented, spans various domains, including objects, individuals, and the environment. This comprehensive view of design challenges the notion that design practice is distinct from research, highlighting their shared exploratory nature. The article explores methodological techniques in design research and the three prevalent approaches: "into design," "through design," and "for design." The contradictory meanings of design arise from its etymology and its duality as both process and result, leading to its integrative nature across objects, humans, and the environment. The parallels between design and research activities, underscoring their exploratory and knowledge-generating nature, are situated within creative research, challenging the perception of design practice as separate from research endeavors. The "into design" approach encourages interdisciplinary collaboration, enriching design research with diverse perspectives. The "through design" approach bridges theory and practice, producing more practical outcomes. The "for design" approach supports specific design solutions, providing designers with valuable guidance.

Keywords: dualistic nature of design, integrative perspectives, methodological approaches, design research

Procedia PDF Downloads 71
657 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage

Authors: P. Jayashree, S. Rajkumar

Abstract:

With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction comes as a handy solution to meet both requirements. Most data compression techniques are based on data statistics and may result in either lossy or lossless data reduction. Though lossy reduction produces better compression ratios compared to lossless methods, many applications require data accuracy and miniature details to be preserved. A variety of data compression algorithms exists in the literature for different forms of data like text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is projected as an enhancement over the irrational number storage coding technique to cater to the storage issues of increasing data volumes as a cost-effective solution, which also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of better compression ratio with no deterioration in compression time.

Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding

Procedia PDF Downloads 294
656 Runoff Estimation Using NRCS-CN Method

Authors: E. K. Naseela, B. M. Dodamani, Chaithra Chandran

Abstract:

GIS and remote sensing techniques facilitate accurate estimation of surface runoff from a watershed. In the present study, an attempt has been made to evaluate the applicability of the Natural Resources Conservation Service (NRCS) Curve Number method using GIS and remote sensing techniques in the upper Krishna basin (69,425 sq. km). Landsat 7 satellite data (30 m resolution) for the year 2012 have been used for the preparation of the land use land cover (LU/LC) map. The hydrologic soil group is mapped using the GIS platform. The weighted curve numbers (CN) for all 5 subcatchments were calculated on the basis of LU/LC type and hydrologic soil class in the area, considering antecedent moisture condition. Monthly rainfall data were available for 58 rain gauge stations. An overlay technique is adopted for generating the weighted curve number. Results of the study show that land use changes determined from satellite images are useful in studying the runoff response of the basin. The results showed that there is no significant difference between observed and estimated runoff depths. For each subcatchment, statistically positive correlations were detected between observed and estimated runoff depth (0.6

Keywords: curve number, GIS, remote sensing, runoff
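For reference, below is a minimal sketch of the NRCS (SCS) curve number runoff equation the method above rests on, in SI units; the CN value and storm depth are illustrative assumptions, not values from the study, and the study's antecedent moisture adjustment and area-weighting of CN are not reproduced.

```python
# Hedged sketch of the NRCS (SCS) curve number runoff equation, SI units (mm).
# The CN and storm depth below are illustrative assumptions, not study values,
# and the study's AMC adjustment and area-weighting of CN are not reproduced.
def scs_runoff_mm(p_mm: float, cn: float) -> float:
    """Direct runoff depth Q (mm) from rainfall P (mm) for a curve number CN."""
    s = 25400.0 / cn - 254.0      # potential maximum retention S (mm)
    ia = 0.2 * s                  # initial abstraction (common default ratio)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# example: an 80 mm storm on a subcatchment with weighted CN = 75
print(round(scs_runoff_mm(80.0, 75.0), 1), "mm of direct runoff")
```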

Procedia PDF Downloads 539
655 Absent Theaters: A Virtual Reconstruction from Memories

Authors: P. Castillo Muñoz, A. Lara Ramírez

Abstract:

Absent Theaters is a project that virtually reconstructs three theaters that existed in the twentieth century and were demolished in the city of Medellin, Colombia: Circo España, Bolívar, and Junín. Virtual reconstruction is used as a pretext to talk with those who, in their childhood and youth, lived the cultural spaces that formed a whole generation. Around 100 people who witnessed these theaters were interviewed. The means used to perform the oral history work was the virtual reconstruction of the interior of the theaters, presented to the interviewees through Virtual Reality glasses. The voices of people between 60 and 103 years old were used to transmit knowledge to the new generations about the importance of theaters as essential places for the city, as spaces generating social relations and knowledge of other cultures. Oral stories about events and the historical and social context of the city were mixed with archive images and animations of the architectural transformations of these places, with the purpose of compiling a collective discourse around cultural activities, heritage, and the memory of Medellin.

Keywords: culture, heritage, oral history, theaters, virtual reality

Procedia PDF Downloads 133
654 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment

Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg

Abstract:

Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter the choices made early on in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to leverage conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, this framework improves the collaboration between the architectural stage and the structural stage. It will be shown that this SDO framework can make this achievable by generating the structural model based on the extracted data from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.
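To make the GA step concrete, here is a minimal, generic genetic algorithm over a toy beam-section variable, showing the selection/crossover/mutation loop listed in the keywords; the design variables, bounds and constraint are invented for illustration, and this is not the authors' BIM-integrated SDO framework.

```python
# Hedged sketch: a generic genetic algorithm over a toy beam section (width b,
# depth h in mm), minimising area subject to a section-modulus constraint.
# Variables, bounds and the constraint are invented for illustration; this is
# not the authors' BIM-integrated SDO framework.
import random

def fitness(ind):
    b, h = ind
    area = b * h                                   # proxy for material use
    feasible = b * h ** 2 / 6.0 >= 2.0e6           # required modulus (mm^3), assumed
    return -area if feasible else -area - 1e7      # penalise infeasible designs

def ga(pop_size=30, generations=60):
    pop = [(random.uniform(50, 400), random.uniform(100, 800)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                                   # selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = (0.5 * (p1[0] + p2[0]), 0.5 * (p1[1] + p2[1]))       # crossover
            children.append(tuple(g * random.uniform(0.9, 1.1) for g in child))  # mutation
        pop = parents + children
    return max(pop, key=fitness)

print("best (b, h):", ga())
```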

Keywords: building information modelling, BIM, genetic algorithm, GA, architecture-engineering-construction, AEC, optimisation, structure, design, population, generation, selection, mutation, crossover, offspring

Procedia PDF Downloads 240
653 Overcoming the Obstacles to Green Campus Implementation in Indonesia

Authors: Mia Wimala, Emma Akmalah, Ira Irawati, M. Rangga Sururi

Abstract:

One way that has been aggressively pursued in creating a sustainable environment nowadays is the implementation of the green building concept. In order to ensure the success of its implementation, the support and initiation of educational institutions, especially higher education institutions, are indispensable. This research was conducted to identify the obstacles restraining the success of green campus implementation in Indonesia, as well as to propose strategies to overcome those obstacles. The data presented in this paper are mainly derived from interviews and questionnaires distributed randomly to staff and students in 10 (ten) major institutions around the Jakarta and West Java area. The data were further analyzed using ANOVA and SWOT analysis. According to 182 respondents, it is found that resistance to change; inadequate knowledge, information and understanding; no penalty for environmental violations; lack of reward for green campus practices; lack of stringent regulations/laws; lack of management commitment; and insufficient funds are the obstacles to the green campus movement in Indonesia. In addition, out of the 6 criteria considered in the UI GreenMetric World Ranking, education was the only criterion that showed no significant difference between public and private universities in green campus performance. The work concludes with recommendations of strategies to improve the implementation of green campus in the future.

Keywords: green campus, obstacles, sustainable, higher education institutions

Procedia PDF Downloads 224
652 Non-Parametric Changepoint Approximation for Road Devices

Authors: Loïc Warscotte, Jehan Boreux

Abstract:

The scientific literature on changepoint detection is vast. Today, a lot of methods are available to detect abrupt changes or slight drift in a signal, based on CUSUM or EWMA charts, for example. However, these methods rely on strong assumptions, such as the stationarity of the underlying stochastic process, or even independent and Gaussian distributed noise at each time. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes with almost no assumptions on the signals and the nature of the changepoint. Despite the accurate description of the mathematical aspects, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection if the characteristics of the process are completely unknown. In this paper, we address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes. Both abrupt changes and drift are tested. Finally, this relaxed methodology is tested on real signals coming from an HS-WIM device in Belgium, collected over several months.
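For orientation, a textbook one-sided CUSUM detector for an upward mean shift is sketched below, shown only as the classical baseline the abstract contrasts with; the slack and threshold values are common defaults for standardized data, and the paper's non-parametric, locally stationary approximations are not reproduced here.

```python
# Hedged sketch: a textbook one-sided CUSUM for an upward mean shift, shown as
# the classical baseline the abstract contrasts with; slack k and threshold h
# are common defaults for standardized data. The paper's non-parametric,
# locally stationary approximations are not reproduced here.
import numpy as np

def cusum_upper(x, target_mean=0.0, k=0.5, h=5.0):
    """Return the index of the first alarm, or None if no alarm is raised."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target_mean) - k)   # slack k absorbs small noise
        if s > h:
            return i
    return None

rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0.0, 1.0, 200),      # in-control segment
                         rng.normal(1.5, 1.0, 100)])     # mean shift at t = 200
print("first alarm at index:", cusum_upper(signal))
```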

Keywords: changepoint, weigh-in-motion, process, non-parametric

Procedia PDF Downloads 78
651 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians’ prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data are so big and complex that they cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data are accumulated, and the different varieties, such as the structured, semi-structured and unstructured nature of the data. Despite the complexity of big data, if the trends and patterns that exist within the big data are uncovered and analyzed, higher quality healthcare at lower cost can be provided. Hadoop is an open source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
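A minimal sketch of the MapReduce pattern described above, simulated in memory on a few invented EPR-style records (counting records per diagnosis code); on a real Hadoop cluster the same map/reduce pair would typically run via Hadoop Streaming or be rewritten in Java, and the record format here is an assumption.

```python
# Hedged sketch: the MapReduce pattern simulated in memory (no cluster) by
# counting records per diagnosis code in invented EPR-style lines. On a real
# Hadoop cluster the same map/reduce pair would typically run via Hadoop
# Streaming or be rewritten in Java; the record format here is an assumption.
from collections import defaultdict

records = [
    "p001,2015-03-02,E11",   # patient id, visit date, diagnosis code (toy data)
    "p002,2015-03-02,I10",
    "p003,2015-03-05,E11",
    "p001,2015-04-11,I10",
]

def mapper(line):
    patient, date, code = line.split(",")
    yield code, 1                         # emit (key, value) pairs

def reducer(key, values):
    return key, sum(values)               # aggregate all values for one key

grouped = defaultdict(list)               # stand-in for the shuffle/sort phase
for line in records:
    for key, value in mapper(line):
        grouped[key].append(value)

print([reducer(k, v) for k, v in grouped.items()])   # [('E11', 2), ('I10', 2)]
```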

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 413
650 Application of GIS-Based Construction Engineering: An Electronic Document Management System

Authors: Mansour N. Jadid

Abstract:

This paper describes the implementation of a GIS to provide decision support for successfully monitoring the movements and storage of materials, hence ensuring that finished products travel from the point of origin to the destination construction site through the supply-chain management (SCM) system. This system ensures the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination to reduce construction costs, minimize time, and enhance productivity. These systems are essential to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, which maps the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, this study illustrates the use of a GIS-based SCM as a collaborative tool in innovative methods for implementing Web mapping services, as well as aspects of their integration by generating an interactive GIS for the construction industry platform.
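The "nearest supplier by shortest path" step can be pictured with Dijkstra's algorithm on a toy road graph, as in the sketch below; the node names and edge weights are invented, and a production GIS-based SCM would query real network data rather than a hard-coded graph.

```python
# Hedged sketch: the shortest-path step behind "locating the nearest supplier",
# using Dijkstra's algorithm on a toy road graph. Node names and distances are
# invented; a production GIS-based SCM would query real network data instead.
import heapq

def dijkstra(graph, source):
    """Shortest travel cost from source to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                       # stale heap entry
        for neighbour, w in graph[node]:
            nd = d + w
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist

roads = {                                  # construction site + three suppliers, km
    "site": [("A", 12.0), ("B", 7.0)],
    "A": [("site", 12.0), ("C", 4.0)],
    "B": [("site", 7.0), ("C", 9.0)],
    "C": [("A", 4.0), ("B", 9.0)],
}
dist = dijkstra(roads, "site")
print("nearest supplier:", min(["A", "B", "C"], key=dist.get), dist)
```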

Keywords: construction, coordinate, engineering, GIS, management, map

Procedia PDF Downloads 303
649 An Automatic Generating Unified Modelling Language Use Case Diagram and Test Cases Based on Classification Tree Method

Authors: Wassana Naiyapo, Atichat Sangtong

Abstract:

The processes in software development using Object Oriented methodology have many stages that take time and incur high cost. An undetected error in the system analysis process will affect the design and implementation processes. Unexpected output forces revision of the previous process; the more each process is rolled back, the more expense and delay are incurred. Therefore, with a good test process from the early phases, the implemented software is efficient, reliable and also meets the user’s requirements. Unified Modelling Language (UML) is the tool which uses symbols to describe the work process in Object Oriented Analysis (OOA). This paper presents an approach for automatically generating a UML use case diagram and test cases. The UML use case diagram is generated from the event table, and test cases are generated from use case specifications and Graphical User Interfaces (GUI). Test cases are derived from the Classification Tree Method (CTM), which classifies data into nodes of a hierarchical structure. Moreover, this paper describes the program that generates the use case diagram and test cases. As a result, it can reduce working time and increase work efficiency.
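In the spirit of the Classification Tree Method mentioned above, the sketch below derives test cases as combinations of leaf classes for an assumed toy "login" use case; the classifications and classes are illustrative and not taken from the paper, which derives them from use case specifications and GUIs.

```python
# Hedged sketch in the spirit of the Classification Tree Method: test cases as
# combinations of leaf classes. The "login" classifications below are an
# assumed toy example, not taken from the paper's use case specifications.
from itertools import product

classification_tree = {
    "username": ["valid", "unknown", "empty"],
    "password": ["correct", "wrong", "empty"],
    "remember_me": ["checked", "unchecked"],
}

def generate_test_cases(tree):
    names = list(tree)
    for combo in product(*(tree[name] for name in names)):   # full combinations
        yield dict(zip(names, combo))

cases = list(generate_test_cases(classification_tree))
print(len(cases), "test cases, e.g.", cases[0])              # 18 combinations
```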

Keywords: classification tree method, test case, UML use case diagram, use case specification

Procedia PDF Downloads 162
648 Solubility of Water in CO2 Mixtures at Pipeline Operation Conditions

Authors: Mohammad Ahmad, Sander Gersen, Erwin Wilbers

Abstract:

Carbon capture, transport and underground storage have become a major solution to reduce CO2 emissions from power plants and other large CO2 sources. A big part of this captured CO2 stream is transported at high-pressure, dense-phase conditions and stored in offshore underground depleted oil and gas fields. CO2 is also transported in offshore pipelines to be used for enhanced oil and gas recovery. The captured CO2 stream with impurities may contain water, which causes severe corrosion problems and flow assurance failures and might damage valves and instrumentation. Thus, free water formation should be strictly prevented. The purpose of this work is to study the solubility of water in pure CO2 and in CO2 mixtures under real pipeline pressure (90-150 bar) and temperature operating conditions (5-35°C). A setup was constructed to generate experimental data. The results show the solubility of water in CO2 mixtures increasing with the increase in temperature and/or with the increase in pressure. A drop in water solubility in CO2 is observed in the presence of impurities. The data generated were then used to assess the capabilities of two mixture models: the GERG-2008 model and the EOS-CG model. By generating the solubility data, this study contributes to determining the maximum allowable water content in CO2 pipelines.

Keywords: carbon capture and storage, water solubility, equation of states, fluids engineering

Procedia PDF Downloads 301
647 Irradion: Portable Small Animal Imaging and Irradiation Unit

Authors: Josef Uher, Jana Boháčová, Richard Kadeřábek

Abstract:

In this paper, we present a multi-robot imaging and irradiation research platform referred to as Irradion, with full capabilities of portable arbitrary path computed tomography (CT). Irradion is an imaging and irradiation unit entirely based on robotic arms for research on cancer treatment with ion beams on small animals (mice or rats). The platform comprises two subsystems that combine several imaging modalities, such as 2D X-ray imaging, CT, and particle tracking, with precise positioning of a small animal for imaging and irradiation. Computed Tomography: The CT subsystem of the Irradion platform is equipped with two 6-joint robotic arms that position a photon counting detector and an X-ray tube independently and freely around the scanned specimen and allow image acquisition utilizing computed tomography. Irradion covers nearly all conventional 2D and 3D X-ray imaging trajectories with precisely calibrated and repeatable geometrical accuracy, leading to a spatial resolution of up to 50 µm. In addition, the photon counting detectors allow X-ray photon energy discrimination, which can suppress scattered radiation, thus improving image contrast. They can also measure absorption spectra and recognize different material (tissue) types. X-ray video recording and real-time imaging options can be applied for studies of dynamic processes, including in vivo specimens. Moreover, Irradion opens the door to exploring new 2D and 3D X-ray imaging approaches. We demonstrate in this publication various novel scan trajectories and their benefits. Proton Imaging and Particle Tracking: The Irradion platform allows combining several imaging modules with any required number of robots. The proton tracking module comprises another two robots, each holding particle tracking detectors with position-, energy-, and time-sensitive Timepix3 sensors. Timepix3 detectors can track particles entering and exiting the specimen and allow accurate guiding of photon/ion beams for irradiation. In addition, quantifying the energy losses before and after the specimen brings essential information for precise irradiation planning and verification. Work on the small animal research platform Irradion involved advanced software and hardware development that will offer researchers a novel way to investigate new approaches in (i) radiotherapy, (ii) spectral CT, (iii) arbitrary path CT, and (iv) particle tracking. The robotic platform for imaging and radiation research developed for the project is an entirely new product on the market. Preclinical research systems combining precision robotic irradiation with photon/ion beams and multimodality high-resolution imaging do not currently exist. The researched technology can potentially represent a significant leap forward compared to the current, first-generation primary devices.

Keywords: arbitrary path CT, robotic CT, modular, multi-robot, small animal imaging

Procedia PDF Downloads 89
646 Alternatives to the Disposal of Sludge from Water and Wastewater Treatment Plants

Authors: Lima Priscila, Gianotto Raiza, Arruda Leonan, Magalhães Filho Fernando

Abstract:

Industrialization and especially the accentuated population growth in developing countries, together with the lack of drainage, public cleaning, water and sanitation services, have caused concern about the need for the expansion of water treatment and sewage treatment units. However, these units have been generating by-products, such as sludge. This paper aims to investigate aspects of the operation and maintenance of sludge from a wastewater treatment plant (WWTP - 90 L.s-1) and two water treatment plants (WTPs; 1.4 m3.s-1 and 0.5 m3.s-1) for the purpose of proper disposal and reuse, evaluating their qualitative and quantitative characteristics, the Brazilian legislation and standards. It was concluded that the sludge from the water treatment plants is directly related to the quality of the raw water collected, and it becomes feasible to use it in construction materials and to dispose of it in the sewage system, improving the efficiency of the WWTP regarding the precipitation of phosphorus (35% removal). WTP Lageado had a sludge production of 55,726 kg/month, more than WTP Guariroba (29,336 kg/month), even though the flow of WTP Guariroba is 1,400 L.s-1 and that of WTP Lageado 500 L.s-1, which is explained by raw water quality influencing sludge production more than flow. The WWTP sludge has higher concentrations of organic material due to its origin and could be used to improve soil fertility, crop production and the recovery of degraded areas. The volume of sludge generated at the WWTP was 1,760 ton/month, with 5.6% solids content in the raw sludge, increasing to 23% in the dewatered sludge.

Keywords: disposal, sludge, water treatment, wastewater treatment

Procedia PDF Downloads 320
645 Effective Governance through Mobile Phones: Cases Supporting the Introduction and Implementation

Authors: Mohd Mudasir Shafi, Zafrul Hasan, Talat Saleem

Abstract:

Information and Communication Technology (ICT) services have been defined as a route to good governance. The introduction of ICT into governance has given rise to the idea of e-governance, which helps in enhancing transparency and generating accountability and responsiveness in the system in order to provide faster and better quality service to the citizen. Advancement in ICT has enabled governments all over the world to speed up the delivery of information and services to citizens and businesses and to increase their participation in governance. There has been a varying degree of success over the past decade in providing services to citizens using the internet and different web services. These e-government initiatives have been extensively researched. Our research is aimed at the transition from electronic government to mobile government (m-government) initiatives implementing mobile services and is concerned with understanding the major factors that will aid the adoption and distribution of these services. Some amount of research must be done on the integration process between e-government and m-government, along with enough investigation of all the factors that could affect the transition process. Such factors differ between different places and the advancement in information and technology available there. In this paper, we have discussed why mobile communication systems can be used for effective e-governance and the areas where m-governance can be implemented. The paper will examine some of the reasons as well as the main opportunities for improving effective governance through mobile phones.

Keywords: e-governance, mobile phones, information technology, m-government

Procedia PDF Downloads 443
644 Urban Resilience and Its Prioritised Components: Analysis of Industrial Township Greater Noida

Authors: N. Mehrotra, V. Ahuja, N. Sridharan

Abstract:

Resilience is an all-hazard and proactive approach that requires multidisciplinary input into the interrelated variables of the city system. This research aims to identify and operationalize indicators for assessment in the domains of institutions, infrastructure and knowledge, all three operating in task-oriented community networks. This paper gives a brief account of the methodology developed for the assessment of Urban Resilience and its prioritised components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People’s perception of Urban Resilience has been examined by conducting a questionnaire survey among the target population of Greater Noida. As defined by experts, the Urban Resilience of a place is considered to be both a product and a process of operation to regain normalcy after a disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both in the process of evolution and as an outcome. The relative significance of the 6 R’s has also been identified. The dependency factors among the various resilience indicators have been explored in this paper, which helps in generating new perspectives for future research in disaster management. Based on the stated factors, this methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself, but calls for new beginnings.

Keywords: disaster, resilience, system, urban

Procedia PDF Downloads 458
643 Floodplain Modeling of River Jhelum Using HEC-RAS: A Case Study

Authors: Kashif Hassan, M.A. Ahanger

Abstract:

Floods have become more frequent and severe due to the effects of global climate change and human alterations of the natural environment. Flood prediction/forecasting and control is one of the greatest challenges facing the world today. The forecasting of floods is achieved by the use of hydraulic models such as HEC-RAS, which are designed to simulate the flow processes of surface water. Extreme flood events in river Jhelum, lasting from a day to a few days, are a major disaster in the State of Jammu and Kashmir, India. In the present study, the HEC-RAS model was applied to two different reaches of river Jhelum in order to estimate the flood levels corresponding to 25, 50 and 100 year return period flood events at important locations and to deduce the flood vulnerability of important areas and structures. The flow rates for the two reaches were derived from flood-frequency analysis of 50 years of historic peak flow data. Manning's roughness coefficient n was selected using detailed analysis. Rating curves were also generated to serve as a basis for determining the boundary conditions. Calibration and validation procedures were applied in order to ensure the reliability of the model. Sensitivity analysis was also performed in order to ensure the accuracy of Manning's n in generating water surface profiles.
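Because the abstract calibrates Manning's n and builds rating curves, a minimal sketch of Manning's equation for an idealised rectangular section is given below; the channel width, roughness and slope are assumed illustrative values, and real HEC-RAS cross-sections of the Jhelum are far more irregular.

```python
# Hedged sketch: Manning's equation for an idealised rectangular section,
# included because the abstract calibrates Manning's n and builds rating
# curves. Width, roughness and slope are assumed values; real HEC-RAS
# cross-sections of the Jhelum are far more irregular.
def manning_discharge(n, width_m, depth_m, slope):
    """Q = (1/n) * A * R^(2/3) * S^(1/2), SI units (m^3/s)."""
    area = width_m * depth_m                      # flow area A
    wetted_perimeter = width_m + 2.0 * depth_m
    hydraulic_radius = area / wetted_perimeter    # R = A / P
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# illustrative rating-curve points for an assumed 90 m wide reach, n = 0.035
for depth in (1.0, 2.0, 3.0):
    print(f"{depth:.1f} m -> {manning_discharge(0.035, 90.0, depth, 0.0004):.1f} m3/s")
```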

Keywords: flood plain, HEC-RAS, Jhelum, return period

Procedia PDF Downloads 426
642 Influence of Optical Fluence Distribution on Photoacoustic Imaging

Authors: Mohamed K. Metwally, Sherif H. El-Gohary, Kyung Min Byun, Seung Moo Han, Soo Yeol Lee, Min Hyoung Cho, Gon Khang, Jinsung Cho, Tae-Seong Kim

Abstract:

Photoacoustic imaging (PAI) is a non-invasive and non-ionizing imaging modality that combines the absorption contrast of light with ultrasound resolution. A laser is used to deposit optical energy into a target (i.e., optical fluence). Consequently, the target temperature rises, and then thermal expansion occurs, which generates a PA signal. In general, most image reconstruction algorithms for PAI assume uniform fluence within an imaging object. However, it is known that the optical fluence distribution within the object is non-uniform. This could affect the reconstruction of PA images. In this study, we have investigated the influence of the optical fluence distribution on PA back-propagation imaging using the finite element method. The uniform fluence was simulated as a triangular waveform within the object of interest. The non-uniform fluence distribution was estimated by solving light propagation within a tissue model via the Monte Carlo method. The results show that the PA signal in the case of non-uniform fluence is wider than in the uniform case by 23%. The frequency spectrum of the PA signal due to the non-uniform fluence lacks some high-frequency components in comparison to the uniform case. Consequently, the reconstructed image with the non-uniform fluence exhibits a strong smoothing effect.

Keywords: finite element method, fluence distribution, Monte Carlo method, photoacoustic imaging

Procedia PDF Downloads 377
641 Technique and Use of Machine Readable Dictionary: In Special Reference to Hindi-Marathi Machine Translation

Authors: Milind Patil

Abstract:

The present paper is a discussion of Hindi-Marathi morphological analysis and the generation of rules for machine translation on the basis of a Machine Readable Dictionary (MRD). Transformational Generative Grammar (TGG) rules were used to design the MRD. As per TGG rules, the suffix of a particular root word is based on its tense, aspect, modality and voice. That is why the suffix is very important for the word meanings (or root meanings). Hindi and Marathi both belong to the Indo-Aryan language family. Both have been derived from Sanskrit, and their script is 'Devnagari'. But there are lots of differences at the semantic and grammatical levels too. In Marathi, there are three genders, but in Hindi only two (masculine and feminine); the neuter gender is absent in Hindi. Likewise, other grammatical categories also differ in their level of use. For the MRD, the suffixes (or morphemes) of a particular root word for GNP (Gender, Number and Person) are based on its natural phenomena. A particular suffix and morpheme change as per the need of person, number and gender. The design of the MRD is also based on this format. First, person, number, gender and tense are key points, followed by the root words and the suffix for a particular Person, Number and Gender (PNG). After that, the inferences are drawn on the basis of rules, that is, (V.stem) (Pre.T/Past.T) (x) + (Aux-Pre.T) (x) → (V.Stem.) + (SP.TM) (X).

Keywords: MRD, TGG, stem, morph, morpheme, suffix, PNG, TAM&V, root

Procedia PDF Downloads 324
640 Transient Analysis and Mitigation of Capacitor Bank Switching on a Standalone Wind Farm

Authors: Ajibola O. Akinrinde, Andrew Swanson, Remy Tiako

Abstract:

There exist significant losses on transmission lines due to distance, as power generating stations could be located far from some isolated settlements. Standalone wind farms could be a good choice of alternative power generation for such settlements that are far from the grid due to factors of long distance or socio-economic problems. However, uncompensated wind farms consume reactive power, since wind turbines are induction generators. Therefore, capacitor banks are used to compensate reactive power, which in turn improves the voltage profile of the network. Although capacitor banks help improve the voltage profile, they also undergo switching actions due to their compensating response to the variation of various types of load at the consumer’s end. These switching activities could cause transient overvoltages on the network, jeopardizing the service life of other equipment on the system. In this paper, the overvoltage caused by these switching activities is investigated using the IEEE 14-bus network to represent a standalone wind farm, and the simulation is done using ATP/EMTP software. Scenarios involving the use of a pre-insertion resistor and a pre-insertion inductor, as well as controlled switching, were also carried out in order to decide the best mitigation option to reduce the overvoltage.

Keywords: capacitor banks, IEEE bus 14-network, pre-insertion resistor, standalone wind farm

Procedia PDF Downloads 441
639 Stack Overflow Detection and Prevention on Operating Systems Using Machine Learning and Control-Flow Enforcement Technology

Authors: Cao Jiayu, Lan Ximing, Huang Jingjia, Burra Venkata Durga Kumar

Abstract:

The first virus to attack personal computers was born in early 1986, called C-Brain, written by a pair of Pakistani brothers. In those days, people still used DOS systems, manipulating computers with the most basic command lines. In the 21st century today, computer performance has grown geometrically. But computer viruses are also evolving and escalating. We never stop fighting against security problems. Stack overflow is one of the most common security vulnerabilities in operating systems. It may result in serious security issues for an operating system if a program in it has a vulnerability with administrator privileges. Certain viruses change the value of specific memory locations through a stack overflow, allowing computers to run harmful programs. This study developed a mechanism to detect and respond in time whenever a stack overflow occurs. We demonstrate the effectiveness of standard machine learning algorithms and control flow enforcement techniques in predicting computer OS security by generating suspicious vulnerability functions (SVFS) and associated suspect areas (SAS). The method can minimize the possibility of stack overflow attacks occurring.

Keywords: operating system, security, stack overflow, buffer overflow, machine learning, control-flow enforcement technology

Procedia PDF Downloads 115
638 Suppression Subtractive Hybridization Technique for Identification of the Differentially Expressed Genes

Authors: Tuhina-khatun, Mohamed Hanafi Musa, Mohd Rafii Yosup, Wong Mui Yun, Aktar-uz-Zaman, Mahbod Sahebi

Abstract:

The suppression subtractive hybridization (SSH) method is a valuable tool for identifying differentially regulated genes, such as disease-specific or tissue-specific genes important for cellular growth and differentiation. It is a widely used method for separating DNA molecules that distinguish two closely related DNA samples. SSH is one of the most powerful and popular methods for generating subtracted cDNA or genomic DNA libraries. It is based primarily on a suppression polymerase chain reaction (PCR) technique and combines normalization and subtraction in a solitary procedure. The normalization step equalizes the abundance of DNA fragments within the target population, and the subtraction step excludes sequences that are common to the populations being compared. This dramatically increases the probability of obtaining low-abundance differentially expressed cDNAs or genomic DNA fragments and simplifies analysis of the subtracted library. The SSH technique is applicable to many comparative and functional genetic studies for the identification of disease-specific, developmental, tissue-specific, or other differentially expressed genes, as well as for the recovery of genomic DNA fragments distinguishing the samples under comparison.

Keywords: suppression subtractive hybridization, differentially expressed genes, disease specific genes, tissue specific genes

Procedia PDF Downloads 433
637 Challenges of Technical and Engineering Students in the Application of Scientific Cancer Knowledge to Preserve the Future Generation in Sub-Saharan Africa

Authors: K. Shaloom Mbambu, M. Pascal Tshimbalanga, K. Ruth Mutala, K. Roger Kabuya, N. Dieudonné Kabeya, Y. L. Kabeya Mukeba

Abstract:

In this article, the authors examine the even more worrying situation of girls in sub-Saharan Africa. Two girls in five are deprived of global education, which represents a real loss to the development of communities and countries. Cultural traditions, poverty, violence, early and forced marriages, early pregnancies, and many other gender inequalities are the causes of the development of this cancer. Namely, "there is no more efficient development tool than educating girls." The non-schooling of girls and their lack of supervision by the liberal professions have serious consequences for the life of each of them. Their inferior status relative to men exposes girls to poverty and health risks. The measures proposed include raising awareness among parents and communities of the importance of girls' education, improving children's access to school, girl-boy equality of rights, creating income-generating activities for girls, and girls' learning of liberal trades to make them self-sufficient. Organizations such as the United Nations can save the children. ASEAD and the AEDA group predict the impact of this cancer on the development of a nation; the future generation must be preserved.

Keywords: young girl, Sub-Saharan Africa, higher and vocational education, development, society, environment

Procedia PDF Downloads 254
636 Performance Analysis of PAPR Reduction in OFDM Systems based on Partial Transmit Sequence (PTS) Technique

Authors: Alcardo Alex Barakabitze, Tan Xiaoheng

Abstract:

Orthogonal Frequency Division Multiplexing (OFDM) is a special case of the Multi-Carrier Modulation (MCM) technique, which transmits a stream of data over a number of lower data rate subcarriers. OFDM splits the total transmission bandwidth into a number of orthogonal and non-overlapping subcarriers and transmits the collection of bits called symbols in parallel using these subcarriers. This paper explores Peak-to-Average Power Ratio (PAPR) reduction using the Partial Transmit Sequence (PTS) technique. We provide the distribution analysis and the basics of OFDM signals and then show how the PAPR increases as the number of subcarriers increases. We provide the performance analysis of the CCDF and the PAPR expressed in decibels through MATLAB simulations. The simulation results show that, in the PTS technique, the performance of PAPR reduction in OFDM systems improves significantly as the number of sub-blocks increases. However, keeping the same sub-block variation, oversampling factor and number of OFDM block iterations for generating the CCDF, OFDM systems with 128 subcarriers show improved PAPR reduction performance compared to OFDM systems with 256, 512 or more subcarriers.
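A minimal Python sketch (the paper itself uses MATLAB) of computing the PAPR of one OFDM symbol and reducing it with a basic PTS search over per-sub-block phase factors {+1, -1}; the QPSK mapping, 128 subcarriers, interleaved partition into 4 sub-blocks, and absence of oversampling are illustrative simplifications rather than the paper's exact setup.

```python
# Hedged sketch (Python rather than the paper's MATLAB): PAPR of one OFDM
# symbol and a basic PTS search over per-sub-block phase factors {+1, -1}.
# QPSK mapping, N = 128 subcarriers, V = 4 interleaved sub-blocks and the lack
# of oversampling are illustrative simplifications.
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N, V = 128, 4                                    # subcarriers, sub-blocks

def papr_db(x):
    p = np.abs(x) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
X = qpsk[rng.integers(0, 4, size=N)]             # frequency-domain symbol

blocks = []
for v in range(V):                               # disjoint interleaved partition
    Xv = np.zeros(N, dtype=complex)
    Xv[v::V] = X[v::V]
    blocks.append(np.fft.ifft(Xv))               # each sub-block to time domain

best_papr = np.inf
for phases in product([1, -1], repeat=V):        # exhaustive phase-factor search
    candidate = sum(p * b for p, b in zip(phases, blocks))
    best_papr = min(best_papr, papr_db(candidate))

print(f"original PAPR: {papr_db(np.fft.ifft(X)):.2f} dB, after PTS: {best_papr:.2f} dB")
```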

Keywords: OFDM, peak-to-average power ratio (PAPR), bit error rate (BER), subcarriers, wireless communications

Procedia PDF Downloads 514
635 Implementing Service Innovation in Public Transport Sector: Drivers and Challenges

Authors: Chaoren Lu

Abstract:

Public policy is one driving force influencing service innovation implementation in the public sector. However, public policy implications cannot be automatically derived from analyses of innovation issues, and there is a lack of research about the influence of public policy on innovation. Moreover, innovation in service systems is hard to predict, and whether policy encourages or hinders innovation is still understudied. Especially given the context that multiple actors are actively involved in the service delivery process in the public transport sector, complex driving forces and challenges emerge in service operation. This study aims to analyze service innovation practices within service-operating organizations in order to understand the drivers and challenges of service operation based on policy requirements, and where innovation ideas are generated from. Case studies of Changzhou Transit Group and Nanjing Jiangnan Public Transit Group will be launched. This paper reveals that the ambidexterity between top-down and bottom-up demands within public transport service-operating organizations contributes to innovation ideas. Meanwhile, it contributes to the understanding that the fundamental elements of service innovation are the creation of new relationships and new ways of sharing knowledge. Policy contributes to triggering the creation of such relationships. The research question is: what are the sources of service innovation practices in local public transport systems in China in facing policy implementation?

Keywords: public value, service innovation, public transport service, China

Procedia PDF Downloads 321
634 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate some areas that were very difficult to automate before. The paper describes the introduction of generative AI into Introductory Computer and Data Science courses and an analysis of the effect of such introduction. Generative AI is incorporated into the educational process in two ways. For the instructors, we create templates of prompts for the generation of tasks and the grading of the students' work, including feedback on the submitted assignments. For the students, we introduce basic prompt engineering, which in turn is used for the generation of test cases based on descriptions of the problems, for generating code snippets for single-block-complexity programming, and for partitioning average-size-complexity programming into such blocks. The above-mentioned classes are run using Large Language Models, and feedback from instructors and students, as well as the courses' outcomes, is collected. The analysis shows a statistically significant positive effect and a preference among both groups of stakeholders.

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMs to computer and data science education

Procedia PDF Downloads 58