Search results for: building applications
2434 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (representing the bare earth) and non-ground points (representing buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task because the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate weights to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
Keywords: airborne laser scanning, digital terrain models, filtering, forested areas
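The core filtering idea in this abstract — fit a smooth surface to the noisy points and use a weight function to reject returns that sit well above it — can be sketched in one dimension. This is a minimal illustration, not the authors' spline filter: the moving-average surface, window size, and 1 m rejection threshold are assumptions.

```python
import numpy as np

def rmse(dtm, reference):
    """Root mean square error between a generated DTM and reference terrain."""
    return float(np.sqrt(np.mean((np.asarray(dtm) - np.asarray(reference)) ** 2)))

def iterative_ground_filter(z, window=5, iterations=3, k=1.0):
    """Toy 1-D ground filter: smooth the elevation profile, then zero the
    weight of points lying more than k metres ABOVE the smoothed surface
    (likely canopy returns), and re-smooth using the surviving points."""
    z = np.asarray(z, dtype=float)
    w = np.ones_like(z)
    kernel = np.ones(window) / window
    surface = z.copy()
    for _ in range(iterations):
        num = np.convolve(w * z, kernel, mode="same")
        den = np.convolve(w, kernel, mode="same")
        surface = np.where(den > 0, num / np.maximum(den, 1e-12), surface)
        w = np.where(z - surface > k, 0.0, 1.0)
    return surface, w
```

A full implementation would replace the moving average with smoothing splines over the 2-D point cloud and iterate the weight function until convergence.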
Procedia PDF Downloads 139
2433 Simulation Aided Life Cycle Sustainability Assessment Framework for Manufacturing Design and Management
Authors: Mijoh A. Gbededo, Kapila Liyanage, Ilias Oraifige
Abstract:
Decision making for sustainable manufacturing design and management requires critical consideration due to the complex and partly conflicting issues of economic, social and environmental factors. Although there are tools capable of assessing a combination of one or two of the sustainability factors, existing frameworks have not adequately integrated all three factors. A case study and a review of existing simulation applications also show that the approach lacks integration of the sustainability factors. In this paper, we discuss the development of a simulation-based framework to support a holistic assessment of sustainable manufacturing design and management. To achieve this, a strategic approach is introduced to investigate the strengths and weaknesses of the existing decision-supporting tools. The investigation reveals that Discrete Event Simulation (DES) can serve as a solid base for other Life Cycle Analysis frameworks. The Simio DES application optimizes systems for both economic and competitive advantage, Granta CES EduPack and SimaPro collate data for Material Flow Analysis and environmental Life Cycle Assessment, while social and stakeholders’ analysis is supported by the Analytical Hierarchy Process, a Multi-Criteria Decision Analysis method. Such a common and integrated framework creates a platform for companies to build a computer simulation model of a real system and assess the impact of alternative solutions before implementing a chosen solution.
Keywords: discrete event simulation, life cycle sustainability analysis, manufacturing, sustainability
Procedia PDF Downloads 279
2432 Reconstruction of Performance-Based Budgeting in Indonesian Local Government: Application of Soft Systems Methodology in Producing Guideline for Policy Implementation
Authors: Deddi Nordiawan
Abstract:
Effective public policy creation requires a strong budget system, both in terms of design and implementation. Performance-based budgeting is an evolutionary approach with two substantial characteristics: first, the strong integration between budgeting and planning, and second, its existence as guidance so that all activities and expenditures refer to measurable performance targets. Four processes should be followed in government in order to make the budget performance-based. These consist of the preparation of a vision according to a bold aspiration, the formulation of outcomes, the determination of outputs based on an analysis of organizational resources, and the formulation of a Value Creation Map that contains a series of programs and activities. This is consistent with the logic model concept, which holds that budget performance should be placed within a relational framework of resources, activities, outputs, outcomes and impacts. Through the issuance of Law 17/2003 regarding State Finance, local governments in Indonesia have to implement performance-based budgeting. The central government then issued Government Regulation 58/2005, which contains detailed guidelines on how to prepare local government budgets. After a decade, the implementation of performance budgeting in local government still does not fully meet expectations, even though the guidance is complete, socialization is routinely performed, and trainings have been carried out at all levels. Accordingly, this study views the practice of performance-based budgeting at local governments as a problematic situation. This condition must be approached with a systems approach that allows solutions from many points of view. Based on the fact that the infrastructure of budgeting has already settled, the study then treats the situation as a complexity. Therefore, the intervention needs to be made in the area of human activity systems.
Using Soft Systems Methodology, this research reconstructs the process of performance-based budgeting at local governments as a human activity system. Through conceptual models, this study invites all actors (central government, local government, and the parliament) to dialogue and to formulate interventions in human activity systems that are systemically desirable and culturally feasible. The result will direct the central government in revising the guidance on the local government budgeting process, and will also serve as a reference for building a capacity-building strategy.
Keywords: soft systems methodology, performance-based budgeting, Indonesia, public policy
Procedia PDF Downloads 252
2431 Micelles Made of Pseudo-Proteins for Solubilization of Hydrophobic Biologicals
Authors: Sophio Kobauri, David Tugushi, Vladimir P. Torchilin, Ramaz Katsarava
Abstract:
Hydrophobically/hydrophilically modified functional polymers are of high interest in modern biomedicine due to their ability to solubilize water-insoluble or poorly soluble (hydrophobic) drugs. Among the many approaches being developed in this direction, one of the most effective methods is the use of polymeric micelles (PMs) (micelles formed by amphiphilic block-copolymers) for the solubilization of hydrophobic biologicals. For therapeutic purposes, PMs are required to be stable and biodegradable, yet only a few amphiphilic block-copolymers have been described that are capable of forming stable micelles with good solubilization properties. For obtaining micelle-forming block-copolymers, polyethylene glycol (PEG) derivatives are desirable as the hydrophilic shell, because PEG is the most popular biocompatible hydrophilic block and various hydrophobic blocks (polymers) can be attached to it. The construction of the hydrophobic core, however, owing to the complex requirements of micelle structure development, remains the main problem for nanobioengineers. Considering the above, our research goal was to obtain biodegradable micelles for the solubilization of hydrophobic drugs and biologicals. For this purpose, we used biodegradable polymers, pseudo-proteins (PPs) (synthesized from naturally occurring amino acids and other non-toxic building blocks, such as fatty diols and dicarboxylic acids), as the hydrophobic core, since these polymers show reasonable biodegradation rates and excellent biocompatibility. In the present study, we used the hydrophobic amino acid L-phenylalanine (MW 4000-8000 Da) instead of L-leucine. Amino-PEG (MW 2000 Da) was used as the hydrophilic fragment for constructing suitable micelles. The molecular weight of the PP (the hydrophobic core of the micelle) was regulated by varying the ratios of the monomers used. Micelles were obtained by dissolving the synthesized amphiphilic polymer in water.
The micelle-forming property was tested using dynamic light scattering (Malvern Zetasizer Nano ZS ZEN3600). The study showed that the obtained amphiphilic block-copolymers form stable neutral micelles 100 ± 7 nm in size at a concentration of 10 mg/mL, which is considered an optimal range for pharmaceutical micelles. These preliminary data allow us to conclude that the obtained micelles are suitable for the delivery of poorly water-soluble drugs and biologicals.
Keywords: amino acid L-phenylalanine, pseudo-proteins, amphiphilic block-copolymers, biodegradable micelles
Procedia PDF Downloads 134
2430 A Mixing Matrix Estimation Algorithm for Speech Signals under the Under-Determined Blind Source Separation Model
Authors: Jing Wu, Wei Lv, Yibing Li, Yuanfan You
Abstract:
The separation of speech signals has become a research hotspot in the field of signal processing in recent years. It has many applications in teleconferencing, hearing aids, machine speech recognition and so on. The sounds received are usually noisy. Identifying the sounds of interest and obtaining clear sounds in such an environment becomes a problem worth exploring, namely the problem of blind source separation. This paper focuses on under-determined blind source separation (UBSS). Sparse component analysis is generally used for the problem of under-determined blind source separation. The method is mainly divided into two parts. Firstly, a clustering algorithm is used to estimate the mixing matrix from the observed signals. Then the signal is separated based on the known mixing matrix. In this paper, the problem of mixing matrix estimation is studied, and an improved algorithm to estimate the mixing matrix for speech signals in the UBSS model is proposed. The traditional potential-function algorithm is not accurate for mixing matrix estimation, especially at low signal-to-noise ratio (SNR). In response to this problem, this paper considers an improved potential-function method to estimate the mixing matrix. The algorithm not only avoids the influence of insufficient prior information in traditional clustering algorithms, but also improves the estimation accuracy of the mixing matrix. This paper takes the mixing of four speech signals into two channels as an example. The simulation results show that the approach in this paper not only improves the accuracy of estimation, but also applies to any mixing matrix.
Keywords: DBSCAN, potential function, speech signal, the UBSS model
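The first stage described above — recovering the mixing columns by clustering the directions of sparse observations — can be sketched with a toy numpy implementation. The energy threshold and the plain 1-D k-means over angles are illustrative stand-ins for the paper's improved potential-function clustering:

```python
import numpy as np

def estimate_mixing_matrix(X, n_sources, n_iter=50):
    """Estimate mixing columns from 2-channel observations X (shape (2, T))
    by clustering the directions of high-energy, single-source-dominant
    samples. Returns a (2, n_sources) matrix of unit-norm columns."""
    # keep high-energy samples and map each to an angle in [0, pi)
    energy = np.linalg.norm(X, axis=0)
    V = X[:, energy > 0.1 * energy.max()]
    angles = np.mod(np.arctan2(V[1], V[0]), np.pi)
    # simple 1-D k-means over the angles
    centers = np.linspace(angles.min(), angles.max(), n_sources)
    for _ in range(n_iter):
        labels = np.argmin(np.abs(angles[:, None] - centers[None, :]), axis=1)
        centers = np.array([angles[labels == k].mean() if np.any(labels == k)
                            else centers[k] for k in range(n_sources)])
    return np.vstack([np.cos(centers), np.sin(centers)])
```

With four sparse sources mixed into two channels (the abstract's example), the recovered unit columns match the true mixing directions up to ordering and sign.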
Procedia PDF Downloads 135
2429 Customer Acquisition through Time-Aware Marketing Campaign Analysis in Banking Industry
Authors: Harneet Walia, Morteza Zihayat
Abstract:
Customer acquisition has become one of the critical issues of any business in the 21st century; a healthy customer base is the essential asset of the banking business. Term deposits act as a major source of cheap funds for banks to invest and benefit from interest rate arbitrage. To attract customers, the marketing campaigns at most financial institutions consist of multiple outbound telephone calls, with more than one contact per customer, which is a very time-consuming process. Therefore, customized direct marketing has become more critical than ever for attracting new clients. As customer acquisition is becoming more difficult to achieve, an intelligent and refined contact list is necessary to sell a product smartly. The aim of this research is to increase the effectiveness of campaigns by predicting which customers will most likely subscribe to the fixed deposit and suggesting the most suitable month to reach out to them. We design a Time Aware Upsell Prediction Framework (TAUPF) using two different approaches, with the aim of finding the best approach and technique to build the prediction model. TAUPF is implemented using the Upsell Prediction Approach (UPA) and the Clustered Upsell Prediction Approach (CUPA). We also address the data imbalance problem by examining and comparing different methods of sampling (up-sampling and down-sampling). Our results show that building such a model is quite feasible and profitable for financial institutions. The TAUPF can easily be used in any industry, such as telecom, automobile, or tourism, where either CUPA or UPA holds valid. In our case, CUPA proved more reliable. As shown in our research, one of the most important challenges is to define measures with enough predictive power, as the subscription to a fixed deposit depends on highly ambiguous situations and cannot be easily isolated.
While we have shown the practicality of the time-aware upsell prediction model, whereby financial institutions can benefit from contacting customers in the suggested month, further research needs to be done to understand the specific time of day. In addition, a further empirical/pilot study on real live customers needs to be conducted to prove the effectiveness of the model in the real world.
Keywords: customer acquisition, predictive analysis, targeted marketing, time-aware analysis
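The up-/down-sampling comparison used above for the class-imbalance problem can be sketched as follows; the function name and seed handling are illustrative, not part of TAUPF:

```python
import numpy as np

def resample(X, y, strategy="up", seed=0):
    """Balance a binary-labelled dataset either by up-sampling the minority
    class (duplicate with replacement) or by down-sampling the majority
    class (random subsample without replacement)."""
    rng = np.random.default_rng(seed)
    X, y = np.asarray(X), np.asarray(y)
    idx0, idx1 = np.flatnonzero(y == 0), np.flatnonzero(y == 1)
    minority, majority = (idx0, idx1) if len(idx0) < len(idx1) else (idx1, idx0)
    if strategy == "up":
        extra = rng.choice(minority, size=len(majority), replace=True)
        keep = np.concatenate([majority, extra])
    else:  # "down"
        keep = np.concatenate([minority,
                               rng.choice(majority, size=len(minority), replace=False)])
    keep = rng.permutation(keep)
    return X[keep], y[keep]
```

Up-sampling preserves all majority examples at the cost of duplicated minority rows; down-sampling keeps the data clean but discards information, which is why both are compared in the study.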
Procedia PDF Downloads 124
2428 Development of an Integrated Route Information Management Software
Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel
Abstract:
The need for the complete automation of every procedure of surveying, and most especially its engineering applications, cannot be overemphasized due to the many demerits of the conventional manual or analogue approach. This paper presents the summarized details of the development of a Route Information Management (RIM) software. The software, codenamed ‘AutoROUTE’, was encoded using the Microsoft Visual Studio (Visual Basic) package, and it offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road which stretches from Dama to Lunko village in Minna, Niger State, acquired with the aid of a Hi-Target DGPS receiver. The developed software (AutoROUTE) is capable of computing the various simple curve parameters, horizontal curves, and vertical curves, and it can also plot the road alignment, longitudinal profile, and cross-sections, with the capability to store these in the SQL database incorporated into the Microsoft Visual Basic software. The plans plotted with AutoROUTE were compared with the plans produced with the conventional AutoCAD Civil 3D software, and AutoROUTE proved to be more user-friendly and accurate because it plots to three decimal places whereas AutoCAD plots to two decimal places. Also, it was discovered that AutoROUTE is faster in plotting and the stages involved are less cumbersome compared to AutoCAD Civil 3D.
Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying
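The simple curve parameters such a package computes are standard route-surveying formulas. A hedged sketch in Python (AutoROUTE itself is written in Visual Basic), taking the radius R and the deflection angle Δ:

```python
import math

def simple_curve(radius, deflection_deg):
    """Standard simple (circular) horizontal curve elements from the
    radius R and the deflection angle Δ (in degrees)."""
    d = math.radians(deflection_deg)
    return {
        "tangent":      radius * math.tan(d / 2),            # T = R tan(Δ/2)
        "length":       radius * d,                          # L = RΔ (Δ in radians)
        "long_chord":   2 * radius * math.sin(d / 2),        # C = 2R sin(Δ/2)
        "external":     radius * (1 / math.cos(d / 2) - 1),  # E = R(sec(Δ/2) − 1)
        "mid_ordinate": radius * (1 - math.cos(d / 2)),      # M = R(1 − cos(Δ/2))
    }
```

For example, a 300 m radius curve with a 60° deflection angle gives a tangent length of about 173.2 m and a curve length of about 314.2 m.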
Procedia PDF Downloads 148
2427 Cultural Landscape Planning – A Case of Chettinad Village Clusters
Authors: Adhithy Menon E., Biju C. A.
Abstract:
In the 1960s, the concept of preserving heritage monuments was first introduced. During the 1990s, the concept of cultural landscapes gained importance, highlighting the significance of culture and heritage. In this paper, we examine the second category of cultural landscape, the organically evolving landscape, as it represents a web of tangible, intangible, and ecological heritage, and the ways in which it can be rejuvenated. Cultural landscapes in various regions, such as the Chettinad village clusters, are in serious decline, as identified through the Heritage Passport program of this area (2007). For this reason, it is necessary to conduct a detailed analysis of the factors that contribute to this degradation to ensure its protection in the future. An analysis of the cultural landscape of the Chettinad village clusters and its impact on the community is presented in this paper. The first objective is to understand cultural landscapes and their different criteria and categories. This is followed by the study of various methods for protecting cultural landscapes. To identify a core area of intervention based on the parameters of cultural landscapes and community-based tourism, a study and analysis of the regional context of the Chettinad village clusters, considering tourism development, must first be conducted. Lastly, planning interventions are proposed for integrating community-based tourism in the Chettinad villages for the purpose of rejuvenating the cultural landscapes of the villages as well as their communities. The major findings include the importance of the local community in protecting cultural landscapes.
The parameters identified as having an impact on the Chettinad village clusters are community (community well-being, local maintenance and enhancement, demand, alternative income for the community, public participation, awareness), tourism (location and physical access, journey time, tourist attractions), integrity (natural factors, natural disasters, demolition of structures, deterioration of materials), authenticity (sense of place, living elements, building techniques, artistic expression, religious context), disaster management (natural disasters) and environmental impact (pollution). This area can be restored to its former glory and preserved as part of the cultural landscape for future generations by focusing on and addressing these parameters within the identified core area of the Chettinad village cluster (Kanadukathan TP, Kothamangalam, Kottaiyur, Athangudi, Karaikudi, and Palathur).
Keywords: Chettinad village clusters, community, cultural landscapes, organically evolved
Procedia PDF Downloads 82
2426 Surface Characterization and Femtosecond-Nanosecond Transient Absorption Dynamics of Bioconjugated Gold Nanoparticles: Insight into the Warfarin Drug-Binding Site of Human Serum Albumin
Authors: Osama K. Abou-Zied, Saba A. Sulaiman
Abstract:
We studied the spectroscopy of 25-nm diameter gold nanoparticles (AuNPs) coated with human serum albumin (HSA) as a model drug carrier. The morphology and coating of the AuNPs were examined using transmission electron microscopy and dynamic light scattering. Resonance energy transfer from the sole tryptophan of HSA (Trp214) to the AuNPs was observed, in which the fluorescence quenching of Trp214 is dominated by a static mechanism. Using fluorescein (FL) to probe the warfarin drug-binding site in HSA revealed the unchanged nature of the binding cavity on the surface of the AuNPs, indicating the stability of the protein structure on the metal surface. The transient absorption results for the surface plasmon resonance (SPR) band of the AuNPs show three ultrafast dynamics involved in the relaxation process after excitation at 460 nm. The three decay components were assigned to electron-electron (~400 fs), electron-phonon (~2.0 ps) and phonon-phonon (200-250 ps) interactions. These dynamics were not changed upon coating the AuNPs with HSA, which indicates the chemical and physical stability of the AuNPs upon bioconjugation. Binding of FL in HSA did not have any measurable effect on the bleach recovery dynamics of the SPR band, although both FL and the AuNPs were excited at 460 nm. The current study is important for a better understanding of the physical and dynamical properties of protein-coated metal nanoparticles, which are expected to help in optimizing their properties for critical applications in nanomedicine.
Keywords: gold nanoparticles, human serum albumin, fluorescein, femtosecond transient absorption
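Relaxation signals with several distinct timescales, like the SPR bleach recovery above, are commonly modelled as a sum of exponential decays. In the sketch below only the time constants echo the abstract; the amplitudes are arbitrary illustrative assumptions:

```python
import numpy as np

def transient_signal(t, a_ee=0.5, a_ep=0.3, a_pp=0.2,
                     tau_ee=0.4, tau_ep=2.0, tau_pp=220.0):
    """Three-component relaxation model (time t in ps): electron-electron
    (~0.4 ps), electron-phonon (~2 ps) and phonon-phonon (~220 ps) terms,
    each an exponential decay with its own amplitude."""
    t = np.asarray(t, dtype=float)
    return (a_ee * np.exp(-t / tau_ee)
            + a_ep * np.exp(-t / tau_ep)
            + a_pp * np.exp(-t / tau_pp))
```

Fitting such a model to the measured bleach recovery at each condition is how the three lifetimes are compared between bare and HSA-coated particles.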
Procedia PDF Downloads 332
2425 Production of Cellulose Nanowhiskers from Red Algae Waste and Its Application in Polymer Composite Development
Authors: Z. Kassab, A. Aboulkas, A. Barakat, M. El Achaby
Abstract:
Red algae are abundantly available around the world, and their exploitation for the production of agar has become an important industry in recent years. However, this industrial processing of red algae generates a large quantity of solid fibrous waste, which constitutes a source of serious environmental problems. For this reason, the exploitation of this solid waste would help to i) produce new value-added materials and ii) improve waste disposal. In fact, this solid waste can be fully utilized for the production of cellulose microfibers and nanocrystals because it consists of a large amount of cellulose. For this purpose, the red algae waste was chemically treated via alkali, bleaching and acid hydrolysis treatments under controlled conditions, in order to obtain pure cellulose microfibers and cellulose nanocrystals. The raw product and the as-extracted cellulosic materials were characterized using several analysis techniques, including elemental analysis, X-ray diffraction, thermogravimetric analysis, infrared spectroscopy and transmission electron microscopy. As an application, the as-extracted cellulose nanocrystals (CNCs) were used as nanofillers for the production of polymer-based composite films with improved thermal and tensile properties. In these composite materials, the adhesion properties and the large number of functional groups present on the CNC surface and the macromolecular chains of the polymer matrix are exploited to improve the interfacial interactions between both phases, improving the final properties. Consequently, these high-performance composite materials can be expected to have potential in packaging applications.
Keywords: cellulose nanowhiskers, food packaging, polymer composites, red algae waste
Procedia PDF Downloads 228
2424 Reasons to Redesign: Teacher Education for a Brighter Tomorrow
Authors: Deborah L. Smith
Abstract:
To review our program and determine the best redesign options, department members gathered feedback and input through focus groups, analysis of data, and a review of the current research to ensure that the changes proposed were not based solely on the state’s new professional standards. In designing course assignments and assessments, we listened to a variety of constituents, including students, other institutions of higher learning, MDE webinars, host teachers, literacy clinic personnel, and other disciplinary experts. As a result, we are designing a program that is more inclusive of a variety of field experiences for growth. We have determined ways to improve our program by connecting academic disciplinary knowledge, educational psychology, and community building both inside and outside the classroom for professional learning communities. The state’s release of new professional standards led my department members to question what is working and what needs improvement in our program. One aspect of our program that continues to be supported by research and data analysis is the function of supervised field experiences with meaningful feedback. We seek to expand in this area. Other data indicate that we have strengths in modeling a variety of approaches such as cooperative learning, discussions, literacy strategies, and workshops. In the new program, field assignments will be connected to multiple courses, and efforts to scaffold student learning to guide them toward best evidence-based practices will be continuous. Despite running a program that meets multiple sets of standards, there are areas of need that we directly address in our redesign proposal. Technology is ever-changing, so it’s inevitable that improving digital skills is a focus. In addition, scaffolding procedures for English Language Learners (ELL) or other students who struggle is imperative. 
Diversity, equity, and inclusion (DEI) has been an integral part of our curriculum, but the research indicates that more self-reflection and a deeper understanding of culturally relevant practices would help the program improve. Connections with professional learning communities will be expanded, as will leadership components, so that teacher candidates understand their role in changing the face of education. A pilot program will run in academic year 22/23, and additional data will be collected each semester through evaluations and continued program review.
Keywords: DEI, field experiences, program redesign, teacher preparation
Procedia PDF Downloads 169
2423 Laser-Hole Boring into Overdense Targets: A Detailed Study on Laser and Target Properties
Authors: Florian Wagner, Christoph Schmidt, Vincent Bagnoud
Abstract:
Understanding the interaction of ultra-intense laser pulses with overcritical targets is of major interest for many applications, such as laser-driven ion acceleration, fast ignition in the frame of inertial confinement fusion, or high harmonic generation and the creation of attosecond pulses. One particular aspect of this interaction is the shift of the critical surface, where the laser pulse is stopped and the absorption is at maximum, due to the radiation pressure induced by the laser pulse, also referred to as laser hole boring. We investigate laser hole boring experimentally by measuring the backscattered spectrum, which is Doppler-broadened because of the movement of the reflecting surface. Using the high-power, high-energy laser system PHELIX in Darmstadt, we gathered an extensive set of data for different laser intensities ranging from 10^18 W/cm2 to 10^21 W/cm2, two different levels of the nanosecond temporal contrast (10^6 vs. 10^11), elliptical and linear polarization, and varying target configurations. In this contribution we discuss how the maximum velocity of the critical surface depends on these parameters. In particular, we show that increasing the temporal contrast decreases the maximum hole-boring velocity by more than a factor of three. Our experimental findings are backed by a basic analytical model based on momentum and mass conservation, as well as by particle-in-cell simulations. These results are of particular importance for fast ignition since they contribute to a better understanding of the transport of the ignitor pulse into the overdense region.
Keywords: laser-hole boring, interaction of ultra-intense lasers with overcritical targets, fast ignition, relativistic laser-matter interaction
Procedia PDF Downloads 405
2422 A Comparative Study for Various Techniques Using WEKA for Red Blood Cells Classification
Authors: Jameela Ali, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy
Abstract:
Red blood cells (RBCs) are the most common type of blood cell and are the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as “anemia”. Abnormalities in RBCs will affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying red blood cells as normal or abnormal (anemic) using WEKA. WEKA is an open-source suite of different machine learning algorithms for data mining applications. The algorithms tested are the Radial Basis Function neural network, the Support Vector Machine, and the K-Nearest Neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell has a spherical or non-spherical shape, while the second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for Support Vector Machines, the Radial Basis Function neural network, and the K-Nearest Neighbors algorithm, respectively.
Keywords: red blood cells, classification, radial basis function neural networks, support vector machine, k-nearest neighbors algorithm
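One of the three compared classifiers, k-nearest neighbours, is simple enough to sketch outside WEKA. This numpy version is an illustrative analogue, not the study's Java/WEKA setup: it labels each feature vector by majority vote among its k closest training samples:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test feature vector by majority vote among its
    k nearest training samples (Euclidean distance)."""
    X_train = np.asarray(X_train, dtype=float)
    X_test = np.asarray(X_test, dtype=float)
    y_train = np.asarray(y_train)
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)       # distances to all training points
        nearest = y_train[np.argsort(d)[:k]]          # labels of the k closest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])         # majority vote
    return np.array(preds)
```

In the study the inputs would be the geometrical or textural feature vectors extracted from each cell image, and the labels the cell classes.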
Procedia PDF Downloads 480
2421 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series
Procedia PDF Downloads 143
2420 Fatty Acids and Inflammatory Protein Biomarkers in Freshly Frozen Plasma Samples from Patients with and without COVID-19
Authors: Alaa Hamed Habib
Abstract:
Coronavirus disease 2019 (COVID-19) is a viral infection caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and is associated with systemic inflammation. Inflammation is an important process that follows infection and facilitates the repair of damaged tissue. Polyunsaturated fatty acids play an important role in the inflammatory process. These lipids can target transcription factors to modulate gene expression and protein function. Here, we evaluated whether differences in basal levels of different types of biomarkers can be detected in freshly frozen plasma samples from patients with and without COVID-19. Fatty acid methyl ester (FAME) analysis showed a decrease in arachidic acid and myristic acid, but an increase in caprylic acid, palmitic acid, and eicosenoic acid in the plasma of COVID-19 patients compared to non-COVID-19 patients. Multiple chemokines, including IP-10, MCP-1, and MIP-1 beta, were increased in the COVID-19 group compared to the non-COVID-19 group. Similarly, cytokines including IL-1 alpha and IL-8, and cell adhesion and inflammatory response markers including ICAM-1 and E-selectin, were greater in the plasma of COVID-19 patients compared to non-COVID-19 patients. A baseline signature of specific polyunsaturated fatty acids, cytokines, and chemokines present in the plasma after COVID-19 viral infection may serve as biomarkers that can be useful in various applications, including determination of the severity of infection, indication of disease prognosis, and consideration of therapeutic options.
Keywords: MARKS, COVID 19, UEVS NON COVIDS, kidneys, nanoparticles
Procedia PDF Downloads 6
2419 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules
Authors: Mohsen Maraoui
Abstract:
With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing of multilingual text documents based on a conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting. These concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model (in an attempt to discover the non-taxonomic, or contextual, relations between the concepts of a document). These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing
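As a rough illustration of the association-rules step, pairwise rules between co-occurring concepts can be mined with support and confidence thresholds. This sketch is ours, not the paper's method, and uses crisp set membership; the fuzzy variant described in the abstract would replace the binary "concept in document" test with graded concept weights.

```python
from itertools import combinations

def concept_rules(docs, min_support=0.3, min_conf=0.6):
    """Mine simple pairwise association rules between concepts.

    docs: list of sets of concept labels, one set per document.
    Returns rules (antecedent, consequent, support, confidence).
    """
    n = len(docs)
    counts = {}                                 # document frequency of each concept
    for d in docs:
        for c in d:
            counts[c] = counts.get(c, 0) + 1
    rules = []
    for a, b in combinations(sorted(counts), 2):
        both = sum(1 for d in docs if a in d and b in d)
        support = both / n                      # fraction of docs with both concepts
        if support < min_support:
            continue
        for x, y, cx in ((a, b, counts[a]), (b, a, counts[b])):
            conf = both / cx                    # P(y | x) over documents
            if conf >= min_conf:
                rules.append((x, y, support, conf))
    return rules

docs = [{"neuron", "brain"}, {"neuron", "brain", "memory"}, {"brain", "memory"}]
rules = concept_rules(docs)
```

Rules that survive both thresholds capture the contextual (non-taxonomic) links the abstract refers to, e.g. "neuron -> brain" with confidence 1.0 in the toy corpus above.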
Procedia PDF Downloads 141
2418 Integrating One Health Approach with National Policies to Improve Health Security post-COVID-19 in Vietnam
Authors: Yasser Sanad, Thu Trang Dao
Abstract:
Introduction: Implementing the One Health (OH) approach requires an integrated, interdisciplinary, and cross-sectoral methodology. OH is a key tool for developing and implementing programs and projects, and it includes developing ambitious policies that consider the common needs and benefits of human, animal, plant, and ecosystem health. OH helps humanity readjust its path toward environmentally friendly and impartial sustainability. As co-leader of the Global Health Security Agenda's Zoonotic Disease Action Package, Vietnam pioneered a strong OH approach to effectively address the early waves of the COVID-19 outbreak in-country. Context and Aim: The repeated surges of COVID-19 in Vietnam challenged the capabilities of the national system and exposed gaps in multi-sectoral coordination and resilience. To address this, FHI 360 advocated for the standardization of the OH platform by government actors to increase the resiliency of the system during and after COVID-19. Methods: FHI 360 coordinated technical resources to develop and implement evidence-based OH policies, promoting high-level policy dialogue between the Ministries of Health, Agriculture, and the Environment, and policy research to inform the developed policies and frameworks. Through these discussions, a One Health Partnership (OHP) was formed, linking climate change, the environment, and human and animal health. Findings: The OHP framework created a favorable policy environment within and between sectors, as well as between governments and international health security partners. It also promoted strategic dialogue, resource mobilization, policy advocacy, and integration of international systems with National Steering Committees to ensure accountability and emphasize national ownership. Innovative contribution to policy, practice and/or research: The OHP was an effective evidence-based research-to-policy platform linked to the National One Health Strategic Plan (2021-2025). Collectively, they serve as a national framework for the implementation and monitoring of OH activities. Through the adoption of these policies and plans, the risk of zoonotic pathogens, environmental agent spillover, and antimicrobial resistance can be minimized by strengthening multi-sectoral OH collaboration for health security.
Keywords: one health, national policies, health security, COVID-19, Vietnam
Procedia PDF Downloads 105
2417 Communication Tools Used in Teaching and Their Effects: An Empirical Study on the T. C. Selcuk University Samples
Authors: Sedat Simsek, Tugay Arat
Abstract:
Today's concept of communication, which underwent a great revolution with the printing press invented by Gutenberg, knows no boundaries thanks to advanced communication devices and the internet. Computers, first produced for military purposes, can now be put to use in many areas, from medicine to the social sciences and from mathematics to education. The use of these developing technologies in the field of education has created great changes in both teaching and learning. Materials that can be considered basic communication resources and were used in traditional education have begun to lose their significance, and technologies such as the internet, computers, smart boards, projection devices, and mobile phones have begun to replace them. On the other hand, the programs and applications used with these technologies have also been developed. University students use virtual books instead of traditional printed books, use cell phones instead of notebooks, and use the internet and virtual databases instead of the library for research. They even submit their homework through interactive methods rather than on printed materials. These technologies, which increase productivity, have brought a new dimension to the traditional education system. The aim of this study is to determine the influence of these technologies on students' learning processes and to find whether there are any similarities and differences arising from the faculty in which the students are educated. In addition, it is aimed to determine the level of ICT usage of students studying at the university level. In this context, the advantages and conveniences of the technology used by students are also scrutinized. In this study, we used surveys to collect data. The data were analyzed using the SPSS 16 statistical program with the appropriate tests.
Keywords: education, communication technologies, role of technology, teaching
Procedia PDF Downloads 303
2416 An Investigation of the Fracture Behavior of Model MgO-C Refractories Using the Discrete Element Method
Authors: Júlia Cristina Bonaldo, Christophe L. Martin, Martiniano Piccico, Keith Beale, Roop Kishore, Severine Romero-Baivier
Abstract:
Refractory composite materials employed in steel casting applications are prone to cracking and material damage because of the very high operating temperature (thermal shock) and the mismatched properties of the constituent phases. The fracture behavior of a model MgO-C composite refractory is investigated to quantify and characterize its thermal shock resistance, employing a cold crushing test and a Brazilian test with fractographic analysis. The discrete element method (DEM) is used to generate numerical refractory composites. The composite in DEM is represented by an assembly of bonded particle clusters forming perfectly spherical aggregates and single spherical particles. For the stresses to converge with a low standard deviation, and with a minimum number of particles to allow reasonable CPU calculation time, representative volume element (RVE) numerical packings are created with various numbers of particles. Key microscopic properties are calibrated sequentially by comparing stress-strain curves against crushing experimental data. Comparing simulations with experiments also allows for the evaluation of crack propagation, fracture energy, and strength. Crack propagation during the Brazilian experimental tests is monitored with digital image correlation (DIC). Simulations and experiments reveal three distinct types of fracture: the crack may spread through the aggregate, along the aggregate-matrix interface, or through the matrix.
Keywords: refractory composite, fracture mechanics, crack propagation, DEM
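The bonded-particle idea behind the DEM model can be illustrated with a toy calculation. This is not the authors' solver; the function name, the linear force law, and the single tensile failure criterion are our simplifying assumptions. Each bond acts as a spring between particle centres and is removed once its tensile force exceeds a threshold, which is how micro-cracks nucleate in such models.

```python
import numpy as np

def bond_force(xi, xj, r_eq, k_n, f_break):
    """Elastic normal force on particle i from a breakable bond with particle j.

    xi, xj : particle centre positions (arrays)
    r_eq   : equilibrium bond length
    k_n    : normal bond stiffness
    f_break: tensile force at which the bond fails (micro-crack nucleation)
    Returns (force_on_i, bond_intact).
    """
    d = xj - xi
    r = np.linalg.norm(d)
    n = d / r                       # unit vector from i towards j
    stretch = r - r_eq              # > 0 tension, < 0 compression
    if k_n * stretch > f_break:     # tensile failure: bond removed, crack forms
        return np.zeros_like(d), False
    return k_n * stretch * n, True  # tension pulls i towards j, compression pushes away

# Compressed bond: centres 0.9 apart with equilibrium length 1.0
f_comp, intact = bond_force(np.array([0.0, 0.0]), np.array([0.9, 0.0]),
                            r_eq=1.0, k_n=10.0, f_break=5.0)
```

Summing such bond forces over an RVE packing and tracking which bonds have broken is what lets the simulation distinguish cracks through the aggregate, along the interface, or through the matrix.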
Procedia PDF Downloads 81
2415 Defining Death and Dying in Relation to Information Technology and Advances in Biomedicine
Authors: Evangelos Koumparoudis
Abstract:
The definition of death is a deep philosophical question, and no single meaning can be ascribed to it. This essay focuses on the ontological, epistemological, and ethical aspects of death and dying in view of technological progress in information technology and biomedicine. It starts with the ad hoc 1968 Harvard committee that proposed irreversible coma as the criterion for the definition of death, and then turns to the debate between the whole-brain death formula, which emphasizes the integrated function of the organism, and the higher-brain formula, which takes consciousness and personality as essential human characteristics. It follows with the contribution of information technology to personalized and precision medicine and to anti-aging measures aimed at life prolongation. It also touches on the possibility of the creation of human-machine hybrids and how this raises ontological and ethical issues concerning the "cyborgization" of human beings and a conception of the organism and personhood based on a post/transhumanist essence, and, furthermore, whether sentient AI capable of autonomous decision-making that might even surpass human intelligence (singularity, superintelligence) deserves moral or legal personhood. Finally, there is the question as to whether death and dying should be redefined at a transcendent level, which is reinforced by already-existing technologies of "virtual afterlife" and the possibility of uploading human minds. In the last section, I refer to the current (and future) applications of nanomedicine in diagnostics, therapeutics, implants, and tissue engineering, as well as the aspiration to "immortality" through cryonics. The definition of death is reformulated, since age and disease elimination may be realized and the criterion of irreversibility may be challenged.
Keywords: death, posthumanism, infomedicine, nanomedicine, cryonics
Procedia PDF Downloads 70
2414 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method
Authors: Angel G. De Leon Hernandez
Abstract:
A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles, and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic, and maintainable system, considering the constraints imposed in each case. With the advances in technology during the last decades, the capability of solving engineering problems has increased enormously. Nowadays, computers play a critical role in structural analysis; unfortunately, for university students the vast majority of this software is inaccessible due to the high complexity and cost it represents, even when the software manufacturers offer student versions. This is exactly the reason behind the idea of developing a more accessible and easy-to-use computing tool. This program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid tedious calculations. The program can also be useful for graduated engineers in the field of structural design and analysis. A graphical user interface is included to make the program even simpler to operate and to make the requested information and the obtained results easier to understand. The present document includes the theoretical basics on which the program relies to solve the structural analysis, the logical path followed to develop the program, the theoretical results, a discussion of the results, and the validation of those results.
Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming
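Although the paper's program is written in Matlab®, the core of the stiffness matrix method can be sketched compactly. The function below is our illustrative reconstruction in Python (names and the uniform E and A are our assumptions, not the authors' code): assemble each bar's 4x4 stiffness into the global matrix, then solve for the free degrees of freedom.

```python
import numpy as np

def solve_truss_2d(nodes, elements, E, A, loads, fixed_dofs):
    """Direct stiffness method for a 2-D pin-jointed truss.

    nodes     : (n, 2) array of joint coordinates
    elements  : list of (i, j) node-index pairs, one per bar
    E, A      : Young's modulus and cross-sectional area (uniform, for brevity)
    loads     : global load vector of length 2n (Fx0, Fy0, Fx1, ...)
    fixed_dofs: indices of the constrained degrees of freedom (supports)
    Returns the global nodal displacement vector.
    """
    n = len(nodes)
    K = np.zeros((2 * n, 2 * n))              # global stiffness matrix
    for i, j in elements:
        dx, dy = nodes[j] - nodes[i]
        L = np.hypot(dx, dy)
        c, s = dx / L, dy / L                 # direction cosines of the bar
        t = np.array([-c, -s, c, s])
        k_e = (E * A / L) * np.outer(t, t)    # 4x4 element stiffness matrix
        dofs = [2 * i, 2 * i + 1, 2 * j, 2 * j + 1]
        K[np.ix_(dofs, dofs)] += k_e          # assemble into the global matrix
    free = [d for d in range(2 * n) if d not in set(fixed_dofs)]
    u = np.zeros(2 * n)                       # fixed DOFs stay at zero
    u[free] = np.linalg.solve(K[np.ix_(free, free)], np.asarray(loads)[free])
    return u

# Sanity check: a single horizontal bar fixed at node 0, axial load at node 1.
nodes = np.array([[0.0, 0.0], [1.0, 0.0]])
u = solve_truss_2d(nodes, [(0, 1)], E=1.0, A=1.0,
                   loads=np.array([0.0, 0.0, 1.0, 0.0]), fixed_dofs=[0, 1, 3])
# The axial elongation should equal F*L/(E*A).
```

Bar forces then follow from the member elongations, which is the typical second half of such a program.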
Procedia PDF Downloads 122
2413 Self-Organized TiO₂–Nb₂O₅–ZrO₂ Nanotubes on β-Ti Alloy by Anodization
Authors: Muhammad Qadir, Yuncang Li, Cuie Wen
Abstract:
Surface properties such as the topography and physicochemistry of metallic implants determine cell behavior. The surface of a titanium (Ti)-based implant can be modified to enhance bioactivity and biocompatibility. In this study, a self-organized titania–niobium pentoxide–zirconia (TiO₂–Nb₂O₅–ZrO₂) nanotubular layer on a β-phase Ti35Zr28Nb alloy was fabricated via electrochemical anodization. Energy-dispersive X-ray spectroscopy (EDX), scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS), and water contact angle measurement techniques were used to investigate the nanotube dimensions (i.e., the inner and outer diameters and wall thicknesses), microstructural features, and evolution of the hydrophilic properties. The in vitro biocompatibility of the TiO₂–Nb₂O₅–ZrO₂ nanotubes (NTs) was assessed using osteoblast cells (SaOS2). The influence of the anodization parameters on the morphology of the TiO₂–Nb₂O₅–ZrO₂ NTs has been studied. The results indicated that the average inner diameter, outer diameter, and wall thickness of the TiO₂–Nb₂O₅–ZrO₂ NTs ranged from 25–70 nm, 45–90 nm, and 5–13 nm, respectively, and were directly influenced by the applied voltage during anodization. The average inner and outer diameters of the NTs increased with increasing applied voltage, and the length of the NTs increased with increasing anodization time and water content of the electrolyte. In addition, the size distribution of the NTs noticeably affected the hydrophilic properties and enhanced the biocompatibility compared with the uncoated substrate. The results of this study could be considered for developing nano-scale coatings for a wide range of biomedical applications.
Keywords: titanium alloy, TiO₂–Nb₂O₅–ZrO₂ nanotubes, anodization, surface wettability, biocompatibility
Procedia PDF Downloads 155
2412 'Light up for All': Building Knowledge on Universal Design through Direct User Contact in Design Workshops
Authors: E. Ielegems, J. Herssens, J. Vanrie
Abstract:
Designers require knowledge and data about a diversity of users throughout the design process to create inclusive design solutions that are usable, understandable, and desirable by everyone. Besides understanding users' needs and expectations, the ways in which users perceive and experience the built environment contain valuable knowledge for architects. Since users' perceptions and experiences are mainly tacit by nature, they are much more difficult to express in words and therefore more difficult to externalise. Nevertheless, the literature confirms the importance of articulating embodied knowledge from users throughout the design process. Hence, more insight is needed into the ways architects can build knowledge on Universal Design through direct user contact. In a project called 'light up for all', architecture students were asked to design a light switch and socket that are elegant, usable, and understandable to the greatest extent possible by everyone. Two workshops with user/experts were organised in the first stages of the design process, in which students could gain insight into users' experiences through direct contact. Three data collection techniques were used to analyse the teams' design processes. First, students were asked to keep a design diary, reporting design activities, personal experiences, and thoughts about users throughout the design process. Second, one of the authors observed the workshops, taking field notes. Finally, focus groups were conducted with the design teams after the design process was finished. By analysing the collected qualitative data, we first identify different design aspects that make the teams' proposals more inclusive than standard design solutions. For this paper, we specifically focus on aspects that externalise embodied knowledge from users' experiences. Subsequently, we look at designers' approaches to learning about these specific aspects throughout the design process. Results show that in some situations designers perceive contradicting knowledge between observations and verbal conversations, which demonstrates the value of direct user contact. Additionally, the findings give indications of the value and limitations of working with selected prototypes as 'boundary objects' when externalising users' experiences. These insights may help researchers better understand designers' processes of eliciting embodied user knowledge. In this way, research can offer more effective support to architects, which may result in better incorporation of users' experiences so that the built environment can gradually become more inclusive for all.
Keywords: universal design, architecture, design process, embodied user knowledge
Procedia PDF Downloads 143
2411 Investigation of Resistive Switching in CsPbCl₃ / Cs₄PbCl₆ Core-Shell Nanocrystals Using Scanning Tunneling Spectroscopy: A Step Towards High Density Memory-based Applications
Authors: Arpan Bera, Rini Ganguly, Raja Chakraborty, Amlan J. Pal
Abstract:
To meet the increasing demand for high-density non-volatile memory devices, we need nano-sites with efficient and stable charge storage capabilities. We prepared nanocrystals (NCs) of the inorganic perovskite CsPbCl₃ coated with Cs₄PbCl₆ by colloidal synthesis. Due to the type-I band alignment at the junction, this core-shell composite is expected to behave as a charge trapping site. Using Scanning Tunneling Spectroscopy (STS), we investigated voltage-controlled resistive switching in this heterostructure by tracking the change in its current-voltage (I-V) characteristics. By applying voltage pulses of appropriate magnitude to the NCs through this non-invasive method, different resistive states of the system were systematically accessed. For a suitable pulse magnitude, the response jumped to a branch with enhanced current, indicating high-resistance state (HRS) to low-resistance state (LRS) switching in the core-shell NCs. We could reverse this process by using a pulse of opposite polarity. These two distinct resistive states can be considered as two logic states, 0 and 1, which are accessible by varying the voltage magnitude and polarity. STS, being a local probe in space, enabled us to capture this switching at individual NC sites. Hence, we claim a bright prospect for these core-shell NCs made of inorganic halide perovskites in future high-density memory applications.
Keywords: core-shell perovskite, CsPbCl₃-Cs₄PbCl₆, resistive switching, scanning tunneling spectroscopy
Procedia PDF Downloads 89
2410 Chemical Composition, in vitro Antioxidant Activity and Gas Chromatography–Mass Spectrometry Analysis of Essential Oil and Extracts of Ruta chalpensis aerial Parts Growing in Tunisian Sahara
Authors: Samir Falhi, Neji Gharsallah, Adel Kadri
Abstract:
Ruta chalpensis L. is a medicinal plant of the family Rutaceae that has been used as an important traditional medicine in the Mediterranean basin for the treatment of many diseases. The current study investigated and evaluated the chemical composition, total phenolic, flavonoid, and tannin contents, and in vitro antioxidant activities of ethyl acetate, ethanol, and hydroalcoholic extracts and of the essential oil from the aerial parts of Ruta chalpensis from the Tunisian Sahara. Total phenolic, flavonoid, and tannin contents of the extracts ranged from 40.39 ± 1.87 to 75.13 ± 1.22 mg of GAE/g, from 22.62 ± 1.55 to 27.51 ± 1.04 mg of QE/g, and from 5.56 ± 1.32 to 10.89 ± 1.10 mg of CE/g, respectively. Results showed that the highest antioxidant activities were determined for the ethanol extract, with an IC50 value of 26.23 ± 0.91 µg/mL in the 2,2-diphenyl-1-picrylhydrazyl assay, and for the hydroalcoholic extract, with an EC50 value of 412.95 ± 6.57 µg/mL and 105.52 ± 2.45 mg of α-tocopherol/g in the ferric reducing antioxidant power and total antioxidant capacity assays, respectively. Furthermore, Gas Chromatography–Mass Spectrometry (GC-MS) analysis of the essential oil led to the identification of 20 compounds representing 98.96% of the total composition. The major components of the essential oil were 2-undecanone (39.13%), 2-nonanone (25.04%), 1-nonene (13.81%), and α-limonene (7.72%). Spectral data from Fourier-transform infrared spectroscopy (FT-IR) analysis of the extracts revealed the presence of functional groups such as C=O, C─O, ─OH, and C─H, which confirmed their richness in polyphenols and biologically active functional groups. These results showed that Ruta chalpensis could be a potential natural source of antioxidants for use in food and nutraceutical applications.
Keywords: antioxidant, FT-IR analysis, GC-MS analysis, phytochemical contents, Ruta chalpensis
Procedia PDF Downloads 147
2409 Liver and Liver Lesion Segmentation From Abdominal CT Scans
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of the most important steps in the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method for liver and liver lesion segmentation in medical image data using mathematical morphology is presented. Our algorithm proceeds in two parts. In the first, we determine the region of interest by applying morphological filters to extract the liver. The second part consists of detecting the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions, based on anatomical information and mathematical morphology tools used in the image processing field. First, we improve the quality of the original image and the image gradient by applying a spatial filter followed by morphological filters. The second step consists of calculating the internal and external markers of the liver and the hepatic lesions. Thereafter, we proceed to the liver and hepatic lesion segmentation by the marker-controlled watershed transform. The validation of the developed algorithm is done using several images. The obtained results show the good performance of our proposed algorithm.
Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm
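The marker-computation step can be sketched as follows. This is our minimal illustration on a synthetic image, not the authors' code (the function name, threshold, and iteration counts are assumptions): the internal marker is an eroded foreground that lies safely inside the organ, the external marker lies safely outside a dilated foreground, and the morphological gradient is the relief that the watershed floods.

```python
import numpy as np
from scipy import ndimage as ndi

def watershed_markers(img, thresh):
    """Internal/external markers for a marker-controlled watershed.

    img    : 2-D grayscale array (e.g. a CT slice)
    thresh : intensity threshold roughly separating organ from background
    Returns (markers, gradient) ready for a watershed transform.
    """
    fg = img > thresh
    internal = ndi.binary_erosion(fg, iterations=2)      # safely inside the organ
    external = ~ndi.binary_dilation(fg, iterations=2)    # safely outside it
    gradient = ndi.morphological_gradient(img.astype(float), size=3)
    markers = np.zeros(img.shape, dtype=int)
    markers[external] = 1   # background label
    markers[internal] = 2   # organ label
    return markers, gradient

# Synthetic "organ": a bright square on a dark background
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
markers, gradient = watershed_markers(img, thresh=0.5)
```

Completing the segmentation would flood `gradient` from `markers` with a watershed implementation, e.g. scikit-image's `watershed(gradient, markers)`; the unlabeled band between the two markers is where the watershed line (the organ contour) is allowed to form.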
Procedia PDF Downloads 451
2408 A Linear Regression Model for Estimating Anxiety Index Using Wide Area Frontal Lobe Brain Blood Volume
Authors: Takashi Kaburagi, Masashi Takenaka, Yosuke Kurihara, Takashi Matsumoto
Abstract:
Major depressive disorder (MDD) is one of the most common mental illnesses today. It is believed to be caused by a combination of several factors, including stress. Stress can be quantitatively evaluated using the State-Trait Anxiety Inventory (STAI), one of the best indices for evaluating anxiety. Although STAI scores are widely used in applications ranging from clinical diagnosis to basic research, the scores are calculated from a self-reported questionnaire. An objective evaluation is required because the subject may intentionally change his or her answers if multiple tests are carried out. In this article, we present a modified index called the "multi-channel Laterality Index at Rest (mc-LIR)" obtained by recording brain activity from a wider area of the frontal lobe using multi-channel functional near-infrared spectroscopy (fNIRS). The presented index measures multiple positions near Fpz, as defined by the international 10-20 positioning system. Using 24 subjects, the dependence on the number of measuring points used to calculate the mc-LIR and its correlation coefficients with the STAI scores are reported. Furthermore, a simple linear regression was performed to estimate the STAI scores from the mc-LIR, and the cross-validation error is also reported. The experimental results show that using multiple positions near Fpz improves the correlation coefficients and the estimation compared with using only two positions.
Keywords: frontal lobe, functional near-infrared spectroscopy, state-trait anxiety inventory score, stress
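The estimation step, a simple linear regression with a cross-validation error, can be sketched as below. The data are synthetic and the function is our illustration of the general approach, not the authors' analysis; leave-one-out cross-validation is one common choice for a sample of 24 subjects.

```python
import numpy as np

def loocv_rmse(x, y):
    """Leave-one-out cross-validation error for a linear model y ~ a*x + b."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        a, b = np.polyfit(x[mask], y[mask], 1)   # fit on all subjects but i
        errs.append(y[i] - (a * x[i] + b))       # predict the held-out subject
    return float(np.sqrt(np.mean(np.square(errs))))

# Synthetic stand-in data: 24 subjects, as in the study (values are illustrative)
rng = np.random.default_rng(1)
mc_lir = rng.uniform(-1, 1, 24)                  # hypothetical mc-LIR values
stai = 40 + 10 * mc_lir + rng.normal(0, 2, 24)   # hypothetical STAI scores
rmse = loocv_rmse(mc_lir, stai)
```

Because each prediction is made on a subject excluded from the fit, the resulting RMSE is an out-of-sample estimate rather than a training-set fit error.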
Procedia PDF Downloads 250
2407 Examining the Skills of Establishing Number and Space Relations of Science Students with the 'Integrative Perception Test'
Authors: Ni̇sa Yeni̇kalayci, Türkan Aybi̇ke Akarca
Abstract:
The ability to relate number and space, one of the basic scientific process skills, is used in transforming a two-dimensional object into a three-dimensional image or in expressing the symmetry axes of an object. This research aims to determine the ability of science students to establish number and space relations. The research was carried out with a total of 90 students studying in the first semester of the Science Education program of a state university located in Turkey's Black Sea Region, in the fall semester of the 2017-2018 academic year. An 'Integrative Perception Test (IPT)' was designed by the researchers to collect the data. Within the scope of the IPT, course books and workbooks specific to the field of science were scanned, and symmetrical visual items belonging to the 'Physics – Chemistry – Biology' sub-fields were selected and listed. During the application, students were expected to imagine and draw the missing half of visual items that were presented incomplete. The data obtained from the test, which contains 30 images or pictures in total (f Physics = 10, f Chemistry = 10, f Biology = 10), were analyzed descriptively based on the drawings created by the students as 'complete (2 points), incomplete/wrong (1 point), empty (0 points)'. For teaching new concepts to younger age groups, images or pictures showing symmetrical structures and similar applications can also be used.
Keywords: integrative perception, number and space relations, science education, scientific process skills
Procedia PDF Downloads 152
2406 Protein Extraction by Enzyme-Assisted Extraction followed by Alkaline Extraction from Red Seaweed Eucheuma denticulatum (Spinosum) Used in Carrageenan Production
Authors: Alireza Naseri, Susan L. Holdt, Charlotte Jacobsen
Abstract:
In 2014, global carrageenan production was 60,000 tons, with a value of US$626 million. From this number, it can be estimated that the total dried seaweed consumption for this production was at least 300,000 tons/year. The protein content of these types of seaweed is 5–25%. If just half of this total amount of protein could be extracted, 18,000 tons/year of a high-value protein product would be obtained. The overall aim of this study was to develop a technology that ensures further utilization of seaweed that, at present, is used only as a raw material for single-extraction carrageenan production. More specifically, proteins should be extracted from the seaweed either before or after extraction of the carrageenan, with a focus on maintaining the quality of the carrageenan as the main product. Different mechanical, chemical, and enzymatic technologies were evaluated. The optimized process was implemented at lab scale and, based on its results, new experiments were carried out at pilot and larger scales. In order to calculate the efficiency of the new upstream multi-extraction process, protein content was tested before and after extraction. After this step, the carrageenan was extracted, and the carrageenan content and the effect of extraction on yield were evaluated. The functionality and quality of the carrageenan were measured based on rheological parameters. The results showed that with the new multi-extraction process (patent submitted), it is possible to extract almost 50% of the total protein without any negative impact on carrageenan quality. Moreover, compared to the routine carrageenan extraction process, the new multi-extraction process could increase the yield of carrageenan, and rheological properties such as the gel strength of the final carrageenan showed promising improvement. The extracted protein has initially been screened as a plant protein source in typical food applications. Further work will be carried out to improve properties such as color, solubility, and taste.
Keywords: carrageenan, extraction, protein, seaweed
Procedia PDF Downloads 284
2405 Shaping of World-Class Delhi: Politics of Marginalization and Inclusion
Authors: Aparajita Santra
Abstract:
In the context of the government's vision of turning Delhi into a green, privatized, and slum-free city with a world-class image on par with the global cities of the world, this paper investigates the processes and politics behind defining spaces in the city and attributing an aesthetic image to it. The paper explores two cases that were forged primarily through the forces of one particular type of power relation: the first examines the modernist movement adopted by the Nehruvian government post-independence, and the second looks at special periods such as the Emergency and the Commonwealth Games. The study of these cases helps to discern how city spaces were reconfigured in the name of 'good governance'. In this process, it also became important to analyze the double nature of law, both as a protector of people's rights and as a threat to people. What was interesting to note through the study was that in the process of nation building and creating an image for the city, the government's policies and programs were mostly aimed at the richer sections of society, while the poorer sections and lower income groups kept getting marginalized, subdued, and pushed further away, even geographically. The reconfiguration of city space and the attribution of an aesthetic character to it led to an alteration not only in the way citizens perceived and engaged with these spaces, but also in the way they envisioned their place in the city. Ironically, it was found that every attempt to build facilities for the city's elite in turn led to an inevitable removal of the marginalized sections of society as a necessary step toward achieving a clean, green, and world-class city. The paper questions the claim made by the government of creating a just, equitable city and granting rights to all. An argument is put forth that in the politics of the redistribution of space, the city that has been designed is meant only for the aspirational middle class and elite, who are ideally primed to live in world-class cities. Thus, the aim is to study city spaces, urban form, and the associated politics and power plays involved, and to understand whether segmented cities are being built in the name of creating sensible, inclusive cities.
Keywords: aesthetics, ambivalence, governmentality, power, world-class
Procedia PDF Downloads 117