Search results for: computational thought
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2806

2296 Calculation of Orbital Elements for Sending Interplanetary Probes

Authors: Jorge Lus Nisperuza Toledo, Juan Pablo Rubio Ospina, Daniel Santiago Umana, Hector Alejandro Alvarez

Abstract:

This work develops and implements computational codes to calculate the optimal launch trajectories for sending a probe from the Earth to different planets of the Solar System, making use of trajectories of the Hohmann and non-Hohmann type and gravitational assistance in intermediate steps. Specifically, the orbital elements, the graphs, and the dynamic simulations of the trajectories for sending a probe from the Earth towards the planets Mercury, Venus, Mars, Jupiter, and Saturn are obtained. A detailed study was made of the state vectors of the position and orbital velocity of the considered planets in order to determine the optimal trajectories of the probe. For this purpose, computer codes were developed and implemented to obtain the orbital elements of the Mariner 10 (Mercury), Magellan (Venus), Mars Global Surveyor (Mars) and Voyager 1 (Jupiter and Saturn) missions, as an exercise in corroborating the algorithms. This exercise validates the computational codes, allowing the orbital elements and the trajectory simulations of three future interplanetary missions with specific launch windows to be obtained.
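As a rough illustration of the Hohmann-type transfer calculation described above, the sketch below (Python; not the authors' code, using standard reference values for the Sun's gravitational parameter and the mean orbital radii of Earth and Mars) computes the two impulsive velocity changes and the time of flight for an Earth-to-Mars transfer.

```python
import math

# Minimal Hohmann transfer sketch (illustrative values, not the paper's code).
MU_SUN = 1.32712e11      # Sun's gravitational parameter, km^3/s^2
R_EARTH = 1.496e8        # heliocentric radius of Earth's orbit, km (assumed circular)
R_MARS = 2.279e8         # heliocentric radius of Mars' orbit, km (assumed circular)

def hohmann(mu, r1, r2):
    """Return (dv_departure, dv_arrival, transfer_time_days) for a Hohmann transfer."""
    a_t = 0.5 * (r1 + r2)                              # semi-major axis of the transfer ellipse
    v1 = math.sqrt(mu / r1)                            # circular speed at r1
    v2 = math.sqrt(mu / r2)                            # circular speed at r2
    v_peri = math.sqrt(mu * (2.0 / r1 - 1.0 / a_t))    # transfer speed at periapsis
    v_apo = math.sqrt(mu * (2.0 / r2 - 1.0 / a_t))     # transfer speed at apoapsis
    tof = math.pi * math.sqrt(a_t**3 / mu)             # half the transfer-orbit period, s
    return v_peri - v1, v2 - v_apo, tof / 86400.0

dv1, dv2, days = hohmann(MU_SUN, R_EARTH, R_MARS)
print(f"dv1 = {dv1:.2f} km/s, dv2 = {dv2:.2f} km/s, time of flight = {days:.0f} days")
```

The same routine applies to the other target planets by substituting their heliocentric orbital radii; gravity assists and non-Hohmann legs require a patched-conic or Lambert solver instead.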

Keywords: gravitational assistance, Hohmann’s trajectories, interplanetary mission, orbital elements

Procedia PDF Downloads 161
2295 Documenting the 15th Century Prints with RTI

Authors: Peter Fornaro, Lothar Schmitt

Abstract:

The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.
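For readers unfamiliar with RTI, the sketch below shows the per-pixel biquadratic (polynomial texture map) fit that underlies the standard method; it is a generic illustration in Python, not the project's enhanced reflectance model, and the input names and shapes are assumptions.

```python
import numpy as np

# Sketch of the biquadratic (PTM) fit behind standard RTI, fitted per pixel.
# light_dirs: (N, 2) projected light directions (lu, lv) for N photographs.
# intensities: (N, H, W) grayscale stack of the same object under those lights.
def fit_ptm(light_dirs, intensities):
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    # 6-term model: a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5
    A = np.stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)], axis=1)   # (N, 6)
    N, H, W = intensities.shape
    Y = intensities.reshape(N, H * W)                     # one column per pixel
    coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)        # least-squares fit, (6, H*W)
    return coeffs.reshape(6, H, W)

def relight(coeffs, lu, lv):
    """Re-render the surface under a new light direction from the fitted coefficients."""
    basis = np.array([lu**2, lv**2, lu * lv, lu, lv, 1.0])
    return np.tensordot(basis, coeffs, axes=1)            # (H, W) synthetic image
```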

Keywords: art history, computational photography, paste prints, reflectance transformation imaging

Procedia PDF Downloads 262
2294 Generation and Migration of Carbon Dioxide in the Lower Cretaceous Bahi Sandstone Reservoir within the En-Naga Sub-Basin, Sirte Basin, Libya

Authors: Moaawia Abdulgader Gdara

Abstract:

The En-Naga Sub-basin is considered to be the most southern of the concessions in the Sirte Basin operated by HOO. The En-Naga Sub-basin has likely been point-sourced with CO₂ accumulations during the last 7 million years from local satellite intrusives associated with the Haruj Al Aswad igneous complex. CO₂ occurs in the En-Naga Sub-basin as a result of the igneous activity of the Al Harouge Al Aswad complex; igneous extrusives that have been pierced in the subsurface are exposed at the surface. The Lower Cretaceous Bahi Sandstone facies are recognized in the En-Naga Sub-basin. They result from the influence of paleotopography on the processes associated with continental deposition over the Sirte Unconformity and the Cenomanian marine transgression. The presence of trapped carbon dioxide in the Lower Cretaceous Bahi Sandstones is proven within the En-Naga Sub-basin, which makes it unique in providing an abundance of CO₂ gas reservoirs with almost pure magmatic CO₂ that can be easily sampled. Huge amounts of CO₂ exist in the Lower Cretaceous Bahi Sandstones in the En-Naga Sub-basin, where the economic value of CO₂ is related to its use for enhanced oil recovery (EOR). Based on the production tests for the drilled wells, the Lower Cretaceous Bahi sandstones are the principal reservoir rocks for CO₂: large volumes of CO₂ gas have been discovered in the Bahi Formation on and near EPSA 120/136 (En-Naga Sub-basin). The Bahi sandstones are generally described as a good reservoir rock; intergranular porosities and permeabilities are highly variable and can exceed 25% and 100 mD. In the En-Naga Sub-basin, three main developed structures (Barrut I, En Naga A and En Naga O) are thought to be prospective for the Lower Cretaceous Bahi sandstone reservoir. These structures represent a good example of the deep overpressure potential in the En-Naga Sub-basin. The very high pressures assumed to be associated with local igneous intrusives may account for the abnormally high Bahi (and Lidam) reservoir pressures. The best gas tests from this facies are at F1-72 on the Barrut I structure, from part of a 458 feet+ section with an estimated CO₂ content as high as 98%, overpressured. Bahi CO₂ prospectivity is thought to be excellent in the central to western areas, where at U1-72 (En Naga O structure) a significant CO₂ gas kick occurred at 11,971 feet and quickly led to blowout conditions due to uncontrollable leaks in the surface equipment, reflecting better reservoir quality sandstones associated with paleostructural highs.
Condensate and gas prospectivity increases to the east, as CO₂ prospectivity decreases with distance away from the Al Haruj Al Aswad igneous complex. To date, it has not been possible to accurately determine the volume of these strategically valuable reserves, although there are positive indications that they are very large.

Keywords: En-Naga Sub-basin, Al Harouge Al Aswad igneous complex, Lower Cretaceous Bahi reservoir, CO₂ generation and migration to the Bahi sandstone reservoir

Procedia PDF Downloads 47
2293 An Integrative Computational Pipeline for Detection of Tumor Epitopes in Cancer Patients

Authors: Tanushree Jaitly, Shailendra Gupta, Leila Taher, Gerold Schuler, Julio Vera

Abstract:

Genomics-based personalized medicine is a promising approach to fight aggressive tumors based on a patient's specific tumor mutation and expression profiles. A remarkable case is dendritic cell-based immunotherapy, in which tumor epitopes targeting a patient's specific mutations are used to design a vaccine that helps stimulate cytotoxic T cell mediated anticancer immunity. Here we present a computational pipeline for epitope-based personalized cancer vaccines using patient-specific haplotype and cancer mutation profiles. In the proposed workflow, we analyze Whole Exome Sequencing and RNA Sequencing patient data to detect patient-specific mutations and their expression levels. Epitopes containing the tumor mutations are computationally predicted using the patient's haplotype and filtered based on their expression level, binding affinity, and immunogenicity. We calculate the binding energy for each filtered major histocompatibility complex (MHC)-peptide complex using docking studies and use this feature to further select good epitope candidates.
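A minimal sketch of the filtering and ranking step described above is given below (Python); the field names and the expression, binding-affinity and immunogenicity cutoffs are illustrative assumptions, not the authors' values.

```python
# Keep mutated peptides that are expressed, bind MHC strongly and score as immunogenic,
# then rank the survivors by docking binding energy (lower = tighter MHC-peptide complex).
def filter_epitopes(candidates,
                    min_tpm=1.0,           # minimum RNA-seq expression (TPM), assumed cutoff
                    max_ic50_nm=500.0,     # MHC binding affinity cutoff (nM), assumed
                    min_immunogenicity=0.0):
    kept = [c for c in candidates
            if c["tpm"] >= min_tpm
            and c["ic50_nm"] <= max_ic50_nm
            and c["immunogenicity"] >= min_immunogenicity]
    return sorted(kept, key=lambda c: c["docking_energy_kcal"])

# Hypothetical candidates for illustration only.
candidates = [
    {"peptide": "SIINFEKLM", "tpm": 12.0, "ic50_nm": 35.0,  "immunogenicity": 0.4, "docking_energy_kcal": -9.1},
    {"peptide": "ALNDQVFYK", "tpm": 0.2,  "ic50_nm": 120.0, "immunogenicity": 0.2, "docking_energy_kcal": -7.5},
]
print(filter_epitopes(candidates))   # only the expressed, strong-binding peptide survives
```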

Keywords: cancer immunotherapy, epitope prediction, NGS data, personalized medicine

Procedia PDF Downloads 237
2292 A Study on Thermal and Flow Characteristics by Solar Radiation for Single-Span Greenhouse by Computational Fluid Dynamics Simulation

Authors: Jonghyuk Yoon, Hyoungwoon Song

Abstract:

Recently, there has been a lot of interest in smart farming, which applies modern Information and Communication Technologies (ICT) to agriculture and provides a methodology to optimize production efficiency by managing the growing conditions of crops automatically. In order to obtain high performance and stability for a smart greenhouse, it is important to identify the effect of various working parameters such as the capacity of the ventilation fan, the vent opening area, etc. In the present study, a 3-dimensional CFD (Computational Fluid Dynamics) simulation for a single-span greenhouse was conducted using the commercial program Ansys CFX 18.0. The numerical simulation for the single-span greenhouse was implemented to figure out the internal thermal and flow characteristics. In order to numerically model solar radiation that spreads over a wide range of wavelengths, a multiband model that discretizes the spectrum into finite bands of wavelength based on Wien's law is applied to the simulation. In addition, the absorption coefficient of the vinyl, which varies with the wavelength bands, is also applied based on the Beer-Lambert law. To validate the numerical method applied herein, the numerical results of the temperature at specific monitoring points were compared with the experimental data. The average error rates (12.2~14.2%) between them were obtained, and the numerical results of the temperature distribution are in good agreement with the experimental data. The results of the present study can provide useful information for the design of various greenhouses. This work was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry and Fisheries (IPET) through the Advanced Production Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (315093-03).
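The two spectral ingredients mentioned above can be illustrated with a short sketch (Python; the band irradiances and absorption coefficients are assumed values, not the paper's):

```python
import math

# (1) Wien's law locates the peak wavelength of solar emission, which motivates splitting
#     the spectrum into finite wavelength bands; (2) Beer-Lambert attenuation is applied
#     per band with a band-specific absorption coefficient for the vinyl cover.
WIEN_B = 2.898e-3                      # Wien's displacement constant, m*K
T_SUN = 5778.0                         # effective solar temperature, K
print(f"solar peak wavelength ~ {WIEN_B / T_SUN * 1e9:.0f} nm")

def transmitted_irradiance(bands, thickness_m):
    """bands: list of (irradiance W/m^2, absorption coefficient 1/m) per wavelength band."""
    return sum(I0 * math.exp(-k * thickness_m) for I0, k in bands)

# Assumed example: a visible band absorbed weakly by the vinyl, a near-infrared band more strongly.
bands = [(500.0, 5.0), (300.0, 40.0)]
print(f"transmitted through 0.1 mm vinyl: {transmitted_irradiance(bands, 1e-4):.1f} W/m^2")
```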

Keywords: single-span greenhouse, CFD (computational fluid dynamics), solar radiation, multiband model, absorption coefficient

Procedia PDF Downloads 117
2291 Organ Dose Calculator for Fetus Undergoing Computed Tomography

Authors: Choonsik Lee, Les Folio

Abstract:

Pregnant patients may undergo CT in emergencies unrelated to pregnancy, and the potential risk to the developing fetus is of concern. It is critical to accurately estimate fetal organ doses in CT scans. We developed a fetal organ dose calculation tool using pregnancy-specific computational phantoms combined with Monte Carlo radiation transport techniques. We adopted a series of pregnancy computational phantoms developed at the University of Florida at the gestational ages of 8, 10, 15, 20, 25, 30, 35, and 38 weeks (Maynard et al. 2011). More than 30 organs and tissues and 20 skeletal sites are defined in each fetus model. We calculated fetal organ doses normalized by CTDIvol to derive organ dose conversion coefficients (mGy/mGy) for the eight fetuses for sequential slice locations ranging from the top to the bottom of the pregnancy phantoms with a 1 cm slice thickness. Organ dose from helical scans was approximated by the summation of doses from the multiple axial slices included in the given scan range of interest. We then compared dose conversion coefficients for major fetal organs in the abdominal-pelvis CT scan of the pregnancy phantoms with the uterine dose of a non-pregnant adult female computational phantom. A comprehensive library of organ conversion coefficients was established for the eight developing fetuses undergoing CT. They were implemented into an in-house graphical user interface-based computer program for convenient estimation of fetal organ doses by inputting CT technical parameters as well as the age of the fetus. We found that the esophagus received the least dose, whereas the kidneys received the greatest dose in all fetuses in AP scans of the pregnancy phantoms. We also found that when the uterine dose of a non-pregnant adult female phantom is used as a surrogate for fetal organ doses, the root-mean-square error ranged from 0.08 mGy (8 weeks) to 0.38 mGy (38 weeks). The uterine dose was up to 1.7-fold greater than the esophagus dose of the 38-week fetus model. The calculation tool should be useful in cases requiring fetal organ dose in emergency CT scans as well as for patient dose monitoring.
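The dose bookkeeping described above can be sketched as follows (Python; the per-slice conversion coefficients and the scan range are placeholders, not the published library values):

```python
import math

# A helical scan's organ dose is approximated by summing the CTDIvol-normalized
# coefficients of the 1 cm axial slices covered by the scan range.
def organ_dose(ctdi_vol_mgy, slice_coeffs, start_cm, stop_cm):
    covered = slice_coeffs[start_cm:stop_cm]          # conversion coefficients (mGy/mGy) per slice
    return ctdi_vol_mgy * sum(covered)

def rmse(surrogate, organ_doses):
    """Root-mean-square error of using a single uterine-dose surrogate for all fetal organs."""
    return math.sqrt(sum((surrogate - d) ** 2 for d in organ_doses) / len(organ_doses))

coeffs = [0.02] * 40                                  # assumed flat 0.02 mGy/mGy per 1 cm slice
print(organ_dose(ctdi_vol_mgy=10.0, slice_coeffs=coeffs, start_cm=5, stop_cm=30))  # 5.0 mGy
print(rmse(1.2, [1.0, 1.3, 0.9, 1.5]))               # spread of organ doses around the surrogate
```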

Keywords: computed tomography, fetal dose, pregnant women, radiation dose

Procedia PDF Downloads 118
2290 On the Impracticality of Kierkegaard's Community of Authentic Individuals

Authors: Andrew Ka Pok Tam

Abstract:

Kierkegaard was long misinterpreted as an anti-social philosopher, until recent years, when more discussions of his concept of community in the Journals and Papers were inspired by Karl Bayer. A community, which is based upon an individual's relations to others, is different from the crowd or the public, where the numerical or the majority makes decisions. As a result, authenticity is only possible in the community. But Kierkegaard did not explain how we can preserve the individual's authenticity by establishing a community instead of a public in reality. Kierkegaard was against the democratic reform in 1848 Denmark because he thought all elections mean the majority wins and the authenticity of the single individual would be suppressed. However, Kierkegaard himself did not suggest an alternative political system that might preserve the authenticity of the individual. This paper aims to evaluate the possibility of establishing a Kierkegaardian community in practice so as to preserve every individual's authenticity. This paper argues that the practicality of a Kierkegaardian community is limited: in order to have effective communications and relations among individuals, a Kierkegaardian community must be small and inefficient, as every individual must remain authentic in every political decision for the whole community.

Keywords: authenticity, community, individual, kierkegaard

Procedia PDF Downloads 332
2289 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
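One way to realize such SLA-aware prediction is quantile regression, as sketched below (Python with scikit-learn; the synthetic demand trace and the 95% quantile target are illustrative assumptions, not the authors' data or chosen model):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Predicting a high quantile of next-interval demand lets the operator power down the
# headroom above it while still meeting the SLA with the chosen probability.
rng = np.random.default_rng(0)
hours = rng.integers(0, 24, size=2000)
demand = 40 + 30 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 5, size=2000)  # % CPU, synthetic

X = hours.reshape(-1, 1)
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, demand)

next_hour = np.array([[18]])
provision = q95.predict(next_hour)[0]          # capacity to keep powered on
print(f"provision {provision:.1f}% CPU instead of 100% over-provisioning")
```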

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 85
2288 CFD Analysis of Multi-Phase Reacting Transport Phenomena in Discharge Process of Non-Aqueous Lithium-Air Battery

Authors: Jinliang Yuan, Jong-Sung Yu, Bengt Sundén

Abstract:

A computational fluid dynamics (CFD) model is developed for rechargeable non-aqueous electrolyte lithium-air batteries with a partial opening for oxygen supply to the cathode. The multi-phase transport phenomena occurring in the battery are considered, including dissolved lithium ions and oxygen gas in the liquid electrolyte, solid-phase electron transfer in the porous functional materials, and liquid-phase charge transport in the electrolyte. These transport processes are coupled with the electrochemical reactions at the active surfaces, and the effects of the solid Li2O2 generated by the discharge reaction on the transport properties and the electrochemical reaction rate are evaluated and implemented in the model. The predicted results are discussed and analyzed in terms of the spatial and transient distributions of various parameters, such as the local oxygen concentration, the reaction rate, the variable solid Li2O2 volume fraction and porosity, as well as the effective diffusion coefficients. It is found that the effect of the solid Li2O2 product deposited at the solid active surfaces on the transport phenomena and the overall battery performance is significant.
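A common way to couple the deposited Li2O2 to the transport properties is a porosity-dependent (Bruggeman-type) effective diffusivity; the sketch below illustrates that idea with assumed numbers and is not necessarily the correlation used in the paper's model:

```python
import math

EPS_0 = 0.75            # initial cathode porosity (assumed)
D_O2 = 7.0e-10          # O2 diffusivity in the electrolyte, m^2/s (assumed)

def porosity(v_li2o2):
    """Porosity remaining after a volume fraction v_li2o2 of solid Li2O2 fills the pores."""
    return max(EPS_0 - v_li2o2, 0.0)

def effective_diffusivity(eps, brugg=1.5):
    """Bruggeman-type correction: D_eff = eps^brugg * D."""
    return (eps ** brugg) * D_O2

for v in (0.0, 0.2, 0.4):
    eps = porosity(v)
    print(f"Li2O2 fraction {v:.1f}: porosity {eps:.2f}, D_eff {effective_diffusivity(eps):.2e} m^2/s")
```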

Keywords: Computational Fluid Dynamics (CFD), modeling, multi-phase, transport phenomena, lithium-air battery

Procedia PDF Downloads 424
2287 Fundamentals of Theorizing Power in International Relations

Authors: Djehich Mohamed Yousri

Abstract:

Political science is a field in which there is much controversy in terms of the multiplicity of schools, trends, and goals. This overlap and complexity in the interpretation of the political phenomenon in political science has been linked to other disciplines associated with it, above all the science of international relations and the huge body of theories that found a wide reach and a decisive position after the national tide in the history of Western political thought, especially after the Westphalia Conference of 1648, as a result of which the new foundations of international politics were approved, the most important of which is respect for state sovereignty. Historical events continued and coincided with the scientific, intellectual, and economic developments that followed the emergence of the industrial revolution, and then with the technological revolutions in all their forms, which led to the rooting and establishment of a comprehensive political system that is more complex and intertwined than it was in the past during the First and Second World Wars. The international situation has become dependent on the digital revolution and its aspirations in the comprehensive transformation witnessed by international political relations after the Cold War.

Keywords: theorizing, international relations, approaches to international relations, political science, the political system

Procedia PDF Downloads 81
2286 Pastoral Care and Counseling and Psychology as Sciences of Human Caring: Exploring the Interconnectedness of the Two Disciplines

Authors: Baloyi Gift Tlharihani

Abstract:

This paper explores the relationship between pastoral care and counselling and psychology. It critically reviews the variety of views and debates regarding this relationship; while acknowledging the different sides of the debates on the sameness and difference of these notions, this paper argues for the inevitable interconnectedness of the two. There has always been a close relationship between pastoral care and counselling and psychology, although these are two quite different notions. Even though pastoral care and counselling are thought of as more spiritually focused and psychology as concerned with emotional and mental challenges, the component that connects these two sciences is the care of the human being. Therefore, this paper is interested in the interconnectedness of these two sciences, as they both make a vital contribution to human caring. It indicates that whether we take the dualistic difference between the body and soul or the trichotomous difference between the body, soul and spirit, our essential nature is found in the unity of those constituent elements.

Keywords: anthropology, human care, pastoral care and counseling, psychology

Procedia PDF Downloads 267
2285 Aerodynamic Design of a Light Long Range Blended Wing Body Unmanned Vehicle

Authors: Halison da Silva Pereira, Ciro Sobrinho Campolina Martins, Vitor Mainenti Leal Lopes

Abstract:

Long range performance is a goal for aircraft configuration optimization. The Blended Wing Body (BWB) is presented in much of the literature as the most aerodynamically efficient design for a fixed-wing aircraft. Because of its high weight-to-thrust ratio, the BWB is the ideal configuration for many Unmanned Aerial Vehicle (UAV) missions in geomatics applications. In this work, a BWB aerodynamic design for a typical light geomatics payload is presented. Aerodynamic non-dimensional coefficients are predicted using low Reynolds number computational techniques (a 3D panel method), and wing parameters such as aspect ratio, taper ratio, wing twist and sweep are optimized for high cruise performance and flight quality. The methodology of this work is a summary of tailless aircraft wing design and its application, with appropriate computational schemes, to a light UAV subjected to low Reynolds number flows. It leads to conclusions such as the higher performance and flight quality of thicker airfoils in the airframe body and the benefits of using aerodynamic twist rather than purely geometric twist.

Keywords: blended wing body, low Reynolds number, panel method, UAV

Procedia PDF Downloads 564
2284 Computational Aerodynamic Shape Optimisation Using a Concept of Control Nodes and Modified Cuckoo Search

Authors: D. S. Naumann, B. J. Evans, O. Hassan

Abstract:

This paper outlines the development of an automated aerodynamic optimisation algorithm using a novel method of parameterising a computational mesh by employing user-defined control nodes. The shape boundary movement is coupled to the movement of the novel control nodes via a quasi-1D linear deformation. Additionally, a second-order smoothing step has been integrated to act on the boundary during the mesh movement, based on the change in its second derivative. This allows for both linear and non-linear shape transformations, dependent on the preference of the user. The domain mesh movement is then coupled to the shape boundary movement via a Delaunay graph mapping. A Modified Cuckoo Search (MCS) algorithm is used for optimisation within the prescribed design space defined by the allowed range of control node displacement. A finite volume compressible Navier-Stokes solver is used for aerodynamic modelling to predict aerodynamic design fitness. The resulting coupled algorithm is applied to a range of test cases in two dimensions, including the design of a subsonic, transonic and supersonic intake, and the optimisation approach is compared with more conventional optimisation strategies. Ultimately, the algorithm is tested on a three-dimensional wing optimisation case.
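For orientation, a bare-bones cuckoo search with Levy flights is sketched below (Python); the Modified Cuckoo Search additionally adapts the step size and exchanges information between the best nests, and the quadratic objective here stands in for the CFD-based fitness:

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5, rng=None):
    """Levy-flight step via Mantegna's algorithm."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(fitness, bounds, n_nests=15, iters=200, pa=0.25, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    nests = rng.uniform(lo, hi, (n_nests, dim))           # candidate control-node displacements
    scores = np.apply_along_axis(fitness, 1, nests)
    for _ in range(iters):
        best = nests[scores.argmin()]
        for i in range(n_nests):
            trial = np.clip(nests[i] + 0.01 * levy_step(dim, rng=rng) * (nests[i] - best), lo, hi)
            s = fitness(trial)
            if s < scores[i]:
                nests[i], scores[i] = trial, s
        # abandon a fraction pa of the worst nests and rebuild them randomly
        worst = scores.argsort()[-int(pa * n_nests):]
        nests[worst] = rng.uniform(lo, hi, (len(worst), dim))
        scores[worst] = np.apply_along_axis(fitness, 1, nests[worst])
    return nests[scores.argmin()], scores.min()

# Toy objective standing in for the drag/pressure-recovery fitness from the flow solver.
best_x, best_f = cuckoo_search(lambda x: np.sum(x ** 2), (np.full(4, -1.0), np.full(4, 1.0)))
print(best_x, best_f)
```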

Keywords: mesh movement, aerodynamic shape optimization, cuckoo search, shape parameterisation

Procedia PDF Downloads 313
2283 Parameters of Minimalistic Mosque in India within Minimalism

Authors: Hafila Banu

Abstract:

Minimalism is a postmodern style movement which emerged in the 1950s but grew rapidly during the 1960s and 1970s. Minimalism is defined as the concept of minimizing distractions from what is truly valuable or essential. On the same grounds, works of minimalism offer a direct view of the subject or object and raise questions about its true nature, inviting the viewer to consider it for its real shape, a thought, a movement, reminding us to focus on what is really important. The architecture of minimalism is characterized by an economy of materials, focusing on building quality with consideration for 'essences' such as light, form, detail of material, texture, space and scale, place and human conditions. The research in this paper mainly concerns the basis of designing a minimalistic mosque in India, while analysing the parameters for the design from the matching characteristics of Islamic architecture, specifically of a mosque, and of minimalism. Therefore, the paper is about mosque architecture and minimalism and their underlying principles, matching characteristics and design goals.

Keywords: Islamic architecture, minimalism, minimalistic mosque, mosque in India

Procedia PDF Downloads 181
2282 Information Theoretic Approach for Beamforming in Wireless Communications

Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif

Abstract:

Beamforming is a signal processing technique extensively utilized in wireless communications and radars for desired signal intensification and interference signal minimization through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function, which is based on a-priori information of the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI is presented as an index to identify the trade-off between information gain, SINR, illumination time and spatial selectivity in an energy-constrained optimization problem. The employed method yields lower computational complexity, which is demonstrated through comparative analysis with conventional methods in vogue. MI-based beamforming offers enhancement of signal integrity in a degraded environment while reducing computational intricacy and correlating key performance indicators.
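To make the weight-vector and SINR quantities concrete, the sketch below evaluates a conventional MVDR beamformer for a small uniform linear array (Python); it is a generic baseline for illustration only, not the mutual-information criterion proposed in the paper, and the array size and interference scenario are assumptions:

```python
import numpy as np

def steering(n, theta_deg, d=0.5):
    """Steering vector of an n-element ULA with element spacing d (in wavelengths)."""
    k = np.arange(n)
    return np.exp(-2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

def sinr_db(w, a_sig, R_in, p_sig=1.0):
    """Output SINR of weight vector w for a signal with steering vector a_sig."""
    num = p_sig * np.abs(w.conj() @ a_sig) ** 2
    den = np.real(w.conj() @ R_in @ w)
    return 10 * np.log10(num / den)

n = 8
a_s = steering(n, 0.0)                                  # desired signal at broadside
a_i = steering(n, 30.0)                                 # interferer at 30 degrees
R_in = 10.0 * np.outer(a_i, a_i.conj()) + np.eye(n)     # interference-plus-noise covariance

w_mvdr = np.linalg.solve(R_in, a_s)
w_mvdr /= (a_s.conj() @ w_mvdr)                         # distortionless response toward the signal
print(f"output SINR: {sinr_db(w_mvdr, a_s, R_in):.1f} dB")
```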

Keywords: beamforming, interference, mutual information, wireless communications

Procedia PDF Downloads 258
2281 Embodied Neoliberalism and the Mind as Tool to Manage the Body: A Descriptive Study Applied to Young Australian Amateur Athletes

Authors: Alicia Ettlin

Abstract:

Amid the rise of neoliberalism to the leading economic policy model in Western societies in the 1980s, people have started to internalise a neoliberal way of thinking, whereby the human body has become an entity that can and needs to be precisely managed through free yet rational decision-making processes. The neoliberal citizen has consequently become an entrepreneur of the self who is free, independent, rational, productive and responsible for themselves, their health and wellbeing as well as their appearance. The focus on individuals as entrepreneurs who manage their bodies through the rationally thinking mind has, however, become increasingly criticised for viewing the social actor as ‘disembodied’, a detached social actor whose powerful mind governs the passive body. On the other hand, the discourse around embodiment seeks to connect rational decision-making processes to the dominant neoliberal discourse, which creates an embodied understanding that the body, just like other areas of people’s lives, can and should be shaped, monitored and managed through cognitive and rational thinking. This perspective offers an understanding of the body in terms of its connections with the social environment that reaches beyond the debates around mind-body binary thinking. Hence, following this argument, body management should be thought of neither as solely guided by embodied discourses nor as merely falling into a mind-body dualism, but rather, simultaneously and inseparably, as both at once. The descriptive, qualitative analysis of semi-structured in-depth interviews conducted with young Australian amateur athletes between the ages of 18 and 24 has shown that most participants are interested in measuring and managing their body to create self-knowledge and self-improvement. The participants thereby connected self-improvement to weight loss, muscle gain or simply staying fit and healthy. Self-knowledge refers to body measurements including weight, BMI or body fat percentage. Self-management and self-knowledge, which rely on one another for rational and well-thought-out decisions, are both characteristic values of the neoliberal doctrine. Many participants also connected a neoliberal way of thinking about and looking after the body to rewarding themselves for their discipline, hard work or achievement of specific body management goals (e.g., eating chocolate for reaching the daily step count goal). A few participants, however, have shown resistance against these neoliberal values, and in particular against the precise monitoring and management of the body with the help of self-tracking devices. Ultimately, however, it seems that most participants have internalised the dominant discourses around self-responsibility and, by association, a sense of duty to discipline their body in normative ways. Even those who indicated resistance against body work and body management practices that follow neoliberal thinking and measurement systems are aware of, and have internalised, the concept of the rational operating mind that needs to, or should, decide how to look after the body in terms of health as well as appearance ideals. The discussion around the collected data thereby shows that embodiment and the mind/body dualism constitute two connected, rather than separate or opposing, concepts.

Keywords: dualism, embodiment, mind, neoliberalism

Procedia PDF Downloads 141
2280 Computational Fluid Dynamics Simulation to Study the Effect of Ambient Temperature on the Ventilation in a Metro Tunnel

Authors: Yousef Almutairi, Yajue Wu

Abstract:

Various large-scale trends have characterized the current century thus far, including increasing shifts towards urbanization and greater movement. It is predicted that there will be 9.3 billion people on Earth in 2050 and that over two-thirds of this population will be city dwellers. Moreover, in larger cities worldwide, mass transportation systems, including underground systems, have grown to account for the majority of travel in those settings. Underground networks are vulnerable to fires, however, endangering travellers’ safety, with various examples of fire outbreaks in this setting. This study aims to increase knowledge of the impacts of extreme climatic conditions on fires, including the role of the high ambient temperatures experienced in Middle Eastern countries and specifically in Saudi Arabia. This is an element that is not always included when assessments of fire safety are made (considering visibility, temperatures, and flows of smoke). This paper focuses on a tunnel within Riyadh’s underground system as a case study and includes simulations based on computational fluid dynamics using ANSYS Fluent, which investigates the impact of various ventilation systems while identifying smoke density, speed, pressure and temperatures within this tunnel.

Keywords: fire, subway tunnel, CFD, mechanical ventilation, smoke, temperature, harsh weather

Procedia PDF Downloads 98
2279 Study on the Suppression of Hydrogen Generation by Aluminum-Containing Waste Incineration Ash and Water

Authors: Hideyuki Onodera, Ryoji Imai, Masahiro Sakai

Abstract:

Explosions have occurred in incineration plants in conveyors, ash pits, and other locations. The cause of such explosions is thought to be the reaction of metallic aluminum contained in the ash with water used to cool the ash and prevent scattering, resulting in the generation of hydrogen. Given this background, conveyors and other equipment have been damaged by explosions, which has hindered the stable operation of incineration plants. In addition, workers may be injured by equipment explosions, creating an unsafe situation. To remedy these problems, it is necessary to devise a way to prevent hydrogen explosions from occurring. To overcome this problem, we conducted a hydrogen generation reaction experiment using simulated incinerator ash powder containing aluminum, calcium oxide, and water and confirmed that conditions exist to stop the hydrogen generation reaction. The results of this research may contribute to the suppression of hydrogen explosions at incineration plants.

Keywords: waste incinerated ash, aluminum, water, hydrogen, suppression of hydrogen generation, incineration plant

Procedia PDF Downloads 11
2278 Computational Team Dynamics in Student New Product Development Teams

Authors: Shankaran Sitarama

Abstract:

Teamwork is an extremely effective pedagogical tool in engineering education. New Product Development (NPD) has been an effective strategy for companies to streamline and bring innovative products and solutions to customers. Thus, the engineering curricula of many schools, some collaboratively with business schools, have brought NPD into the curriculum at the graduate level. Teamwork is invariably used during instruction, where students work in teams to come up with new products and solutions. There is a significant emphasis of grade on the semester-long teamwork for it to be taken seriously by students. As the students work in teams and go through this process to develop new product prototypes, their effectiveness and learning to a great extent depend on how they function as a team, go through the creative process, come together, and work towards the common goal. A core attribute of a successful NPD team is its creativity and innovation. The team needs to be creative as a group, generating a breadth of ideas and innovative solutions that solve or address the problem they are targeting and meet the user’s needs. They also need to be very efficient in their teamwork as they work through the various stages of the development of these ideas, resulting in a POC (proof-of-concept) implementation or a prototype of the product. The simultaneous requirement for teams to be creative and at the same time converge and work together imposes different types of tensions on their team interactions. These ideational tensions/conflicts, and sometimes relational tensions/conflicts, are inevitable. Effective teams have to deal with the team dynamics and manage them so as to be resilient enough and yet be creative. This research paper provides a computational analysis of the teams’ communication that is reflective of the team dynamics and, through a superimposition of latent semantic analysis with social network analysis, provides a computational methodology for arriving at patterns of interaction that can be visualized. These team interaction patterns have clear correlations to the team dynamics and provide insights into the functioning, and thus the effectiveness, of the teams. 23 student NPD teams over 2 years of a course on Managing NPD, which has a blend of engineering and business school students, are considered, and the results are presented. The analysis is also correlated with the teams’ detailed and tailored individual and group feedback and self-reflection and evaluation questionnaires.
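The superimposition of latent semantic analysis and social network analysis can be sketched as below (Python with scikit-learn and NetworkX); the toy messages and the similarity threshold are illustrative assumptions, not the course data or the authors' parameters:

```python
import itertools
import networkx as nx
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# LSA of each member's messages, then a graph whose edges are semantic similarity between members.
messages = {
    "alice": "we should prototype the sensor housing before the review",
    "bob":   "prototype the housing and test the sensor mount this week",
    "carol": "marketing deck needs the pricing slide and customer quotes",
}

members = list(messages)
tfidf = TfidfVectorizer().fit_transform(messages.values())
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)   # latent semantic space
sims = cosine_similarity(lsa)

G = nx.Graph()
G.add_nodes_from(members)
for i, j in itertools.combinations(range(len(members)), 2):
    if sims[i, j] > 0.2:                      # keep only meaningfully similar pairs (assumed cutoff)
        G.add_edge(members[i], members[j], weight=float(sims[i, j]))

print(G.edges(data=True))                     # interaction pattern: who converges with whom
print(nx.degree_centrality(G))                # a simple team-dynamics indicator
```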

Keywords: team dynamics, social network analysis, team interaction patterns, new product development teamwork, NPD teams

Procedia PDF Downloads 92
2277 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

The evolutionary processes are not linear. Long periods of quiet and slow development turn to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the previously existing 3 phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolution vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates may summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality. Evolution is a coordinated and controlled process. One of evolution’s main development vectors is the growing computational complexity of living organisms and the biosphere’s intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as is stipulated by Darwinian Evolution Theories, and it is further stimulated by the growing demand for the Biosphere’s global memory storage and computational complexity. Greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerated evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume comes to its limit, and b) the biosphere’s computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. It logically resolves many puzzling problems with the current state of evolution theory, such as speciation (as a result of GM purposeful design), the evolution development vector (as a need for growing global intelligence), punctuated equilibrium (happening when the two conditions a) and b) above are met), the Cambrian explosion, and mass extinctions (happening when more intelligent species should replace outdated creatures).

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 143
2276 Analysis of Matching Pursuit Features of EEG Signal for Mental Tasks Classification

Authors: Zin Mar Lwin

Abstract:

Brain Computer Interface (BCI) systems have been developed for people who suffer from severe motor disabilities and find it challenging to communicate with their environment. BCI allows them to communicate in a non-muscular way. For communication between human and computer, BCI uses a type of signal called the Electroencephalogram (EEG) signal, which is recorded from the human brain by means of electrodes. The electroencephalogram (EEG) signal is an important information source for knowing brain processes in non-invasive BCI. To translate a human's thoughts, the acquired EEG signal needs to be classified accurately. This paper proposes a typical EEG signal classification system which experiments on the dataset from Purdue University. The Independent Component Analysis (ICA) method, via EEGLAB tools, is used for removing artifacts caused by eye blinks. For feature extraction, the time and frequency features of the non-stationary EEG signals are extracted by the Matching Pursuit (MP) algorithm. The classification of one of five mental tasks is performed by a multi-class Support Vector Machine (SVM). For the SVMs, comparisons have been carried out for both 1-against-1 and 1-against-all methods.
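A minimal matching pursuit decomposition, of the kind used here for feature extraction, can be sketched as follows (Python); the random dictionary stands in for the Gabor-type dictionary typically used on EEG, and the resulting atom/coefficient pairs would then be fed to the SVM:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=5):
    """Greedily decompose a 1-D signal over a unit-norm dictionary; return (atom, coef) features."""
    residual = signal.astype(float).copy()
    features = []
    for _ in range(n_atoms):
        corr = dictionary.T @ residual            # correlation of every atom with the residual
        k = int(np.argmax(np.abs(corr)))
        coef = corr[k]
        residual -= coef * dictionary[:, k]       # subtract the best-matching atom
        features.append((k, float(coef)))
    return features, residual

rng = np.random.default_rng(0)
D = rng.normal(size=(256, 512))
D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
x = 2.0 * D[:, 10] - 1.5 * D[:, 300] + 0.05 * rng.normal(size=256)   # synthetic "EEG" segment
feats, res = matching_pursuit(x, D, n_atoms=3)
print(feats)                                       # atoms 10 and 300 dominate
print(np.linalg.norm(res))                         # residual energy after 3 atoms
```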

Keywords: BCI, EEG, ICA, SVM

Procedia PDF Downloads 254
2275 Computational Homogenization of Thin Walled Structures: On the Influence of the Global vs Local Applied Plane Stress Condition

Authors: M. Beusink, E. W. C. Coenen

Abstract:

The increased application of novel structural materials, such as high grade asphalt, concrete and laminated composites, has sparked the need for a better understanding of the often complex, non-linear mechanical behavior of such materials. The effective macroscopic mechanical response is generally dependent on the applied load path. Moreover, it is also significantly influenced by the microstructure of the material, e.g. embedded fibers, voids and/or grain morphology. At present, multiscale techniques are widely adopted to assess micro-macro interactions in a numerically efficient way. Computational homogenization techniques have been successfully applied over a wide range of engineering cases, e.g. cases involving first order and second order continua, thin shells and cohesive zone models. Most of these homogenization methods rely on Representative Volume Elements (RVE), which model the relevant microstructural details in a confined volume. Imposed through kinematical constraints or boundary conditions, a RVE can be subjected to a microscopic load sequence. This provides the RVE's effective stress-strain response, which can serve as constitutive input for macroscale analyses. Simultaneously, such a study of a RVE gives insight into fine scale phenomena such as microstructural damage and its evolution. It has been reported by several authors that the type of boundary conditions applied to the RVE affect the resulting homogenized stress-strain response. As a consequence, dedicated boundary conditions have been proposed to appropriately deal with this concern. For the specific case of a planar assumption for the analyzed structure, e.g. plane strain, axisymmetric or plane stress, this assumption needs to be addressed consistently in all considered scales. Although in many multiscale studies a planar condition has been employed, the related impact on the multiscale solution has not been explicitly investigated. This work therefore focuses on the influence of the planar assumption for multiscale modeling. In particular the plane stress case is highlighted, by proposing three different implementation strategies which are compatible with a first-order computational homogenization framework. The first method consists of applying classical plane stress theory at the microscale, whereas with the second method a generalized plane stress condition is assumed at the RVE level. For the third method, the plane stress condition is applied at the macroscale by requiring that the resulting macroscopic out-of-plane forces are equal to zero. These strategies are assessed through a numerical study of a thin walled structure and the resulting effective macroscale stress-strain response is compared. It is shown that there is a clear influence of the length scale at which the planar condition is applied.

Keywords: first-order computational homogenization, planar analysis, multiscale, microstructures

Procedia PDF Downloads 209
2274 Radial Fuel Injection Computational Fluid Dynamics Model for a Compression Ignition Two-Stroke Opposed Piston Engine

Authors: Tytus Tulwin, Rafal Sochaczewski, Ksenia Siadkowska

Abstract:

Designing a new engine requires a large number of different cases to be considered, especially different injector parameters and combustion chamber geometries. This is essential when developing an engine with an unconventional build: compression ignition, two-stroke operation with direct side injection. Computational Fluid Dynamics modelling allows those different conditions to be tested and the best conditions with correct combustion to be sought. This research presents the combustion results for different injector and combustion chamber cases. The shape of the combustion chamber is different than for conventional engines, as it requires side injection. This completely changes the optimal shape for the given conditions compared to the standard automotive heart-shaped combustion chamber. Because the injection is not symmetrical, there is a strong influence of cylinder swirl and piston motion on the injected fuel stream. The results present the fuel injection phenomena, allowing the right injection parameters for maximum combustion efficiency and minimum piston heat loads to be predicted. Acknowledgement: This work has been realized in cooperation with The Construction Office of WSK "PZL-KALISZ" S.A. and is part of Grant Agreement No. POIR.01.02.00-00-0002/15 financed by the Polish National Centre for Research and Development.

Keywords: CFD, combustion, injection, opposed piston

Procedia PDF Downloads 249
2273 The Toxic Effects of Kynurenine Metabolites on SH-SY5Y Neuroblastoma Cells

Authors: Susan Hall, Gary D. Grant, Catherine McDermott, Devinder Arora

Abstract:

Introduction/Aim: The kynurenine pathway is thought to play an important role in the pathophysiology of numerous neurodegenerative diseases including depression, Alzheimer’s disease, and Parkinson’s disease. Numerous neuroactive compounds, including the neurotoxic 3-hydroxyanthranilic acid, 3-hydroxykynurenine and quinolinic acid and the neuroprotective kynurenic acid and picolinic acid, are produced through the metabolism of kynurenine and are thought to be the causative agents responsible for neurodegeneration. The toxicity of 3-hydroxykynurenine, 3-hydroxyanthranilic acid and quinolinic acid has been widely evaluated and demonstrated in primary cell cultures, but to date only 3-hydroxykynurenine and 3-hydroxyanthranilic acid have been shown to cause toxicity in immortal tumour cells. The aim of this study was to evaluate the toxicity of kynurenine metabolites, both individually and in combination, on SH-SY5Y neuroblastoma cells after 24 and 72 h exposure in order to explore a cost-effective model to study their neurotoxic effects and potential protective agents. Methods: SH-SY5Y neuroblastoma cells were exposed to various concentrations of the neuroactive kynurenine metabolites, both individually and in combination, for 24 and 72 h, and viability was subsequently evaluated using the Resazurin (Alamar blue) proliferation assay. Furthermore, the effects of these compounds, alone and in combination, on specific death pathways including apoptosis, necrosis and free radical production were evaluated using various assays. Results: Consistent with the literature, toxicity was shown with short-term 24-hour treatments at 1000 μM concentrations for both 3-hydroxykynurenine and 3-hydroxyanthranilic acid. Combinations of kynurenine metabolites showed modest toxicity towards SH-SY5Y neuroblastoma cells in a concentration-dependent manner. Specific cell death pathways, including apoptosis, necrosis and free radical production, were shown to be increased after both 24 and 72 h exposure of SH-SY5Y neuroblastoma cells to 3-hydroxykynurenine and 3-hydroxyanthranilic acid and various combinations of neurotoxic kynurenine metabolites. Conclusion: It is well documented that neurotoxic kynurenine metabolites show toxicity towards primary human neurons in the nanomolar to low micromolar concentration range. Our results show that the concentrations required to cause significant cell death are in the range of 1000 µM for 3-hydroxykynurenine and 3-hydroxyanthranilic acid, and the toxicity of quinolinic acid towards SH-SY5Y could not be demonstrated. This differs significantly from the toxicities observed in primary human neurons. Combinations of the neurotoxic metabolites were shown to have modest toxicity towards these cells, with increased toxicity and activation of cell death pathways observed after 72 h exposure. This study suggests that the 24 h model is unsuitable for use in neurotoxicity studies; however, the 72 h model better represents the observations of studies using primary human neurons and may provide a cost-effective model to assess possible protective agents against kynurenine metabolite toxicities.

Keywords: kynurenine metabolites, neurotoxicity, quinolinic acid, SH-SY5Y neuroblastoma

Procedia PDF Downloads 399
2272 Modeling the Human Harbor: An Equity Project in New York City, New York USA

Authors: Lauren B. Birney

Abstract:

The envisioned long-term outcome of this three-year research, and implementation plan is for 1) teachers and students to design and build their own computational models of real-world environmental-human health phenomena occurring within the context of the “Human Harbor” and 2) project researchers to evaluate the degree to which these integrated Computer Science (CS) education experiences in New York City (NYC) public school classrooms (PreK-12) impact students’ computational-technical skill development, job readiness, career motivations, and measurable abilities to understand, articulate, and solve the underlying phenomena at the center of their models. This effort builds on the partnership’s successes over the past eight years in developing a benchmark Model of restoration-based Science, Technology, Engineering, and Math (STEM) education for urban public schools and achieving relatively broad-based implementation in the nation’s largest public school system. The Billion Oyster Project Curriculum and Community Enterprise for Restoration Science (BOP-CCERS STEM + Computing) curriculum, teacher professional developments, and community engagement programs have reached more than 200 educators and 11,000 students at 124 schools, with 84 waterfront locations and Out of School of Time (OST) programs. The BOP-CCERS Partnership is poised to develop a more refined focus on integrating computer science across the STEM domains; teaching industry-aligned computational methods and tools; and explicitly preparing students from the city’s most under-resourced and underrepresented communities for upwardly mobile careers in NYC’s ever-expanding “digital economy,” in which jobs require computational thinking and an increasing percentage require discreet computer science technical skills. Project Objectives include the following: 1. Computational Thinking (CT) Integration: Integrate computational thinking core practices across existing middle/high school BOP-CCERS STEM curriculum as a means of scaffolding toward long term computer science and computational modeling outcomes. 2. Data Science and Data Analytics: Enabling Researchers to perform interviews with Teachers, students, community members, partners, stakeholders, and Science, Technology, Engineering, and Mathematics (STEM) industry Professionals. Collaborative analysis and data collection were also performed. As a centerpiece, the BOP-CCERS partnership will expand to include a dedicated computer science education partner. New York City Department of Education (NYCDOE), Computer Science for All (CS4ALL) NYC will serve as the dedicated Computer Science (CS) lead, advising the consortium on integration and curriculum development, working in tandem. The BOP-CCERS Model™ also validates that with appropriate application of technical infrastructure, intensive teacher professional developments, and curricular scaffolding, socially connected science learning can be mainstreamed in the nation’s largest urban public school system. This is evidenced and substantiated in the initial phases of BOP-CCERS™. The BOP-CCERS™ student curriculum and teacher professional development have been implemented in approximately 24% of NYC public middle schools, reaching more than 250 educators and 11,000 students directly. BOP-CCERS™ is a fully scalable and transferable educational model, adaptable to all American school districts. 
In all settings of the proposed Phase IV initiative, the primary beneficiary group will be underrepresented NYC public school students who live in high-poverty neighborhoods and are traditionally underrepresented in the STEM fields, including African Americans, Latinos, English language learners, and children from economically disadvantaged households. In particular, BOP-CCERS Phase IV will explicitly prepare underrepresented students for skilled positions within New York City’s expanding digital economy, computer science, computational information systems, and innovative technology sectors.

Keywords: computer science, data science, equity, diversity and inclusion, STEM education

Procedia PDF Downloads 39
2271 An Improved Data Aided Channel Estimation Technique Using Genetic Algorithm for Massive Multiple-Input Multiple-Output

Authors: M. Kislu Noman, Syed Mohammed Shamsul Islam, Shahriar Hassan, Raihana Pervin

Abstract:

With the increasing number of wireless devices and high-bandwidth operations, wireless networking and communications are becoming overcrowded. To cope with such a crowded and messy situation, massive MIMO is designed to work with hundreds of low-cost serving antennas at a time while improving spectral efficiency at the same time. TDD has been used for gaining beamforming, which is a major part of massive MIMO, to obtain its best improvement for transmitting and receiving pilot sequences. All these benefits are only possible if the channel state information, or channel estimation, is obtained properly. The common methods to estimate the channel matrix used so far are LS, MMSE, and a linear version of MMSE also proposed in many research works. We have optimized these methods using a genetic algorithm to minimize the mean squared error and find the best channel matrix from the existing algorithms with less computational complexity. Our simulation results have shown that the use of the GA works well on existing algorithms in a Rayleigh slow fading channel in the presence of Additive White Gaussian Noise. We found that the GA-optimized LS is better than existing algorithms, as the GA provides an optimal result in a few iterations in terms of MSE with respect to SNR and computational complexity.
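The mechanics of wrapping a genetic algorithm around a channel-matrix fitness can be sketched as below (Python); the dimensions, GA settings and pilot-residual fitness are illustrative assumptions, and the paper's own fitness criterion and reported gains are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n_rx, n_tx, n_pilot = 4, 2, 16    # toy dimensions; massive MIMO uses hundreds of antennas

# Simulated Rayleigh channel, pilot block and noisy observation.
H_true = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)
X = (rng.normal(size=(n_tx, n_pilot)) + 1j * rng.normal(size=(n_tx, n_pilot))) / np.sqrt(2)
Y = H_true @ X + 0.1 * (rng.normal(size=(n_rx, n_pilot)) + 1j * rng.normal(size=(n_rx, n_pilot)))

H_ls = Y @ X.conj().T @ np.linalg.inv(X @ X.conj().T)     # least-squares starting point

def fitness(H):
    """Pilot residual used as fitness here; other criteria (e.g. MMSE-based) can be swapped in."""
    return np.mean(np.abs(Y - H @ X) ** 2)

# GA: population seeded around the LS estimate, truncation selection,
# arithmetic crossover and Gaussian mutation.
pop = [H_ls + 0.1 * (rng.normal(size=H_ls.shape) + 1j * rng.normal(size=H_ls.shape))
       for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness)
    parents = pop[:10]                                     # keep the fittest candidates
    children = []
    while len(children) < 30:
        a, b = rng.choice(10, size=2, replace=False)
        child = 0.5 * (parents[a] + parents[b])            # crossover
        child += 0.02 * (rng.normal(size=child.shape) + 1j * rng.normal(size=child.shape))
        children.append(child)
    pop = parents + children

H_ga = min(pop, key=fitness)
print(f"fitness: LS {fitness(H_ls):.4f}, GA {fitness(H_ga):.4f}, "
      f"estimation error vs true channel {np.mean(np.abs(H_ga - H_true) ** 2):.4f}")
```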

Keywords: channel estimation, LMMSE, LS, MIMO, MMSE

Procedia PDF Downloads 170
2270 Injunctions, Disjunctions, Remnants: The Reverse of Unity

Authors: Igor Guatelli

Abstract:

The universe of aesthetic perception entails impasses about sensitive divergences that each text or visual object may be subjected to. If approached through intertextuality that is not based on the misleading notion of kinships or similarities a priori admissible, the possibility of anachronistic, heterogeneous - and non-diachronic - assemblies can enhance the emergence of interval movements, intermediate, and conflicting, conducive to a method of reading, interpreting, and assigning meaning that escapes the rigid antinomies of the mere being and non-being of things. In negative, they operate in a relationship built by the lack of an adjusted meaning set by their positive existences, with no remainders; the generated interval becomes the remnant of each of them; it is the opening that obscures the stable positions of each one. Without the negative of absence, of that which is always missing or must be missing in a text, concept, or image made positive by history, nothing is perceived beyond what has been already given. Pairings or binary oppositions cannot lead only to functional syntheses; on the contrary, methodological disturbances accumulated by the approximation of signs and entities can initiate a process of becoming as an opening to an unforeseen other, transformation until a moment when the difficulties of [re]conciliation become the mainstay of a future of that sign/entity, not envisioned a priori. A counter-history can emerge from these unprecedented, misadjusted approaches, beginnings of unassigned injunctions and disjunctions, in short, difficult alliances that open cracks in a supposedly cohesive history, chained in its apparent linearity with no remains, understood as a categorical historical imperative. Interstices are minority fields that, because of their opening, are capable of causing opacity in that which, apparently, presents itself with irreducible clarity. Resulting from an incomplete and maladjusted [at the least dual] marriage between the signs/entities that originate them, this interval may destabilize and cause disorder in these entities and their own meanings. The interstitials offer a hyphenated relationship: a simultaneous union and separation, a spacing between the entity’s identity and its otherness or, alterity. One and the other may no longer be seen without the crack or fissure that now separates them, uniting, by a space-time lapse. Ontological, semantic shifts are caused by this fissure, an absence between one and the other, one with and against the other. Based on an improbable approximation between some conceptual and semantic shifts within the design production of architect Rem Koolhaas and the textual production of the philosopher Jacques Derrida, this article questions the notion of unity, coherence, affinity, and complementarity in the process of construction of thought from these ontological, epistemological, and semiological fissures that rattle the signs/entities and their stable meanings. Fissures in a thought that is considered coherent, cohesive, formatted are the negativity that constitutes the interstices that allow us to move towards what still remains as non-identity, which allows us to begin another story.

Keywords: clearing, interstice, negative, remnant, spectrum

Procedia PDF Downloads 119
2269 Teachers’ and Students’ Reactions to a Guided Reading Program Designed by a Teachers’ Professional Learning Community

Authors: Yea-Mei Leou, Shiu-Hsung Huang, T. C. Shen, Chin-Ya Fang

Abstract:

The purposes of this study were to explore how to establish a professional learning community for English teachers at a junior high school, and to explore how teachers and students think about the guided reading program. The participants were three experienced English teachers and their ESL seventh-grade students from three classes in a junior high school. Leveled picture books and worksheets were used in the program. Questionnaires and interviews were used for gathering information. The findings were as follows: First, most students enjoyed this guided reading program. Second, the teachers thought the guided reading program was helpful to students’ learning and the discussions in the professional learning community refreshed their ideas, but the preparation for the teaching was time-consuming. Suggestions based on the findings were provided.

Keywords: ESL students, guided reading, leveled books, professional learning community

Procedia PDF Downloads 356
2268 Multiscale Process Modeling of Ceramic Matrix Composites

Authors: Marianna Maiaru, Gregory M. Odegard, Josh Kemppainen, Ivan Gallegos, Michael Olaya

Abstract:

Ceramic matrix composites (CMCs) are typically used in applications that require long-term mechanical integrity at elevated temperatures. CMCs are usually fabricated using a polymer precursor that is initially polymerized in situ with fiber reinforcement, followed by a series of cycles of pyrolysis to transform the polymer matrix into a rigid glass or ceramic. The pyrolysis step typically generates volatile gasses, which creates porosity within the polymer matrix phase of the composite. Subsequent cycles of monomer infusion, polymerization, and pyrolysis are often used to reduce the porosity and thus increase the durability of the composite. Because of the significant expense of such iterative processing cycles, new generations of CMCs with improved durability and manufacturability are difficult and expensive to develop using standard Edisonian approaches. The goal of this research is to develop a computational process-modeling-based approach that can be used to design the next generation of CMC materials with optimized material and processing parameters for maximum strength and efficient manufacturing. The process modeling incorporates computational modeling tools, including molecular dynamics (MD), to simulate the material at multiple length scales. Results from MD simulation are used to inform the continuum-level models to link molecular-level characteristics (material structure, temperature) to bulk-level performance (strength, residual stresses). Processing parameters are optimized such that process-induced residual stresses are minimized and laminate strength is maximized. The multiscale process modeling method developed with this research can play a key role in the development of future CMCs for high-temperature and high-strength applications. By combining multiscale computational tools and process modeling, new manufacturing parameters can be established for optimal fabrication and performance of CMCs for a wide range of applications.

Keywords: digital engineering, finite elements, manufacturing, molecular dynamics

Procedia PDF Downloads 80
2267 Investigating the Stylistic Features of Advertising: Ad Design and Creation

Authors: Asma Ben Abdallah

Abstract:

Language has a powerful influence over people and their actions. The language of advertising has a very great impact on the consumer. It makes use of different features from the linguistic continuum. The present paper attempts to apply the theories of stylistics to the analysis of advertising texts. In order to decipher the stylistic features of the advertising discourse, 30 advertising text samples designed by MA Business students have been selected. These samples have been analyzed at the level of design and content. The study brings insights into the use of stylistic devices in advertising, and it reveals that both linguistic and non-linguistic features of advertisements are frequently employed to develop a well-thought-out design and content. The practical significance of the study is to highlight the specificities of the advertising genre so that people interested in the language of advertising (Business students and ESP teachers) will have a better understanding of the nature of the language used and the techniques of writing and designing ads. Similarly, those working in the advertising sphere (ad designers) will appreciate the specificities of the advertising discourse.

Keywords: the language of advertising, advertising discourse, ad design, stylistic features

Procedia PDF Downloads 213