Search results for: exact functions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3096

666 Aire-Dependent Transcripts Have Shortened 3’UTRs and Show Greater Stability by Evading MicroRNA-Mediated Repression

Authors: Clotilde Guyon, Nada Jmari, Yen-Chin Li, Jean Denoyel, Noriyuki Fujikado, Christophe Blanchet, David Root, Matthieu Giraud

Abstract:

Aire induces ectopic expression of a large repertoire of tissue-specific antigen (TSA) genes in thymic medullary epithelial cells (MECs), driving immunological self-tolerance in maturing T cells. Although important mechanisms of Aire-induced transcription have recently been disclosed through the identification and study of Aire’s partners, the fine transcriptional functions that a number of them underlie and confer on Aire remain unknown. Alternative cleavage and polyadenylation (APA) is an essential mRNA processing step regulated by the termination complex consisting of 85 proteins, 10 of which have been related to Aire. We evaluated APA in MECs in vivo by microarray analysis with mRNA-spanning probes and RNA deep sequencing. We uncovered the preference of Aire-dependent transcripts for short-3’UTR isoforms and for proximal poly(A) site selection marked by increased binding of the cleavage factor Cstf-64. RNA interference of the 10 Aire-related proteins revealed that Clp1, a member of the core termination complex, exerts a profound effect on short 3’UTR isoform preference. Clp1 is also significantly upregulated in MECs compared to 25 mouse tissues, in which we found that TSA expression is associated with longer 3’UTR isoforms. Aire-dependent transcripts escape a global 3’UTR lengthening associated with MEC differentiation, thereby potentiating the repressive effect of microRNAs that are globally upregulated in mature MECs. Consistent with these findings, RNA deep sequencing of actinomycin D-treated MECs revealed the increased stability of short 3’UTR Aire-induced transcripts, resulting in the accumulation of TSA transcripts and contributing to their enrichment in the MECs.

Keywords: Aire, central tolerance, miRNAs, transcription termination

Procedia PDF Downloads 379
665 Numerical Investigation on Design Method of Timber Structures Exposed to Parametric Fire

Authors: Robert Pečenko, Karin Tomažič, Igor Planinc, Sabina Huč, Tomaž Hozjan

Abstract:

Timber is a favourable structural material due to its high strength-to-weight ratio, recycling possibilities, and green credentials. Despite being a flammable material, it has relatively high fire resistance. Everyday engineering practice around the world is based on an outdated design of timber structures considering standard fire exposure, while modern principles of performance-based design enable the use of advanced non-standard fire curves. In Europe, the standard for fire design of timber structures, EN 1995-1-2 (Eurocode 5), gives two methods: the reduced material properties method and the reduced cross-section method. In the latter, the fire resistance of structural elements depends on the effective cross-section, i.e. the residual cross-section of uncharred timber reduced additionally by a so-called zero strength layer. In the case of standard fire exposure, Eurocode 5 gives a fixed value for the zero strength layer, i.e. 7 mm, while for non-standard parametric fires no additional comments or recommendations for the zero strength layer are given. Thus, designers often adopt the 7 mm rule for parametric fire exposure as well. Since the latest scientific evidence suggests that the proposed value of the zero strength layer can be on the unsafe side even for standard fire exposure, its use in the case of a parametric fire is also highly questionable, and more numerical and experimental research in this field is needed. Therefore, the purpose of the presented study is to use advanced calculation methods to investigate the thickness of the zero strength layer and the parametric charring rates used in the effective cross-section method in the case of parametric fire. Parametric studies are carried out on a simple solid timber beam exposed to a large number of parametric fire curves. The zero strength layer and charring rates are determined based on numerical simulations performed with a recently developed advanced two-step computational model.
The first step comprises a hygro-thermal model which predicts the temperature, moisture and char depth development and takes into account different initial moisture states of the timber. In the second step, the response of the timber beam simultaneously exposed to mechanical and fire load is determined. The mechanical model is based on Reissner’s kinematically exact beam model and accounts for the membrane, shear and flexural deformations of the beam. Furthermore, materially non-linear and temperature-dependent behaviour is considered. In the two-step model, the char front is, in accordance with Eurocode 5, assumed to occur at a fixed temperature of around 300°C. Based on the performed study and observations, improved values of the charring rates and a new thickness of the zero strength layer in the case of parametric fires are determined. Thus, the reduced cross-section method is substantially improved to offer practical recommendations for designing the fire resistance of timber structures. Furthermore, correlations between the zero strength layer thickness and key input parameters of the parametric fire curve (for instance, opening factor, fire load, etc.) are given, representing a guideline for more detailed numerical and experimental research in the future.
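
The reduced cross-section method described above can be sketched numerically. The snippet below is an illustrative Python implementation of the Eurocode 5 rule for standard fire exposure (effective charring depth = notional charring depth plus k0 times the 7 mm zero strength layer); the beam depth and charring rate are example values for a solid softwood section, not results from the study.

```python
# Illustrative sketch of the Eurocode 5 (EN 1995-1-2) reduced cross-section
# method for STANDARD fire exposure. The beam depth and notional charring
# rate below are example values, not results from the study.

def effective_depth(d0_mm, beta_n_mm_min, t_min, d0_zsl_mm=7.0, t_ramp_min=20.0):
    """Effective (load-bearing) depth of a one-sided exposed section after t_min minutes.

    d0_mm          -- initial depth of the section (mm)
    beta_n_mm_min  -- notional design charring rate (mm/min), e.g. 0.7 for solid softwood
    d0_zsl_mm      -- zero strength layer thickness (7 mm for standard fire in EC5)
    t_ramp_min     -- k0 rises linearly from 0 to 1 over the first 20 minutes
    """
    d_char_n = beta_n_mm_min * t_min      # notional charring depth
    k0 = min(t_min / t_ramp_min, 1.0)     # ramp factor for the zero strength layer
    d_ef = d_char_n + k0 * d0_zsl_mm      # total depth removed on the exposed side
    return max(d0_mm - d_ef, 0.0)

# A 300 mm deep beam, exposed on one side, charring at 0.7 mm/min:
print(effective_depth(300.0, 0.7, 30.0))  # 300 - (0.7*30 + 7) = 272 mm
print(effective_depth(300.0, 0.7, 60.0))  # 300 - (0.7*60 + 7) = 251 mm
```

For a parametric fire the study's point is precisely that neither the 7 mm layer nor a constant charring rate can be assumed; the function above only encodes the standard-exposure rule that designers currently carry over.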

Keywords: advanced numerical modelling, parametric fire exposure, timber structures, zero strength layer

Procedia PDF Downloads 162
664 Legal Theories Underpinning Access to Justice for Victims of Sexual Violence in Refugee Camps in Africa

Authors: O. E. Eberechi, G. P. Stevens

Abstract:

Legal theory has been referred to as the explanation of why things do or do not happen. It also describes situations and why they ensue. It provides a normative framework by which things are regulated and a foundation for the establishment of legal mechanisms/institutions that can bring about a desired change in a society. Furthermore, it offers recommendations for resolving practical problems, describes what the law is and what the law ought to be, and defines the legal landscape generally. Some legal theories provide a universal standard, e.g. human rights, while others are capable of organizing and streamlining collective life and, by extension, bringing order to society. Legal theory is used to explain how the world works and how it does not work. This paper will argue for the application of the principles of legal theory to the achievement of access to justice for female victims of sexual violence in refugee camps in Africa, through an analysis of the legal theories underpinning access to justice for these women. It is a known fact that female refugees in camps in Africa often experience some form of sexual violation. The perpetrators of these incidents may never be apprehended, prosecuted, convicted or sentenced. Where prosecution does occur, the perpetrators are either acquitted as a result of poor investigation, inept prosecution or a lack of evidence, or the case may be dismissed owing to tardiness on the part of the prosecutor, which accounts for the culture of impunity in refugee camps. In other words, victims do not have access to the justice that could ameliorate their plight. There is, thus, a need for a legal framework that will facilitate access to justice for these victims. This paper will start with an introduction, followed by a definition of legal theory, its functions and its application in law.
Secondly, it will provide a brief explanation of the problems faced by female refugees who are victims of sexual violence in refugee camps in Africa. Thirdly, it will embark on an analysis of theories that help in understanding the precarious situation of female refugees, why they are violated, the need for access to justice for these victims, and the usefulness of the principles of legal theory in resolving their access to justice.

Keywords: access to justice, underpinning legal theory, refugee, sexual violence

Procedia PDF Downloads 425
663 An Epidemiological Study on Cutaneous Melanoma, Basocellular and Epidermoid Carcinomas Diagnosed in a Sunny City in Southeast Brazil in a Five-Year Period

Authors: Carolina L. Cerdeira, Julia V. F. Cortes, Maria E. V. Amarante, Gersika B. Santos

Abstract:

Skin cancer is the most common cancer in several parts of the world, and in a tropical country like Brazil the situation is no different. The Brazilian population is exposed to high levels of solar radiation, increasing the risk of developing cutaneous carcinoma. To encourage prevention measures and the early diagnosis of these tumors, a study was carried out analyzing data on cutaneous melanomas and basal cell and epidermoid carcinomas, using as the primary data source the medical records of 161 patients registered at one pathology service, which performs skin biopsies in a city of Minas Gerais, Brazil. All patients diagnosed with skin cancer at this service from January 2015 to December 2019 were included. The incidence of skin carcinoma cases was correlated with histological type, sex, age group, and topographic location. Correlation between variables was verified by Fisher's exact test at a nominal significance level of 5%, with statistical analysis performed in the R® software. A significant association was observed between age group and type of cancer (p=0.0085); age group and sex (p=0.0298); and type of cancer and body region affected (p < 0.01). The 161 cases analyzed comprised 93 basal cell carcinomas, 66 epidermoid carcinomas, and only two cutaneous melanomas. In the group aged 19 to 30 years, the epidermoid form was most prevalent; from 31 to 45 and from 46 to 59 years, the basal cell form prevailed; in those aged 60 years or over, both types had higher frequencies. Associating age group and sex, in the groups aged 18 to 30 and 46 to 59 years, women were most affected; in the 31-to-45-year-old group, men predominated; and there was gender balance in the group aged 60 years or over. As for topography, there was a high prevalence in the head and neck, followed by the upper limbs. Relating histological type and topography, both basal cell and epidermoid carcinomas were most prevalent in the head and neck.
In the chest, the basal cell form was most prevalent; in the upper limbs, the epidermoid form prevailed. Cutaneous melanoma affected only the chest and upper limbs. About 82% of patients aged 60 years or over had head and neck cancer; in patients aged 46 to 59 years and 60 years or over, the head and neck region and upper limbs were predominantly affected; the distribution was balanced in the 31-to-45-year-old group. In conclusion, basal cell carcinoma was predominant, whereas cutaneous melanoma was the rarest among the types analyzed. Patients aged 60 years or over were the most affected, showing gender balance. In young adults, there was a prevalence of the epidermoid form; in middle-aged patients, basal cell carcinoma was predominant; in the elderly, both forms presented with higher frequencies. There was a higher incidence of head and neck cancers, followed by malignancies affecting the upper limbs. The epidermoid type manifested significantly in the upper limbs. Body regions such as the thorax and lower limbs were less affected, which is explained by the lower exposure of these areas to incident solar radiation.
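
The study ran Fisher's exact test in R; as a rough illustration of the same test, here is a self-contained Python sketch on an invented 2×2 table (histological type × body region). The counts are hypothetical, not the study's 161-case data set.

```python
# Two-sided Fisher's exact test on a 2x2 contingency table, computed from the
# hypergeometric distribution. The counts below are invented for illustration.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """p-value for the table [[a, b], [c, d]]: sum the probabilities of all
    tables with the same margins that are no more likely than the observed one."""
    row1, row2, col1 = a + b, c + d, a + c
    denom = comb(row1 + row2, col1)
    def p_table(x):                        # P(top-left cell = x | fixed margins)
        return comb(row1, x) * comb(row2, col1 - x) / denom
    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-12))

# Hypothetical counts: rows = basal cell / epidermoid, cols = head-neck / other
p = fisher_exact_2x2(40, 10, 20, 25)
print(f"p = {p:.4f}", "-> significant at the 5% level" if p < 0.05 else "-> not significant")
```

`scipy.stats.fisher_exact` returns the same two-sided p-value for such tables; the stdlib version is shown here only to keep the sketch dependency-free.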

Keywords: basal cell carcinoma, cutaneous melanoma, skin cancer, squamous cell carcinoma, topographic location

Procedia PDF Downloads 127
662 Comparative Study of Outcome of Patients with Wilms Tumor Treated with Upfront Chemotherapy and Upfront Surgery in Alexandria University Hospitals

Authors: Golson Mohamed, Yasmine Gamasy, Khaled EL-Khatib, Anas Al-Natour, Shady Fadel, Haytham Rashwan, Haytham Badawy, Nadia Farghaly

Abstract:

Introduction: Wilms tumor is the most common malignant renal tumor in children. Much progress has been made in the management of patients with this malignancy over the last 3 decades. Today, treatments are based on several trials and studies conducted by the International Society of Pediatric Oncology (SIOP) in Europe and the National Wilms Tumor Study Group (NWTS) in the USA. It is necessary for us to understand why we follow either of the protocols: NWTS, which follows the upfront surgery principle, or SIOP, which follows the upfront chemotherapy principle, in all stages of the disease. Objective: The aim is to assess the outcome of patients treated with preoperative chemotherapy and patients treated with upfront surgery and to compare their effect on overall survival. Study design: To decide which protocol to follow, a retrospective survey was carried out on the records of patients aged 1 day to 18 years suffering from Wilms tumor who were admitted to the Alexandria University Hospital pediatric oncology, pediatric urology and pediatric surgery departments from 2010 to 2015. The transfer sheet was designed and edited following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow. Data were fed to the computer and analyzed using the IBM SPSS software package, version 20.0. Qualitative data were described using number and percent. Quantitative data were described using range (minimum and maximum), mean, standard deviation and median. Comparisons between groups regarding categorical variables were tested using the chi-square test; when more than 20% of the cells had an expected count of less than 5, correction for chi-square was conducted using Fisher’s exact test or the Monte Carlo correction. The distributions of quantitative variables were tested for normality using the Kolmogorov-Smirnov, Shapiro-Wilk, and D'Agostino tests; if these revealed a normal data distribution, parametric tests were applied.
If the data were abnormally distributed, non-parametric tests were used. For normally distributed data, comparison between two independent populations was done using the independent t-test; for abnormally distributed data, the Mann-Whitney test was used. Significance of the obtained results was judged at the 5% level. Results: A statistically significant difference in survival was observed between the two studied groups, favoring upfront chemotherapy (86.4%) as compared to the upfront surgery group (59.3%), where P=0.009. As regards complications, 20 cases (74.1%) out of 27 were complicated in the group of patients treated with upfront surgery, while 30 cases (68.2%) out of 44 had complications in patients treated with upfront chemotherapy. Also, the incidence of intraoperative complications (rupture) was lower in the upfront chemotherapy group than in the upfront surgery group. Conclusion: Upfront chemotherapy has superiority over upfront surgery, as patients who started with upfront chemotherapy showed a higher survival rate, a lower complication rate, less need for radiotherapy, and a lower recurrence rate.
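
The normality-then-test-selection logic described above can be sketched as follows. This is a hedged Python illustration of the same workflow (SPSS was used in the study); the two outcome samples are randomly generated stand-ins, not the study's data, and Shapiro-Wilk stands in for the battery of normality tests.

```python
# Sketch of the test-selection logic from the abstract: check each group for
# normality (Shapiro-Wilk here), then compare two independent groups with a
# t-test if both look normal, otherwise with Mann-Whitney U.
# The samples below are synthetic, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
upfront_chemo = rng.normal(60, 10, size=44)      # hypothetical outcome scores
upfront_surgery = rng.normal(52, 10, size=27)

alpha = 0.05
normal = all(stats.shapiro(g).pvalue > alpha for g in (upfront_chemo, upfront_surgery))
if normal:
    stat, p = stats.ttest_ind(upfront_chemo, upfront_surgery)     # parametric
    test_name = "independent t-test"
else:
    stat, p = stats.mannwhitneyu(upfront_chemo, upfront_surgery)  # non-parametric
    test_name = "Mann-Whitney U"
print(f"{test_name}: statistic = {stat:.3f}, p = {p:.4f}")
```

The branch taken depends on the normality check, which is the point: the parametric test is only justified once the distributional assumption has been examined.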

Keywords: Wilms tumor, renal tumor, chemotherapy, surgery

Procedia PDF Downloads 316
661 Technological and Economic Investigation of Concentrated Photovoltaic and Thermal Systems: A Case Study of Iran

Authors: Moloud Torkandam

Abstract:

Cities must be designed and built in a way that minimizes their need for fossil fuels. The necessity of this principle is evident in the construction modes of previous eras; perhaps it is only because of the great diversity of materials and new technologies in the contemporary era that it has been forgotten in buildings. The question of optimizing energy consumption in buildings has attracted a great deal of attention in many countries, which have in this way been able to cut energy consumption by up to 30 percent. Energy consumption in Iran is remarkably higher than global standards, and the most important reason is the undesirable state of buildings from the standpoint of energy consumption. In addition to protecting natural and fuel resources for future generations, reducing the use of fossil energy may also bring about desirable outcomes such as a decrease in greenhouse gases (whose emissions cause global warming, the melting of polar ice, the rise in sea level and climatic change across the planet), a decrease in the destructive effects of contamination in residential complexes and especially urban environments, progress toward national self-sufficiency and independence, and the preservation of national capital. This research recognizes that in this modern day and age, living sustainably is a prerequisite for ensuring a bright future and a high quality of life. In acquiring this living standard, we will maintain the functions and ability of our environment to serve and sustain our livelihoods. Electricity is now an integral part of modern life, a basic necessity. In the provision of electricity, we are committed to respecting the environment by reducing the use of fossil fuels through proven technologies that use local renewable and natural resources as their energy source.
As far as this research is concerned, it is therefore necessary to investigate different types of energy production, such as solar energy and concentrated photovoltaic and thermal (CPVT) systems.

Keywords: energy, photovoltaic, thermal system, solar energy, CPVT

Procedia PDF Downloads 80
660 Neuroprotective Effect of Hypericum Perforatum against Neurotoxicity and Alzheimer's Disease (Experimental Study in Mice)

Authors: Khayra Zerrouki, Noureddine Djebli, Esra Eroglu, Afife Mat, Ozhan Gul

Abstract:

Neurodegenerative diseases of the human brain comprise a variety of disorders that affect an increasing percentage of the population. Alzheimer’s disease (AD) is a complex, multifactorial, heterogeneous mental illness characterized by an age-dependent loss of memory and an impairment of multiple cognitive functions, but over the last ten years it has increasingly affected younger people as well. Hypericum perforatum has traditionally been used as an external anti-inflammatory and healing remedy for the treatment of swellings, wounds and burns, diseases of the alimentary tract and psychological disorders. It is currently of great interest due to new and important therapeutic applications. In this study, the chemical composition of the methanolic extract of Hypericum perforatum (HPM) was analysed by using high-performance liquid chromatography with a diode array detector (HPLC-DAD). The in vitro antioxidant activity of HPM was evaluated by using several antioxidant tests: HPM exhibits inhibitory capacity against phosphatidylcholine liposome peroxidation induced with iron and ascorbic acid, scavenges DPPH and superoxide radicals, and acts as a reductant. The cytotoxic activity of HPM was also determined by using the MTT cell viability assay on HeLa and NRK-52E cell lines. The in vivo activities in Swiss mice were determined by using behavioral and memory tests and a histological study. According to the test results, HPM may be relevant to the treatment of cognitive disorders. The chemical analysis showed high levels of hyperforin and quercetin, which had an important antioxidant activity proved in vitro with the DPPH, anti-LPO and SOD tests; this antioxidant activity was confirmed in vivo, after the non-toxicity results, by an improvement in behavior and memory and reduced shrinkage of the pyramidal cells of the mice's brains.

Keywords: AlCl3, alzheimer, mice, neuroprotective, neurotoxicity, phytotherapy

Procedia PDF Downloads 497
659 Numerical Solution of Portfolio Selecting Semi-Infinite Problem

Authors: Alina Fedossova, Jose Jorge Sierra Molina

Abstract:

SIP problems are part of non-classical optimization: problems in which the number of variables is finite and the number of constraints is infinite. These are semi-infinite programming problems. Most algorithms for semi-infinite programming problems reduce the semi-infinite problem to a finite one and solve it by classical methods of linear or nonlinear programming. Typically, some of the constraints or the objective function are nonlinear, so the problem often involves nonlinear programming. An investment portfolio is a set of instruments used to reach the specific purposes of investors. The risk of the entire portfolio may be less than the risks of the individual investments in the portfolio. For example, we could invest M euros in N shares for a specified period. Let yi > 0 be the return at the end of the period on each unit of money invested in stock i (i = 1, ..., N). The goal here is to determine the amounts xi to be invested in stock i, i = 1, ..., N, such that we maximize the end-of-period value yᵀx, where x = (x1, ..., xN) and y = (y1, ..., yN). For us, the optimal portfolio means the best portfolio in terms of the risk-return trade-off, i.e. the portfolio that meets the investor's goals and risk preferences. Therefore, investment goals and risk appetite are the factors that influence the choice of an appropriate portfolio of assets. The investment returns are uncertain; thus we have a semi-infinite programming problem. We solve the semi-infinite optimization problem of portfolio selection using outer approximation methods. This approach can be considered a development of the Eaves-Zangwill method, applying the multi-start technique in all iterations of the search for the relevant constraints' parameters. The stochastic outer approximations method, successfully applied previously to robotics problems, Chebyshev approximation problems, air pollution and others, is based on the optimality criteria of quasi-optimal functions.
As a result, we obtain a mathematical model and the optimal investment portfolio when yields are uncertain from the beginning. Finally, we apply this algorithm to a specific case of a Colombian bank.
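
The outer approximation idea above can be illustrated on a toy robust portfolio problem: maximize a guaranteed return t subject to y(s)ᵀx ≥ t for every scenario s in [0, 1], replacing the infinite constraint family with a growing finite working set. This is a simplified exchange-type sketch, not the authors' stochastic Eaves-Zangwill variant, and the return curves y(s) are invented for illustration.

```python
# Toy outer (exchange) approximation for a semi-infinite portfolio problem:
#   max_{x,t} t   s.t.  y(s)^T x >= t for all s in [0, 1],  sum(x) = 1,  x >= 0.
# The scenario-dependent returns y(s) below are invented, not study data.
import numpy as np
from scipy.optimize import linprog

base = np.array([1.05, 1.10, 1.20])    # returns in scenario s = 0
tilt = np.array([0.02, -0.03, -0.08])  # linear drift of returns with s

def y(s):
    return base + s * tilt

S = [0.0]                              # finite working set of scenario parameters
grid = np.linspace(0.0, 1.0, 201)      # search grid for violated constraints
for _ in range(50):
    # LP variables z = (x1, x2, x3, t); maximize t  <=>  minimize -t
    c = np.array([0.0, 0.0, 0.0, -1.0])
    A_ub = np.array([np.append(-y(s), 1.0) for s in S])   # t - y(s)^T x <= 0
    b_ub = np.zeros(len(S))
    A_eq = np.array([[1.0, 1.0, 1.0, 0.0]])               # budget: sum(x) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, 1)] * 3 + [(None, None)])
    x, t = res.x[:3], res.x[3]
    # most violated infinite constraint = scenario with the worst return for x
    s_worst = grid[np.argmin([y(s) @ x for s in grid])]
    if y(s_worst) @ x >= t - 1e-9:
        break                          # no violated constraint left: converged
    S.append(s_worst)

print("weights:", np.round(x, 4), " guaranteed return:", round(t, 4))
```

Each iteration solves a finite LP relaxation, then a (here brute-force, in the paper multi-start) inner search finds the most violated scenario to add, which is exactly the reduction of the semi-infinite problem to a sequence of finite ones described in the abstract.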

Keywords: outer approximation methods, portfolio problem, semi-infinite programming, numerical solution

Procedia PDF Downloads 306
658 Determination of Crustal Structure and Moho Depth within the Jammu and Kashmir Region, Northwest Himalaya through Receiver Function

Authors: Shiv Jyoti Pandey, Shveta Puri, G. M. Bhat, Neha Raina

Abstract:

The Jammu and Kashmir (J&K) region of Northwest Himalaya has a long history of earthquake activity and falls within Seismic Zones IV and V. To determine the crustal structure beneath this region, we utilized the teleseismic receiver function method. This paper presents the results of the analysis of teleseismic earthquake waves recorded by 10 seismic observatories installed in the vicinity of major thrusts and faults. Teleseismic waves at epicentral distances between 30° and 90°, with moment magnitudes greater than or equal to 5.5, which contain a large amount of information about the crust and upper mantle structure directly beneath a receiver, were used. The receiver function (RF) technique has been widely applied to investigate crustal structures using P-to-S converted (Ps) phases from velocity discontinuities. The arrival times of the Ps converted phase and the PpPs and PpSs+PsPs reverberated phases from the Moho can be combined to constrain the mean crustal thickness and Vp/Vs ratio. Over 500 receiver functions from 10 broadband stations located in the Jammu & Kashmir region of Northwest Himalaya were analyzed. With the help of the H-K stacking method, we determined the crustal thickness (H) and average crustal Vp/Vs ratio (K) in this region, and we used the Neighbourhood algorithm technique to verify our results. The receiver function results for these stations show that the crustal thickness under Jammu & Kashmir ranges from 45.0 to 53.6 km, with an average value of 50.01 km. The Vp/Vs ratio varies from 1.63 to 1.99, with an average value of 1.784, which corresponds to an average Poisson’s ratio of 0.266, with a range from 0.198 to 0.331. High Poisson’s ratios under some stations may be related to partial melting in the crust near the uppermost mantle. The crustal structure model developed in this study can be used to refine the velocity model used for precise epicenter location in the region, thereby improving the understanding of current seismicity in the region.
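
The conversion from the stacked Vp/Vs ratio K to a Poisson's ratio is the standard elastic relation ν = (K² − 2) / (2(K² − 1)). A minimal sketch, evaluated at the end-points of the K range reported above, reproduces the stated Poisson's ratio range of 0.198 to 0.331:

```python
# Poisson's ratio from the crustal Vp/Vs ratio k, via the standard relation
#   nu = (k^2 - 2) / (2 (k^2 - 1)),   valid for k > sqrt(2).
def poisson_ratio(k):
    return (k**2 - 2.0) / (2.0 * (k**2 - 1.0))

# End-points of the Vp/Vs range reported in the abstract (1.63 and 1.99):
print(round(poisson_ratio(1.63), 3))  # 0.198
print(round(poisson_ratio(1.99), 3))  # 0.331
```

The per-station H and K values themselves come from the H-K grid search over stacked Ps, PpPs, and PpSs+PsPs amplitudes described in the abstract; only the final K-to-ν conversion is shown here.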

Keywords: H-K stacking, Poisson’s ratios, receiver function, teleseismic

Procedia PDF Downloads 245
657 The Optimal Utilization of Centrally Located Land: The Case of the Bloemfontein Show Grounds

Authors: D. F. Coetzee, M. M. Campbell

Abstract:

The urban environment is constantly expanding, and the optimal use of centrally located land is important in terms of sustainable development. Bloemfontein has expanded, and this affects land-use functions. The purpose of the study is to examine a possible shift in the location of the Bloemfontein show grounds in order to utilize the space of the grounds more effectively in the context of spatial planning. The research method used is a qualitative case study of the Bloemfontein show grounds. The purposive sample consisted of planners who work or consult in the Bloemfontein area and who are registered with the South African Council for Planners (SACPLAN). Interviews consisting of qualitative open-ended questionnaires were used. When considering relocation, the social and economic aspects need to be taken into account. The findings indicated a majority consensus that the property can be utilized more effectively in terms of mixed land use. The showground development trust compiled a master plan to ensure that the property is used to its full potential without relocating the showground function itself. This master plan can be seen as the next logical step for the showground property, and it is indeed an attempt to better utilize the land parcel without relocating the show function. The question arises whether the proposed master plan is a permanent solution or merely delays the relocation of the core showground function to another location. For now, it is a sound solution, making the best of the situation at hand and utilizing the property more effectively. If the show grounds were to be relocated, the researchers recommend a mixed-use development, in terms of an expansion of the commercial business/retail function together with a sport and recreation function.
The show grounds in Bloemfontein are well positioned to capitalize on and meet the needs of the changing economy, while complementing the future economic growth strategies of the city, if the right plans are in place.

Keywords: centrally located land, spatial planning, show grounds, central business district

Procedia PDF Downloads 409
656 Psychopathic Disorders and Judges’ Sentencing: Can Neurosciences Change This Aggravating Factor into a Mitigating Factor?

Authors: Kevin Moustapha

Abstract:

Psychopathy is perceived today as «the most important concept in the criminal justice system» and «the most important legal notion of the early 21st century». The explosion of research related to psychopathy seems to illustrate this trend perfectly. Traditionally, many studies tend to focus on the links between the insanity defense and psychopathy. That is why our purpose in this article is to analyze psychopathic disorders in the scope of judges' sentencing in Canada. Indeed, in every Canadian case related to dangerous offenders, judges must balance fairness and the protection of the individual rights of the accused against the protection of society from dangerous predators who may commit future acts of physical or sexual violence. Increasingly, psychopathic disorders are playing an important part in judges' sentencing, especially in Canada. This phenomenon can be illustrated by the high proportion of psychopathic offenders incarcerated in North American prisons. Many decisions in Canadian courtrooms seem to indicate that psychopathy is often used as a strong argument by judges to preserve public safety. The fact that psychopathy is often associated with violence, recklessness and recidivism could explain why many judges consider psychopathic disorders an aggravating factor. Generally, the judges' reasoning is based on article 753 of the Canadian Criminal Code related to dangerous offenders, which applies to individuals who show a pattern of repetitive and persistent aggressive behaviour. However, with cognitive neurosciences, the situation of psychopaths in courtrooms may well change. Cerebral imaging and new data provided by the neurosciences show that emotional and volitional functions in psychopaths' brains are impaired. Understanding these new issues could enable some judges to recognize psychopathic disorders as a mitigating factor.
Two important questions are raised in this article: can exploring psychopaths' brains really change sentencing in Canadian courtrooms? If so, can judges come to consider psychopathy a mitigating rather than an aggravating factor?

Keywords: criminal law, judges sentencing, neurosciences, psychopathy

Procedia PDF Downloads 923
655 Exploring Forest Biomass Changes in Romania in the Last Three Decades

Authors: Remus Pravalie, Georgeta Bandoc

Abstract:

Forests are crucial for humanity and biodiversity, through the various ecosystem services and functions they provide all over the world. Forest ecosystems are vital in Romania as well, through their various benefits, known as provisioning (food, wood, or fresh water), regulating (water purification, soil protection, carbon sequestration or control of climate change, floods, and other hazards), cultural (aesthetic, spiritual, inspirational, recreational or educational benefits) and supporting (primary production, nutrient cycling, and soil formation processes, with direct or indirect importance for human well-being) ecosystem services. These ecological benefits are of great importance in Romania, especially given the fact that forests cover extensive areas countrywide, i.e. ~6.5 million ha or ~27.5% of the national territory. However, the diversity and functionality of these ecosystem services fundamentally depend on certain key attributes of forests, such as biomass, which has so far not been studied nationally in terms of potential changes due to climate change and other driving forces. This study investigates, for the first time, changes in forest biomass in Romania in recent decades, based on a high volume of satellite data (Landsat images at high spatial resolutions), downloaded from the Google Earth Engine platform and processed (using specialized software and methods) across Romanian forestland boundaries from 1987 to 2018. A complex climate database was also investigated across Romanian forests over the same 32-year period, in order to detect potential similarities and statistical relationships between the dynamics of biomass and climate data. The results obtained indicated considerable changes in forest biomass in Romania in recent decades, largely triggered by the climate change that affected the country after 1987. 
Findings on the complex pattern of recent forest changes in Romania, which will be presented in detail in this study, can be useful to national policymakers in the fields of forestry, climate, and sustainable development.
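
The kind of trend and climate-association analysis the study applies to its 1987-2018 series can be sketched as below: a least-squares linear trend on an annual biomass series plus a Pearson correlation with a climate variable. All series in the snippet are synthetic stand-ins, not the Landsat-derived or climate data from the study.

```python
# Illustrative trend/association analysis on a synthetic 32-year series
# (1987-2018), mimicking the window of the study. Data are invented.
import numpy as np

years = np.arange(1987, 2019)                    # 32-year study window
rng = np.random.default_rng(42)
temp = 0.03 * (years - 1987) + rng.normal(0, 0.2, years.size)   # warming trend
biomass = 100.0 - 5.0 * temp + rng.normal(0, 0.5, years.size)   # climate response

slope, intercept = np.polyfit(years, biomass, 1)  # linear trend (units per year)
r = np.corrcoef(temp, biomass)[0, 1]              # Pearson correlation
print(f"biomass trend = {slope:.3f} per year, corr(biomass, temp) = {r:.2f}")
```

With the synthetic construction above, the declining trend and the negative biomass-temperature correlation are built in; on real data, significance testing (e.g. Mann-Kendall for the trend) would follow the same series.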

Keywords: forests, biomass, climate change, trends, Romania

Procedia PDF Downloads 151
654 A Collaborative Problem Driven Approach to Design an HR Analytics Application

Authors: L. Atif, C. Rosenthal-Sabroux, M. Grundstein

Abstract:

The requirements engineering process is a crucial phase in the design of complex systems. The purpose of our research is to present a collaborative, problem-driven requirements engineering approach that aims at improving the design of a Decision Support System (DSS) as an analytics application. This approach has been adopted to design a human resource management DSS. The requirements engineering process is presented as a series of guidelines for activities that must be implemented to ensure that the final product satisfies end-user requirements and takes into account the limitations identified. We know that a well-posed statement of the problem is 'a problem whose crucial character arises from collectively produced estimation and a formulation found to be acceptable by all the parties'. Moreover, we know that DSSs were developed to help decision-makers solve their unstructured problems. We thus base our research on the assumption that developing a DSS, particularly for helping with poorly structured or unstructured decisions, cannot be done without considering end-user decision problems, how to represent them collectively, the content of decisions, their meaning, and the decision-making process; the field's issues thus arise in a multidisciplinary perspective. Our approach is a problem-driven and collaborative approach to designing DSS technologies: common end-user problems are reflected in the upstream design phase, and in the downstream phase these problems determine the design choices and the potential technical solution. We thus rely on a categorization of HR's problems for a development mirroring the analytics solution. This brings out a data-driven DSS typology: descriptive analytics, explicative or diagnostic analytics, predictive analytics, and prescriptive analytics.
In our research, identifying the problem takes place alongside the design of the solution, so we have to resort to significant transformations of the representations associated with the HR Analytics application in order to build an increasingly detailed representation of the goal to be achieved. Here, collective cognition is reflected in the establishment of transfer functions between representations throughout the design process.

Keywords: DSS, collaborative design, problem-driven requirements, analytics application, HR decision making

Procedia PDF Downloads 292
653 Discourse Analysis and Semiotic Researches: Using Michael Halliday's Sociosemiotic Theory

Authors: Deyu Yuan

Abstract:

Discourse analysis as an interdisciplinary approach has a history of more than 60 years, since Zellig Harris first named it in 'Discourse Analysis', published in Language in 1952. Ferdinand de Saussure differentiated 'parole' from 'langue', establishing the principle of focusing on language rather than speech, so the rise of discourse analysis can be seen as a discursive turn for language research as a whole, closely related to speech act theory. Critical discourse analysis has become the mainstream of contemporary language research by drawing upon M. A. K. Halliday's socio-semiotic theory and the views of Foucault, Barthes, and Bourdieu on the sign, discourse, and ideology. In contrast to general semiotics, social semiotics mainly focuses on parole and on the application of semiotic theories to practical fields. The article attempts to discuss this applicable sociosemiotics and to show the features that distinguish it from Saussurean and Peircean semiotics in four respects: 1) the sign system is a meaning-generation resource in the social context; 2) the sign system conforms to social and cultural changes through metaphor and connotation; 3) sociosemiotics concerns five applicable principles, including the personal authority principle, the non-personal authority principle, the consistency principle, the model demonstration principle, and the expertise principle, to deepen specific communication; 4) the study of symbolic functions targets the ideational, interpersonal, and interactional functions in the social communication process. The paper then describes six features that characterize this sociosemiotics as applicable semiotics: its social, systematic, usable, interdisciplinary, dynamic, and multi-modal character. Thirdly, the paper explores the multi-modal choices of sociosemiotics with respect to genre, discourse, and style. 
Finally, the paper discusses the relationship between theory and practice in social semiotics and proposes a relatively comprehensive theoretical framework for social semiotics as applicable semiotics.

Keywords: discourse analysis, sociosemiotics, pragmatics, ideology

Procedia PDF Downloads 344
652 A Pragmatic Approach of Memes Created in Relation to the COVID-19 Pandemic

Authors: Alexandra-Monica Toma

Abstract:

Internet memes are an element of computer-mediated communication and an important part of online culture that combines text and image in order to generate meaning. The term, coined by Richard Dawkins, refers to more than a mere way to briefly communicate ideas or emotions; it names a complex and intensely perpetuated phenomenon in the virtual environment. This paper approaches memes as a cultural artefact and a virtual trope that mirrors societal concerns and issues, and analyses the pragmatics of their use. Memes have to be analysed in series, usually relating to some image macro, which is proof of the interplay between imitation and creativity in the meme-writing process. We believe that their potential to become viral relates to three key elements: adaptation to context, reference to a successful meme series, and humour (jokes, irony, sarcasm), with various pragmatic functions. The study also uses the concept of multimodality and stresses how a meme's text interacts with its image, discussing three types of relations: symmetry, amplification, and contradiction. Moreover, the paper argues that memes can be employed as speech acts with illocutionary force when the interaction between text and image is enriched through the connection to a specific situation. The features mentioned above are analysed in a corpus of memes related to the COVID-19 pandemic. This corpus shows them to be highly adaptable to context, which helps build a feeling of connection and belonging in an otherwise tremendously fragmented world. Some of them are created from well-known image macros, and their humour results from an intricate dialogue between texts and contexts. As the paper shows, memes created in relation to the COVID-19 pandemic can be considered, and are often used as, speech acts. 
Consequently, this paper tackles the key features of memes, makes a thorough analysis of the memes' sociocultural, linguistic, and situational context, and emphasizes their intertextuality, with particular attention to their illocutionary potential.

Keywords: context, memes, multimodality, speech acts

Procedia PDF Downloads 196
651 Indoor Air Quality Analysis for Renovating Building: A Case Study of Student Studio, Department of Landscape, Chiangmai, Thailand

Authors: Warangkana Juangjandee

Abstract:

The rapidly increasing population in a limited area creates pressure to improve the area to suit the environment and the needs of people. The Faculty of Architecture, Chiang Mai University, is also expanding, in both the variety of its fields of study and the quality of its education. In 2020, a new department, the Department of Landscape Architecture, will be introduced in the faculty. Given the limited area in the existing building, the faculty plans to renovate parts of its school in anticipation of the students who will join the program in the next two years. The old wooden workshop area has therefore been selected to be renovated as student studio space. Under these conditions, it is necessary to study the restrictions and the distinctive environment of the site prior to the improvement in order to find ways to manage the existing space, because the primary function previously practiced on the site, an old wooden workshop, and the new function, a studio space, are very different. For 72.9% of the hours of the year, conditions in the room fall outside the thermal comfort zone, with high relative humidity. This causes discomfort for occupants and could promote mould growth. This study aims to analyze the thermal comfort conditions in the Landscape Learning Studio Area in order to find solutions that improve indoor air quality and respond to local conditions. The research methodology is in two parts: 1) gathering field data on the case study; 2) analyzing the data and finding solutions for improving indoor air quality. The results of the survey indicate that the room's discomfort problem needs to be addressed in two ways: increasing ventilation and managing indoor temperature, e.g., improving the building design, using stack-driven ventilation, and using fans to enhance internal ventilation.
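The 72.9% figure above is the fraction of annual hours falling outside the comfort zone. A minimal sketch of how such a fraction can be computed from hourly temperature and relative-humidity logs (the comfort bounds below are illustrative assumptions, not the study's actual criteria):

```python
def comfort_fraction(records, t_range=(22.0, 28.0), rh_max=70.0):
    """Return the fraction of hourly (temp C, RH %) records that fall
    outside an assumed thermal comfort zone (illustrative bounds)."""
    outside = sum(
        1 for t, rh in records
        if not (t_range[0] <= t <= t_range[1] and rh <= rh_max)
    )
    return outside / len(records)

# Illustrative hourly log: (temperature in C, relative humidity in %)
log = [(24.0, 60.0), (31.5, 85.0), (27.0, 65.0), (33.0, 90.0)]
print(f"{comfort_fraction(log):.1%} of hours outside the comfort zone")
```

Applied to a full year of logged data (8,760 hourly records), the same computation yields a discomfort fraction comparable to the 72.9% reported.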

Keywords: relative humidity, renovation, temperature, thermal comfort

Procedia PDF Downloads 210
650 Dynamic Control Theory: A Behavioral Modeling Approach to Demand Forecasting amongst Office Workers Engaged in a Competition on Energy Shifting

Authors: Akaash Tawade, Manan Khattar, Lucas Spangher, Costas J. Spanos

Abstract:

Many grids are increasing the share of renewable energy in their generation mix, which is causing energy generation to become less controllable. Buildings, which consume nearly 33% of all energy, are a key target for demand response: i.e., mechanisms for demand to meet supply. Understanding the behavior of office workers is a start towards developing demand response for one sector of building technology. The literature notes that dynamic computational modeling can be predictive of individual action, especially given that occupant behavior is traditionally abstracted away from demand forecasting. Recent work founded on Social Cognitive Theory (SCT) has provided a promising conceptual basis for modeling behavior, personal states, and environment using control-theoretic principles. Here, an adapted linear dynamical system of latent states and exogenous inputs is proposed to simulate energy demand amongst office workers engaged in a social energy shifting game. The energy shifting competition is implemented in an office in Singapore that is connected to a minigrid of buildings with a consistent 'price signal.' This signal is translated into a 'points signal' by a reinforcement learning (RL) algorithm to influence participant energy use. The dynamic model functions at the intersection of the points signals, baseline energy consumption trends, and SCT behavioral inputs to simulate future outcomes. This study endeavors to analyze how the dynamic model trains an RL agent and, subsequently, the degree of accuracy to which load deferability can be simulated. The results offer a generalizable behavioral model for energy competitions that provides the framework for further research on transfer learning for RL and, more broadly, transactive control.
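The adapted linear dynamical system described above can be sketched as a latent-state update driven by exogenous inputs such as the points signal and baseline consumption. The matrices and signals below are illustrative placeholders, not the study's fitted parameters:

```python
import numpy as np

def simulate_demand(A, B, C, x0, inputs):
    """Simulate observed demand y_t = C x_t, with latent behavioral
    states updated as x_{t+1} = A x_t + B u_t, where u_t collects
    exogenous inputs (e.g., points signal, baseline consumption)."""
    x = np.asarray(x0, dtype=float)
    outputs = []
    for u in inputs:
        outputs.append(float(C @ x))   # observe demand at time t
        x = A @ x + B @ np.asarray(u, dtype=float)  # latent update
    return outputs

# Toy 2-state system: states decay over time, inputs nudge demand.
A = np.array([[0.9, 0.0], [0.1, 0.8]])
B = np.array([[0.5], [0.2]])
C = np.array([1.0, 1.0])
demand = simulate_demand(A, B, C, x0=[1.0, 0.0], inputs=[[1.0]] * 3)
```

In the study's setting, an RL agent would choose the input sequence (the points signal) and the simulated demand trajectory would inform its reward; the sketch above only shows the forward simulation step.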

Keywords: energy demand forecasting, social cognitive behavioral modeling, social game, transfer learning

Procedia PDF Downloads 103
649 Simulation Study on Effects of Surfactant Properties on Surfactant Enhanced Oil Recovery from Fractured Reservoirs

Authors: Xiaoqian Cheng, Jon Kleppe, Ole Torsaeter

Abstract:

One objective of this work is to analyze the effects of surfactant properties (viscosity, concentration, and adsorption) on surfactant enhanced oil recovery at laboratory scale. The other objective is to obtain the functional relationships between surfactant properties and the ultimate oil recovery and oil recovery rate. A core is cut into two parts from the middle to imitate a matrix with a horizontal fracture. An injector and a producer are at the left and right sides of the fracture, respectively. The middle slice of the core is used as the model in this paper; its size is 4 cm x 0.1 cm x 4.1 cm, and the aperture of the fracture in the middle is 0.1 cm. The original properties of the matrix, brine, and oil in the base case are from the Ekofisk Field. The properties of the surfactant are from the literature. Eclipse is used as the simulator. The results are as follows: 1) The viscosity of the surfactant solution has a positive linear relationship with the surfactant oil recovery time, and the relationship between viscosity and oil production rate is an inverse function. The viscosity of the surfactant solution has no obvious effect on ultimate oil recovery. Since most surfactants have no large effect on the viscosity of brine, the viscosity of the surfactant solution is not a key parameter in surfactant screening for surfactant flooding in fractured reservoirs. 2) An increase in surfactant concentration results in a decrease of the oil recovery rate and an increase of the ultimate oil recovery. However, no simple functions could describe these relationships. An economic study should be conducted, given the prices of surfactant and oil. 3) In the study of surfactant adsorption, it is assumed that the matrix wettability is changed to water-wet when surfactant adsorption is at its maximum in all cases, and the ratio of surfactant adsorption to surfactant concentration (Cads/Csurf) is used to estimate the functional relationship. 
The results show that the relationship between ultimate oil recovery and Cads/Csurf is a logarithmic function, and the oil production rate has a positive linear relationship with exp(Cads/Csurf). This work can be used as a reference for surfactant screening in surfactant enhanced oil recovery from fractured reservoirs, and the functional relationships between surfactant properties and the oil recovery rate and ultimate oil recovery help to improve upscaling methods.
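The functional forms reported above (linear, inverse, and logarithmic) can be recovered from simulation outputs by least-squares fitting. A minimal sketch of fitting the logarithmic recovery-versus-Cads/Csurf relationship, using synthetic data in place of the actual simulation results (the coefficients 12.0 and 40.0 are illustrative, not the study's values):

```python
import numpy as np

# Synthetic data following the reported logarithmic form:
#   ultimate_recovery = a * ln(Cads/Csurf) + b
ratio = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # Cads/Csurf values
recovery = 12.0 * np.log(ratio) + 40.0           # recovery, %

# Linearize by taking ln(ratio), then fit a straight line.
a, b = np.polyfit(np.log(ratio), recovery, deg=1)
print(a, b)  # recovers approximately 12.0 and 40.0
```

The same linearize-then-fit pattern applies to the other reported relations, e.g., regressing production rate on exp(Cads/Csurf) for the positive linear relationship.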

Keywords: fractured reservoirs, surfactant adsorption, surfactant concentration, surfactant EOR, surfactant viscosity

Procedia PDF Downloads 170
648 Extraction and Encapsulation of Carotenoids from Carrot

Authors: Gordana Ćetković, Sanja Podunavac-Kuzmanović, Jasna Čanadanović-Brunet, Vesna Tumbas Šaponjac, Vanja Šeregelj, Jelena Vulić, Slađana Stajčić

Abstract:

The color of food is one of the decisive factors for consumers. The potential toxicity of artificial food colorants has led to consumers' preference for natural products over products with artificial colors. Natural pigments have many bioactive functions, such as antioxidant and provitamin activity, among many others. With this in mind, the acceptability of natural colorants by consumers is much higher. Being present in all photosynthetic plant tissues, carotenoids are probably the most widespread pigments in nature. Carrot (Daucus carota) is a good source of functional food components and is especially rich in carotenoids, mainly α- and β-carotene and lutein. For this study, carrot was extracted using classical extraction with hexane and ethyl acetate, as well as supercritical CO₂ extraction. The extraction efficiency was evaluated by estimating the carotenoid yield determined spectrophotometrically. Classical extraction using hexane (18.27 mg β-carotene/100 g DM) was the most efficient method for isolating carotenoids, compared to classical extraction with ethyl acetate (15.73 mg β-carotene/100 g DM) and supercritical CO₂ extraction (0.19 mg β-carotene/100 g DM). The three carrot extracts were also tested for antioxidant activity using DPPH and reducing power assays. Surprisingly, the ethyl acetate extract had the best antioxidant activity against DPPH radicals (AADPPH = 120.07 μmol TE/100 g), while the hexane extract showed the best reducing power (RP = 1494.97 μmol TE/100 g). The hexane extract was chosen as the most potent source of carotenoids and was encapsulated in whey protein by freeze-drying. The carotenoid encapsulation efficiency was found to be high (89.33%). Based on our results, it can be concluded that carotenoids from carrot can be efficiently extracted using hexane and the classical extraction method. This extract has the potential to be applied in encapsulated form due to its high encapsulation efficiency and coloring capacity. 
Therefore, it can be used for the development of dietary supplements and for food fortification.

Keywords: carotenoids, carrot, extraction, encapsulation

Procedia PDF Downloads 268
647 Theory of Negative Trigger: The Contract between Oral Probiotics and Immune System

Authors: Cliff Shunsheng Han

Abstract:

Identifying a direct allergy cause that can be easily mitigated is the foundation for stopping the allergy epidemic that started in the seventies. It has been confirmed that personal and social hygiene practices are associated with allergy prevalence, but no direct causes had been found, and proposed translational measures have not been effective. This study, assisted by a particular case of allergies, has identified a possible direct cause of allergies, developed and validated an intervention that resulted in lasting relief from allergies, and constructed a theory describing the general relationship between the microbiota and the host immune system. Saliva samples were collected from a subject for three years, during which time the person experienced year-long allergy, seasonal allergy, and remission of allergy symptoms. Bacterial DNA was extracted, and 16S rRNA genes were profiled with Illumina sequencing technology. The results indicate that a possible direct cause of allergy is the lack of probiotic bacteria in the oral cavity, such as the genera Streptococcus and Veillonella, which can produce metabolites that pacify the immune system. Targeted promotion of those bacteria with a compound designed for them has led to lasting remission of allergic rhinitis. During the development of the translational measure, the subject's oral biofilm was completely destroyed by a moderate fever due to an unrelated respiratory infection. The incident not only facilitated the development of the heat-based microbiota reseeding procedure but also indicated a possible natural switch that subsequently increases the efficacy of an immune system previously restrained by metabolites from the microbiota. These results lead to the proposal of a Theory of Negative Trigger (TNT) to describe the relationship between oral probiotics and the immune system, in which probiotics are the negative trigger that releases the power of the immune system when removed by fever or by modern lifestyles. 
This study could open doors leading to further understanding of how the immune system functions under the influence of the microbiota, as well as help validate simple traditional practices for healthy living.

Keywords: oral microbiome, allergy, immune system, infection

Procedia PDF Downloads 126
646 Executive Deficits in Non-Clinical Hoarders

Authors: Thomas Heffernan, Nick Neave, Colin Hamilton, Gill Case

Abstract:

Hoarding is the acquisition of, and failure to discard, possessions, leading to excessive clutter and significant psychological/emotional distress. From a cognitive-behavioural approach, excessive hoarding arises from information-processing deficits, as well as from problems with emotional attachment to possessions and beliefs about the nature of possessions. In terms of information processing, hoarders have shown deficits in executive functions, including working memory, planning, inhibitory control, and cognitive flexibility. However, previous research is often confounded by co-morbid factors such as anxiety, depression, or obsessive-compulsive disorder. The current study adopted a cognitive-behavioural approach, specifically assessing executive deficits and working memory in a non-clinical sample of hoarders, compared with non-hoarders. In this study, a non-clinical sample of 40 hoarders and 73 non-hoarders (defined by the Savings Inventory-Revised) completed the Adult Executive Functioning Inventory, which measures working memory and inhibition; the Dysexecutive Questionnaire-Revised, which measures general executive function; and the Hospital Anxiety and Depression Scale, which measures mood. The participant sample was made up of unpaid young adult volunteers who were undergraduate students and who completed the questionnaires on a university campus. The results revealed that, after observing no differences between hoarders and non-hoarders in age, sex, and mood, hoarders reported significantly more deficits in inhibitory control and general executive function than non-hoarders. There was no between-group difference in general working memory. This suggests that non-clinical hoarders have a specific difficulty with inhibitory control, which enables one to resist repeated, unwanted urges. This might explain the hoarder's inability to resist urges to buy and keep items that are no longer of any practical use. 
These deficits may be underpinned by general executive function deficiencies.

Keywords: hoarding, memory, executive, deficits

Procedia PDF Downloads 189
645 Phenotypical and Genotypical Diagnosis of Cystic Fibrosis in 26 Cases from East and South Algeria

Authors: Yahia Massinissa, Yahia Mouloud

Abstract:

Cystic fibrosis (CF), the most common lethal genetic disease in the European population, is caused by mutations in the cystic fibrosis transmembrane conductance regulator gene (CFTR). It affects most organs with epithelial tissue, the site of transepithelial hydroelectrolytic transport, notably the airways, the pancreas, the biliary tract, the intestine, the sweat glands, and the genital tract. The gene whose anomalies are responsible for cystic fibrosis encodes a chloride-channel protein named CFTR (cystic fibrosis transmembrane conductance regulator) that exercises multiple functions in the cell, one of the most important being the control of sodium and chloride transport across epithelia. The deficient function manifests notably as an abnormal production of viscous secretions that obstruct the excretory channels of the target organs: dilatation, inflammation, and atrophy of these organs are then observed. It also manifests as an increased concentration of sodium and chloride in sweat, which is the basis of the sweat test. In order to perform a phenotypical and genotypical diagnosis in a part of the Algerian population, our survey was carried out on 16 patients with symptoms evocative of cystic fibrosis, in whom the clinical picture was confirmed by a sweat test. Anomalies of the CFTR gene were determined by polyacrylamide gel electrophoresis of PCR (polymerase chain reaction) products after enzymatic digestion, then visualized under ultraviolet (UV) light after staining with ethidium bromide. All mutations detected in our survey had already been identified in patients affected by this pathology in other populations of the world. However, the large number of mutations found relative to the number of patients studied testifies that the great clinical variability characterizing the disease originates in the enormous diversity of molecular defects of the CFTR gene.

Keywords: cystic fibrosis, CFTR gene, polymorphism, Algerian population, sweat test, genotypical diagnosis

Procedia PDF Downloads 308
644 Mapping of Forest Cover Change in the Democratic Republic of the Congo

Authors: Armand Okende, Benjamin Beaumont

Abstract:

Introduction: Deforestation is a change in the structure and composition of flora and fauna, which leads to a loss of biodiversity and of the production of goods and services, and to an increase in fires. It particularly concerns vast territories in tropical zones; this is the case of the territory of Bolobo in the current province of Maï-Ndombe in the Democratic Republic of Congo. Through this study of the period between 2001 and 2018, we considered it important to map and quantitatively analyze the major forest changes; this is the overall objective of this study, because significant deforestation is occurring in this area. Methodology: Mapping and quantification are the methodological approaches that we put forward to assess deforestation and forest changes through satellite images and raster layers. These satellite data from Global Forest Watch are integrated into GIS software (GRASS GIS and Quantum GIS) to represent the loss of forest cover that has occurred and the various changes recorded (e.g., forest gain) in the territory of Bolobo. Results: The results obtained quantify deforestation for the periods 2001-2006, 2007-2012, and 2013-2018 as the loss of forest area in hectares each year. The change maps produced for these study periods show that the loss of forest area is gradually increasing. Conclusion: Knowledge of forest management and protection is essential to ensure good management of forest resources. To this end, it is wise to carry out more studies to optimize the monitoring of forests and guarantee the ecological and economic functions they provide in the Congo Basin, particularly in the Democratic Republic of Congo. 
In addition, the cartographic approach, coupled with geographic information systems and remote sensing, using the raster layers provided by Global Forest Watch, yields interesting information to explain the loss of forest areas.
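The hectares-per-period quantification described above can be sketched as counting change pixels in a loss-year raster, in the style of the Global Forest Watch / Hansen data, where each cell stores the year of loss (1 = 2001, ..., 18 = 2018; 0 = no loss). The toy raster and pixel size below are illustrative (0.09 ha corresponds to a 30 m pixel):

```python
import numpy as np

def loss_by_period(loss_year, periods, pixel_area_ha):
    """Sum forest loss (ha) per period from a raster whose cells hold
    the loss year as an offset from 2000 (0 means no loss)."""
    return {
        (start, end): int(((loss_year >= start) & (loss_year <= end)).sum())
        * pixel_area_ha
        for start, end in periods
    }

# Toy 4x4 loss-year raster standing in for the Bolobo territory data
raster = np.array([[0, 3, 5, 0],
                   [9, 0, 12, 18],
                   [0, 1, 0, 14],
                   [7, 0, 16, 0]])
totals = loss_by_period(raster, [(1, 6), (7, 12), (13, 18)],
                        pixel_area_ha=0.09)
```

Dividing each period total by the number of years in the period gives the average annual loss reported per period.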

Keywords: deforestation, loss year, forest change, remote sensing, drivers of deforestation

Procedia PDF Downloads 129
643 A Systems Approach to Targeting Cyclooxygenase: Genomics, Bioinformatics and Metabolomics Analysis of COX-1 -/- and COX-2-/- Lung Fibroblasts Providing Indication of Sterile Inflammation

Authors: Abul B. M. M. K. Islam, Mandar Dave, Roderick V. Jensen, Ashok R. Amin

Abstract:

A systems approach was applied to characterize differentially expressed transcripts, bioinformatics pathways, proteins, and prostaglandins (PGs) from lung fibroblasts procured from wild-type (WT), COX-1-/-, and COX-2-/- mice in order to understand system-level control mechanisms. Bioinformatics analysis of COX-2- and COX-1-ablated cells revealed COX-1- and COX-2-specific signatures, respectively, which significantly overlapped with an 'IL-1β-induced inflammatory signature'. This defined novel cross-talk signals that orchestrated the coordinated activation of pathways of sterile inflammation sensed by cellular stress. The overlapping signals showed significant over-representation of shared pathways for interferon γ and immune responses, T cell functions, NOD, and toll-like receptor signaling. Gene Ontology Biological Process (GOBP) and pathway enrichment analyses specifically showed an increase in mRNA expression associated with: (a) organ development and homeostasis in COX-1-/- cells and (b) oxidative stress and response, spliceosome and proteasome activity, and mTOR and p53 signaling in COX-2-/- cells. COX-1-/- and COX-2-/- cells showed signs of functional pathways committed to cell cycle and DNA replication at the genomic level. As compared to WT, metabolomics analysis revealed a significant increase in COX-1 mRNA and in the synthesis of basal levels of eicosanoids (PGE2, PGD2, TXB2, LTB4, PGF1α, and PGF2α) in COX-2-ablated cells, and an increase in the synthesis of PGE2 and PGF1α in COX-1-null cells. There was a compensation of PGE2 and PGF1α in COX-1-/- and COX-2-/- cells. Collectively, these results support a broader, differential, and collaborative regulation of both COX-1 and COX-2 pathways at the metabolic, signaling, and genomic levels in cellular homeostasis and in sterile inflammation induced by cellular stress.

Keywords: cyclooxygenases, inflammation, lung fibroblasts, systemic

Procedia PDF Downloads 289
642 Characterization of Bovine SERPIN- Alpha-1 Antitrypsin (AAT)

Authors: Sharique Ahmed, Khushtar Anwar Salman

Abstract:

Alpha-1-antitrypsin (AAT) is a major plasma serine protease inhibitor (SERPIN). Hereditary AAT deficiency is one of the common genetic diseases in some parts of the world. AAT is mainly produced in the liver and functions to protect the lung against proteolytic damage (e.g., from neutrophil elastase), acting as the major inhibitor of neutrophil elastase. AAT deficiency is an under-recognized genetic condition that affects approximately 1 in 2,000 to 1 in 5,000 individuals and predisposes to liver disease and early-onset emphysema. Not only does α-1-antitrypsin deficiency lead to the disabling syndrome of pulmonary emphysema; other associated disorders include ANCA (antineutrophilic cytoplasmic antibody)-positive Wegener's granulomatosis, diffuse bronchiectasis, necrotizing panniculitis in the α-1-antitrypsin (S) phenotype, idiopathic pulmonary fibrosis, and steroid-dependent asthma. Augmentation therapy with AAT from human plasma has been available as a specific treatment for emphysema due to AAT deficiency. Apart from this, several observations have suggested a role for endogenous suppressors of HIV-1, and AAT has been identified as one of these. In view of its varied and important roles in humans, serum from a mammalian source was chosen for isolation and purification, and studies were performed on the homogeneous fraction. This study suggests that buffalo serum α-1-antitrypsin has characteristics close to those of ovine, dog, horse, and, more importantly, human α-1-antitrypsin in terms of its hydrodynamic properties, such as molecular weight and carbohydrate content. The similarity of the hydrodynamic properties of buffalo serum α-1-antitrypsin to those of α-1-antitrypsin from other mammalian sources means that it can be studied further as a potential source for "augmentation therapy", as well as a source of AAT replacement therapy to raise serum levels above the protective threshold. 
Other parameters, such as the amino acid sequence, the effect of denaturants, and the thermolability or thermostability of the inhibitor, will form the basis of interesting future studies on buffalo serum alpha-1 antitrypsin (AAT).

Keywords: α-1-antitrypsin, augmentation therapy, hydrodynamic properties, serine protease inhibitor

Procedia PDF Downloads 484
641 Construction Strategy of Urban Public Space in Driverless Era

Authors: Yang Ye, Hongfei Qiu, Yaqi Li

Abstract:

The planning and construction of traditional cities are oriented toward cars, which leads to urban public space that is insufficient, fragmented, and used with low efficiency. With the development of driverless technology, the urban structure will change from the traditional single-core grid structure to a multi-core model. In terms of traffic organization, with the release of land currently devoted to traffic facilities, public space will become more continuous and more integrated with traffic space. In the context of driverless technology, urban public space reconstruction is characterized by modularization and high efficiency, and its planning and layout follow points (service facilities), lines (smart lines), and surfaces (activity centers). The public space of driverless urban roads will provide diversified urban public facilities and services. The intensive urban layout allows commercial public space to realize the functions of central activity and style display in the interior (building atrium) and the exterior (building periphery), respectively. In addition to its recreational function, urban green space can also utilize underground parking space to realize efficient dispatching of shared cars. The roads inside residential communities will be integrated into the urban landscape, providing conditions for community public activity spaces that change over time and improving the efficiency of space utilization. The intervention of driverless technology will change the thinking of traditional urban construction and make it human-oriented. As a result, urban public space will be richer, more connected, and more efficient, and urban spatial justice will be improved. 
By summarizing frontier research, this paper discusses the impact of driverless technology on cities, especially on urban public space, which helps landscape architects cope with the future development and changes of the industry and provides a reference for related research and practice.

Keywords: driverless, urban public space, construction strategy, urban design

Procedia PDF Downloads 111
640 Method to Assessing Aspect of Sustainable Development-Walkability

Authors: Amna Ali Nasser Al-Saadi, Riken Homma, Kazuhisa Iki

Abstract:

The need to generate objective communication among researchers, practitioners, and policy makers is a top concern of sustainability. Despite the fact that many places have succeeded in achieving some aspects of sustainable urban development, there are no scientific facts to convince policy makers in the rest of the world to apply their guides and manuals, because each of them was developed to fulfill the needs of a specific city. The questions are: how to learn the lesson from each case study? How to distinguish between the promising criteria and the negative ones? And how to quantify their effects on future development? Walkability has been found to be a solution for achieving a healthy lifestyle as well as social, environmental, and economic sustainability; moreover, it is as complicated as every other aspect of sustainable development. This research stands on a quantitative-comparative methodology to assess pedestrian-oriented development. Three Analyzed Areas (AAs) were selected: one site in Oman, hypothesized to be motorized-oriented development, and two sites in Japan, where the development is pedestrian-friendly. The study used the Multi-Criteria Evaluation Method (MCEM). Initially, MCEM stands on the Analytic Hierarchy Process (AHP). The latter was structured into a main goal (walkability), objectives (functions and layout), and attributes (urban form criteria). Secondly, GIS was used to evaluate the attributes in multi-criteria maps. Since each criterion has a different scale of measurement, all results were standardized by z-score and used to measure the correlations among criteria. A different scenario was generated from each AA. After that, MCEM (AHP-OWA) based on GIS measured the walkability score and determined the priority of criteria development in the non-walker-friendly environment. As a result, the comparison of the z-scored criteria presented a measurably distinguishable orientation of development. 
This result has been used to prove that Oman is motorized environment while Japan is walkable. Also, it defined the powerful criteria and week criteria regardless to the AA. This result has been used to generalize the priority for walkable development.
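The standardization and aggregation steps described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the criterion names, values, and AHP weights are hypothetical, and the AHP-OWA combination is simplified to a weighted linear aggregation of z-scores.

```python
import statistics

def z_scores(values):
    """Standardize raw criterion measurements to z-scores so that
    criteria with different measurement scales become comparable."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def walkability_score(standardized, weights):
    """Weighted linear aggregation of standardized criteria; a
    simplified stand-in for the AHP-OWA combination step."""
    return sum(w * z for w, z in zip(weights, standardized))

# Hypothetical z-scores for three urban-form criteria in one AA
# (e.g., intersection density, land-use mix, sidewalk coverage)
# and hypothetical AHP-derived weights summing to 1.
zs = [1.2, -0.4, 0.8]
weights = [0.5, 0.3, 0.2]
print(round(walkability_score(zs, weights), 2))  # 0.64
```

Comparing such scores across the AAs is what allows the orientation of development (motorized vs. walkable) to be distinguished quantitatively.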

Keywords: walkability, sustainable development, multi-criteria evaluation method, GIS

Procedia PDF Downloads 450
639 Bilingual Experience Influences Different Components of Cognitive Control: Evidence from fMRI Study

Authors: Xun Sun, Le Li, Ce Mo, Lei Mo, Ruiming Wang, Guosheng Ding

Abstract:

Cognitive control plays a central role in information processing and comprises various components, including response suppression and inhibitory control. Response suppression is considered to inhibit irrelevant responses during cognitive processing, while inhibitory control inhibits irrelevant stimuli; both perform distinct functions within cognitive control, enhancing behavioral performance. Among the numerous factors affecting cognitive control, bilingual experience is a substantial and indispensable one. It has been reported that bilingual experience influences the neural activity of cognitive control as a whole. However, it remains unknown how bilingualism specifically affects the neural bases of the individual components of cognitive control. To explore this issue further, the study applied fMRI with an anti-saccade paradigm and compared cerebral activations between high- and low-proficiency Chinese-English bilinguals. The study thereby provides experimental evidence for language-related brain plasticity and a necessary basis for understanding the interplay between language and cognitive control. The results showed that response suppression recruited the middle frontal gyrus (MFG) in low-proficiency Chinese-English bilinguals but the inferior parietal lobe in high-proficiency Chinese-English bilinguals. Inhibitory control engaged the superior temporal gyrus (STG) and middle temporal gyrus (MTG) in low-proficiency bilinguals, whereas the right insular cortex was more active in high-proficiency bilinguals during the process. These findings provide insights into the neural influences of bilingual experience on different components of cognitive control: compared with low-proficiency bilinguals, high-proficiency bilinguals activate more advanced neural areas for the processing of cognitive control.
In addition, with the acquisition and accumulation of language, language experience shapes brain plasticity and changes the neural basis of cognitive control.

Keywords: bilingual experience, cognitive control, inhibitory control, response suppression

Procedia PDF Downloads 480
638 Rheolaser: Light Scattering Characterization of Viscoelastic Properties of Hair Cosmetics That Are Related to Performance and Stability of the Respective Colloidal Soft Materials

Authors: Heitor Oliveira, Gabriele De-Waal, Juergen Schmenger, Lynsey Godfrey, Tibor Kovacs

Abstract:

Rheolaser MASTER™ makes use of the multiple scattering of light, caused by scattering objects in a continuous medium (such as droplets and particles in colloids), to characterize the viscoelasticity of soft materials. It offers an alternative to conventional rheometers for characterizing the viscoelasticity of products such as hair cosmetics. Up to six measurements at controlled temperature can be carried out simultaneously (10-15 min), and the method requires only minor sample-preparation work. Unlike conventional rheometer-based methods, no mechanical stress is applied to the material during the measurements; the properties of the exact same sample can therefore be monitored over time, as in aging and stability studies. We determined the elastic index (EI) of water/emulsion mixtures (1 ≤ fat alcohols (FA) ≤ 5 wt%) and emulsion/gel-network mixtures (8 ≤ FA ≤ 17 wt%) and compared it with the elastic/storage modulus (G') of the respective samples measured on a TA conventional rheometer with flat-plate geometry. As expected, log(EI) vs. log(G') shows a linear relationship. Moreover, log(EI) increased linearly with solids level over the entire range of compositions (1 ≤ FA ≤ 17 wt%), whereas rheometer measurements were limited to samples down to a 4 wt% solids level; a concentric cylinder geometry would be required for more dilute samples (FA < 4 wt%), and rheometer results from different sample-holder geometries are not comparable. Plots of the Rheolaser output parameter solid-liquid balance (SLB) vs. EI were suitable for monitoring product aging processes; these data could quantitatively describe observations such as the formation of lumps over aging time. Moreover, this method allowed us to identify that the different specifications of a key raw material (RM < 0.4 wt%) in the respective gel-network (GN) product have only a minor impact on product viscoelastic properties and are not consumer-perceivable after a short aging time.
Broadening an RM specification range typically has a positive impact on cost savings. Furthermore, the photon path length (λ*), which according to Mie theory is proportional to droplet size and inversely proportional to the volume fraction of scattering objects, together with the EI, was suitable for characterizing product destabilization processes (e.g., coalescence and creaming) and for predicting product stability about eight times faster than our standard methods. Using these parameters, we successfully identified formulation and process parameters that resulted in unstable products. In conclusion, Rheolaser allows quick and reliable characterization of the viscoelastic properties of hair cosmetics that are related to their performance and stability. It operates over a broad range of product compositions, with applications spanning from the formulation of our hair cosmetics to fast release criteria at our production sites. Last but not least, this powerful tool shortens R&D development time, delivering new products to the market faster and consequently generating cost savings.
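The reported linear relationship between log(EI) and log(G') amounts to a power-law calibration between the two instruments, which can be recovered by a least-squares fit in log-log space. The sketch below uses entirely hypothetical calibration pairs for illustration; the slope and intercept of a real calibration would come from measured samples.

```python
import math

def fit_loglog(ei_values, g_values):
    """Least-squares fit of log10(EI) = slope * log10(G') + intercept,
    i.e., the linear log-log relationship between the Rheolaser
    elastic index and the rheometer storage modulus."""
    xs = [math.log10(g) for g in g_values]
    ys = [math.log10(e) for e in ei_values]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical calibration pairs: G' in Pa, EI in arbitrary units.
g_prime = [10.0, 100.0, 1000.0]
ei = [0.02, 0.2, 2.0]
slope, intercept = fit_loglog(ei, g_prime)
print(round(slope, 2))  # 1.0
```

Once fitted, such a relation lets EI values from the light-scattering instrument be mapped onto the storage-modulus scale for samples too dilute to measure on the rheometer.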

Keywords: colloids, hair cosmetics, light scattering, performance and stability, soft materials, viscoelastic properties

Procedia PDF Downloads 170
637 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights

Authors: Olga Kokoulina

Abstract:

Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by the promise of increased economic efficiency and of solutions to pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred by mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one often-cited avenue for mitigating the impact of potentially unfair decision-making practices is the so-called 'right to explanation'. In essence, this right is derived from the provisions of the General Data Protection Regulation ('GDPR') that ensure data subjects' right of access and oblige data controllers to provide information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the GDPR's specific provision on automated decision-making, the debates mainly focus on the efficacy and the exact scope of the 'right to explanation'. The underlying logic of the argued remedy lies in a transparency imperative: allowing data subjects to acquire as much knowledge as possible about the decision-making process empowers individuals to take control of their data and to take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and often heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient.
Mandating the disclosure of the technical specifications of employed algorithms, in the name of transparency for and empowerment of data subjects, potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims to push the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits that intellectual property law, namely copyright and trade secrets, poses to the transparency requirement and the right of access. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of the protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions to such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public-interest exception in trade secrets law and the introduction of a strict liability regime for non-transparent decision-making.

Keywords: algorithms, public interest, trade secrets, transparency

Procedia PDF Downloads 122