Search results for: field optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11245

1255 Effects of a Head Mounted Display Adaptation on Reaching Behaviour: Implications for a Therapeutic Approach in Unilateral Neglect

Authors: Taku Numao, Kazu Amimoto, Tomoko Shimada, Kyohei Ichikawa

Abstract:

Background: Unilateral spatial neglect (USN) is a common syndrome following damage to one hemisphere of the brain (usually the right side), in which a patient fails to report or respond to stimulation from the contralesional side. These symptoms are not due to primary sensory or motor deficits but instead reflect an inability to process input from that side of the environment. Prism adaptation (PA) is a therapeutic treatment for USN, wherein a patient’s visual field is artificially shifted laterally, resulting in a sensory-motor adaptation. However, patients with USN also tend to perceive a left-leaning subjective vertical in the frontal plane. Traditional PA cannot be used to correct a tilt in the subjective vertical, because a prism can only shift, not rotate, the surroundings. This can, however, be accomplished using a head mounted display (HMD) and a web-camera. Therefore, this study investigated whether an HMD system could be used to correct the spatial perception of USN patients in the frontal as well as the horizontal plane. We recruited healthy subjects in order to collect data for the refinement of USN patient therapy. Methods: Eight healthy subjects sat on a chair wearing an HMD (Oculus Rift DK2) with a web-camera (Ovrvision) displaying a 10-degree leftward rotation and a 10-degree counter-clockwise rotation in the frontal plane. Subjects attempted to point a finger at one of four targets, assigned randomly, a total of 48 times. Before and after the intervention, each subject’s body-centre judgment (BCJ) was tested by asking them to point a finger at a touch panel straight in front of their xiphisternum, 10 times, sight unseen. Results: The intervention caused the location pointed to during the BCJ to shift 35 ± 17 mm (mean ± SD) leftward in the horizontal plane and 46 ± 29 mm downward in the frontal plane. The results in both planes were significant by paired t-test (p < .01).
Conclusions: The results in the horizontal plane are consistent with those observed following PA. Furthermore, the HMD and web-camera were able to elicit adaptation effects in three dimensions, in both the horizontal and frontal planes. Future work will focus on applying this method to patients with and without USN, and on investigating whether subject posture is also affected by the HMD system.
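The pre/post comparison described above can be sketched numerically. The following is a minimal illustration of a paired t-test applied to body-centre judgment (BCJ) shifts; the numbers are invented for demonstration and are not the study's measurements:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t-test statistic for pre- vs post-intervention measurements."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)       # sample SD of the differences
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1                      # t statistic and degrees of freedom

# Hypothetical horizontal-plane BCJ coordinates (mm) for 8 subjects,
# pre- and post-adaptation; negative = leftward of the xiphisternum.
pre  = [2, -1, 0, 3, 1, -2, 4, 0]
post = [-30, -38, -25, -41, -33, -36, -29, -40]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

With 8 subjects (df = 7), |t| is compared against the two-tailed critical value 2.365 at the 5% level; a leftward shift of the kind reported produces a strongly negative t.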

Keywords: head mounted display, posture, prism adaptation, unilateral spatial neglect

Procedia PDF Downloads 278
1254 Mediterranean Diet-Driven Changes in Gut Microbiota Decrease the Infiltration of Inflammatory Myeloid Cells into the Intestinal Tissue

Authors: Gema Gómez-Casado, Alba Rodríguez-Muñoz, Virginia Mela-Rivas, Pallavi Kompella, Francisco José Tinahones-Madueña, Isabel Moreno-Indias, Almudena Ortega-Gómez

Abstract:

Obesity is a high-priority health problem worldwide due to its high prevalence; the proportion of obese and overweight subjects in industrialized countries exceeds half of the population in most cases. Beyond the metabolic problem, obesity boosts inflammation levels in the organism. The gut microbiota, considered an organ in itself, controls a wide variety of processes at the systemic level. In fact, the microbiota interacts closely with the immune system and is crucial in determining the maturation state of neutrophils, key effectors of the innate immune response. It is known that changes in diet exert strong effects on the variety and activity of the gut microbiota. The effect that those changes have on the microbiota-immune response axis is an unexplored field. In this study, 10 patients with obesity (weight 114.3 ± 14.5 kg, BMI 40.47 ± 3.66) followed a Mediterranean hypocaloric diet for 3 months, reducing their initial weight by 12.71 ± 3%. Microbiota from these patients, collected before and after the diet, was transplanted into wild-type “germ-free” mice (n = 10/group) treated with antibiotics. Six weeks after the transplant, mice were euthanized, and the presence of cells of the innate immune system was analysed in different organs (bone marrow, blood, spleen, visceral adipose tissue, and intestine) by flow cytometry. No differences were observed in the number of myeloid cells in the bone marrow, blood, spleen, or visceral adipose tissue of mice transplanted with the patients’ microbiota from before versus after the Mediterranean diet. However, the intestine of mice that received post-diet microbiota presented a marked decrease in the number of neutrophils (whose presence is associated with tissue inflammation), as well as of macrophages. In line with these findings, intestinal monocytes from mice with post-diet microbiota showed a less inflammatory profile (a lower proportion of Ly6Gˡᵒʷ cells).
These results point toward a decrease in the inflammatory state of the intestinal tissue, derived from changes in the gut microbiota, which occurred after a 3-month Mediterranean diet.

Keywords: obesity, nutrition, Mediterranean diet, gut microbiota, immune system

Procedia PDF Downloads 125
1253 Advanced Compound Coating for Delaying Corrosion of Fast-Dissolving Alloy in High Temperature and Corrosive Environment

Authors: Lei Zhao, Yi Song, Tim Dunne, Jiaxiang (Jason) Ren, Wenhan Yue, Lei Yang, Li Wen, Yu Liu

Abstract:

Fast-dissolving magnesium (DM) alloy technology has contributed significantly to the “Shale Revolution” in the oil and gas industry. This application requires DM downhole tools to dissolve slowly at first and then rapidly after a certain period of operation (typically 8 h to 2 days), a contradictory requirement that can hardly be addressed by traditional Mg alloying or processing alone. Premature disintegration of downhole DM tools has been broadly reported from field trials. To address this issue, “temporary” thin polymer coatings of various formulations are currently applied to the DM surface to delay its initial dissolution. Owing to part handling during conveyance, the harsh downhole environment, and the high dissolving rate of the base material, current delay coatings relying on pure polymers are found to perform well only at low temperatures (typically < 100 °C) and on parts without sharp edges or corners, as severe geometries prevent high-quality thin-film coatings from forming effectively. In this study, a coating technology combining Plasma Electrolytic Oxidation (PEO) coatings with advanced thin-film deposition has been developed, which can delay the dissolution of complex DM parts (with sharp corners) in corrosive fluid at 150 °C for over 2 days. Synergistic effects between the porous, hard PEO coating and the chemically inert, elastic polymer sealing lead to the improved dissolution delay, and strong chemical/physical bonding between these two layers has been found to play an essential role. The microstructure of this advanced coating and the compatibility between PEO and various polymer selections have been thoroughly investigated, and a model is proposed to explain the delaying performance.
This study could not only benefit the oil and gas industry in unlocking High Temperature High Pressure (HTHP) unconventional resources that were previously inaccessible, but also potentially provides a technical route for other industries (e.g., biomedical, automobile, aerospace) where primer anti-corrosive protection of light Mg alloys is in high demand.

Keywords: dissolvable magnesium, coating, plasma electrolytic oxidation, sealer

Procedia PDF Downloads 110
1252 The Influence of Modernity and Globalization upon Language: The Korean Language between Confucianism and Americanization

Authors: Raluca-Ioana Antonescu

Abstract:

The research field of the paper stands at the intersection of linguistics and sociology, and the research problem is the importance of language in the modernization process and in a globalized society. The research objective is to prove that language is a stimulant of modernity, while it also defines the tradition and the culture of a specific society. In order to examine the linguistic change of the Korean language due to modernity and globalization, the paper tries to answer one main question (what changes has the Korean language undergone, from a traditional version of Korean towards one influenced by modernity?) and two secondary questions (how does the specialized literature treat the relations between globalization, modernity and culture, with a focus on language? and what influences the Korean language?). For the purpose of answering the research questions, the paper has the main premise that, due to modernity and globalization, the Korean language has changed its discourse construction, and two secondary hypotheses: first, that the literature explores the relations between culture and modernity with respect to discourse construction relatively little, dwelling more on identity issues and commodification problems; and second, that the Korean language is influenced by traditional values (such as Confucianism) while also receiving the influence of the globalization process (especially from the English language). In terms of methodology, the paper analyzes the two main influences upon the Korean language, traditionalism (defined as the influence of Confucianism) and modernism (the influence of other countries' languages and cultures), and how the Korean language was constructed and modified by these two elements. The paper analyzes at what levels (grammatical, lexical, etc.) traditionalism helped to construct the Korean language, and what changes modernism brought along at each level. As for the results of this research, the influence of modernism changed the Korean language both lexically and grammatically. In 60 years the increase in English influence has been astonishing, and this paper shows the main changes the Korean language underwent, such as loanwords (Konglish), but also the reduction of the speech levels and the loosening of register variation. The grammatical influence of modernity and globalization can therefore be seen in the reduction of the speech levels and of register variation, while the lexical change comes especially with the influence of English, with about 10% of the Korean vocabulary considered to be loanwords. The paper also presents the interrelation between traditionalism and modernity, with Konglish as one example among others (consider also the Korean greetings that Koreans translate when speaking other languages, carrying their cultural characteristics into English discourse construction), an interrelation that makes Koreans global, since they speak an international language, but still local, since they cannot rid themselves completely of their culture.

Keywords: Confucianism, globalization, language and linguistic change, modernism, traditionalism

Procedia PDF Downloads 201
1251 Cartography through Picasso’s Eyes

Authors: Desiree Di Marco

Abstract:

The aim of this work is to show, through the lens of art, first what kind of reality was represented in fascist maps, and second to study the impact of the fascist regime's cartography (FRC) on the observer's eye. The study assumes that the FRC's representation of reality was simplified, timeless, and even a-spatial, because it underrated the concept of territoriality. Cubism and Picasso's paintings are used as counter-examples to demystify fascist cartography's ideological assumptions. The difference between the gaze of an observer looking at the surface of a fascist map and the gaze of someone observing a Picasso painting is striking, because there is always something dark and hidden behind and inside a map. The world of fascist maps was a world built from the observation of a "window" that distorted reality and trapped the eyes of observers; moving across the map, they seem hypnotized. Cartohypnosis is the state in which the observer finds himself enslaved by the attractive force of the map, which uses a sort of "magic" geography: a geography that, by means of symbolic language, never has as its primary objective to show us reality in its complexity, but rather to perform for its audience. Magical geography and hypnotic cartography blended together under fascism, creating an almost mystical, magical relationship that demystified reality in order to reduce the world to a conquerable space. This reduction offered the observer the possibility of conceiving new dimensions, of the limit and of the boundary, elements with which the subject felt fully involved and in which the aesthetic force of images demonstrated all its strength. But in the early 20th century, the combination of art and cartography gave rise to new possibilities. Cubism, which more than any other artistic current showed us how dangerously misleading it is to observe reality from a single point of view, is an example.
Cubism was an artistic movement that brought about a profound transformation of pictorial culture: not only a revolution of pictorial space, but a revolution of our conception of pictorial space. Until that time, men and women had been more inclined to believe in the power of images and their representations. Cubist painters rebelled against this blindness by claiming that art must always offer an alternative. Indeed, the contribution of this work is precisely to show how art can provide alternatives to even the most horrible regimes and the most atrocious human misfortunes. It also enriches the field of cartography by reassuring it, showing how much cartography can gain when other disciplines come close to it. Only in this way can researchers increase cartography's chances of wider diffusion at the academic level.

Keywords: cartography, Picasso, fascism, culture

Procedia PDF Downloads 64
1250 Avoidance and Selectivity in the Acquisition of Arabic as a Second/Foreign Language

Authors: Abeer Heider

Abstract:

This paper explores and classifies the different kinds of avoidance that students commonly exhibit in the acquisition of Arabic as a second/foreign language, and suggests specific strategies to help students lessen their avoidance tendencies in hopes of streamlining the learning process. Students most commonly use avoidance strategies in grammar and word choice. These different types of strategies have different implications and naturally require different approaches. Thus the question remains as to the most effective way to help students improve their Arabic, and how teachers can efficiently utilize these techniques. It is hoped that this research will contribute to understanding the role of avoidance in the field of second language acquisition in general, and as a type of input. Some researchers also note that similarity between L1 and L2 may be problematic as well, since the learner may doubt that such similarity indeed exists and consequently avoid the identical constructions or elements (Jordens, 1977; Kellermann, 1977, 1978, 1986). In an effort to resolve this issue, a case study is being conducted. The present case study attempts to provide a broader analysis of what is acquired than is usually the case, analyzing the learners' accomplishments in terms of the three-part framework of the components of communicative competence suggested by Michael Canale: grammatical competence, sociolinguistic competence, and discourse competence. The subjects of this study are 15 advanced-level students, 22 years of age, who came to study Arabic at Qatar University of Cairo. They had completed the intermediate level in Arabic when they arrived in Qatar for the first time. The study used a discourse-analytic method to examine how the first language affects students' production and output in the second language, and how and when students use avoidance methods in their learning.
The study will be conducted through Fall 2015 by analyzing audio recordings made throughout the entire semester; there will be around 30 clips. The students are using supplementary listening and speaking materials. The group will be tested at the end of the term to assess any measurable difference between the techniques. Questionnaires will be administered to teachers and students before and after the semester to assess any change in attitude toward avoidance and selectivity methods. Responses to these questionnaires are analyzed and discussed to assess the relative merits of the aforementioned avoidance and selectivity strategies. Implications and recommendations for teacher training are proposed.

Keywords: the second language acquisition, learning languages, selectivity, avoidance

Procedia PDF Downloads 276
1249 Structural Property and Mechanical Behavior of Polypropylene–Elemental Sulfur (S8) Composites: Effect of Sulfur Loading

Authors: S. Vijay Kumar, Kishore K. Jena, Saeed M. Alhassan

Abstract:

Elemental sulfur is currently produced at the level of 70 million tons annually by petroleum refining, the majority of which is used in the production of sulfuric acid, fertilizer, and other chemicals. Still, over 6 million tons of elemental sulfur are generated in excess, which creates exciting opportunities to develop new chemistry that uses sulfur as a feedstock for polymers. The development of new polymer composite materials using sulfur has not been widely explored and remains an important challenge in the field. Polymer nanocomposites prepared with carbon nanotubes, graphene, silica, and other nanomaterials are well established; the utilization of sulfur as a filler in a polymer matrix, however, could be an interesting study. This work presents the possibility of utilizing elemental sulfur as a reinforcing filler in a polymer matrix. In this study we attempted to prepare a polypropylene/sulfur nanocomposite. The physical, mechanical, and morphological properties of the newly developed composites were studied as a function of sulfur loading. In the sample preparation, four levels of elemental sulfur loading (5, 10, 20, and 30 wt. %) were designed. Composites were prepared by melt mixing using a laboratory-scale mini twin-screw extruder at 180 °C for 15 min. The reaction time and temperature were kept constant for all prepared composites. The structure and crystallization behavior of the composites were investigated by Raman, FTIR, XRD, and DSC analysis. It was observed that sulfur interferes with the crystalline arrangement of polypropylene and depresses crystallization, which affects the melting point, mechanical properties, and thermal stability. In the tensile test, one test temperature (room temperature) and one crosshead speed (10 mm/min) were used. The tensile strength and tensile modulus of the composites decreased slightly with increasing filler loading; however, the percentage elongation improved by more than 350% compared to neat polypropylene.
The effect of sulfur on the morphology of polypropylene was studied with TEM and SEM techniques. Microscopy analysis reveals that sulfur is homogeneously dispersed in the polymer matrix and behaves as a single-phase arrangement in the polymer. The maximum elongation of the polypropylene can be achieved by adjusting the sulfur loading in the polymer. This study reveals the possibility of using elemental sulfur as a solid plasticizer in the polypropylene matrix.

Keywords: crystallization, elemental sulfur, morphology, thermo-mechanical properties, polypropylene, polymer nanocomposites

Procedia PDF Downloads 343
1248 An Exploratory Study to Appraise the Current Challenges and Limitations Faced in Applying and Integrating the Historic Building Information Modelling Concept for the Management of Historic Buildings

Authors: Oluwatosin Adewale

Abstract:

The sustainability of built heritage has become a relevant issue in recent years due to the social and economic values associated with these buildings. Heritage buildings provide a means for human perception of culture and represent a legacy of long-existing history; they define the local character of the social world and provide a vital connection to the past, with their associated aesthetic and communal benefits. The identified values of heritage buildings have increased the importance of conservation and of the lifecycle management of these buildings. Recent developments of digital design technology in engineering and the built environment have led to the adoption of Building Information Modelling (BIM) by the Architecture, Engineering, Construction, and Operations (AECO) industry. BIM provides a platform for the lifecycle management of a construction project through effective collaboration among stakeholders and the analysis of a digital information model. This growth in digital design technology has also made its way into the field of architectural heritage management in the form of Historic Building Information Modelling (HBIM), a reverse-engineering process for the digital documentation of heritage assets that draws upon similar information management processes as BIM. However, despite the several scientific and technical contributions made to the development of the HBIM process, it remains difficult to integrate at the most practical level of heritage asset management. The main objective identified under the scope of the study is to review the limitations and challenges faced by heritage management professionals in adopting an HBIM-based asset management procedure for historic building projects. This paper uses an exploratory study in the form of semi-structured interviews to investigate the research problem.
A purposive sample of heritage industry experts and professionals was selected to take part in semi-structured interviews to appraise some of the limitations and challenges they have faced with the integration of HBIM into their project workflows. The findings from this study will present the challenges and limitations faced in applying and integrating the HBIM concept for the management of historic buildings.

Keywords: building information modelling, built heritage, heritage asset management, historic building information modelling, lifecycle management

Procedia PDF Downloads 95
1247 A Comparison of Antibiotic Resistant Enterobacteriaceae: Diabetic versus Non-Diabetic Infections

Authors: Zainab Dashti, Leila Vali

Abstract:

Background: The Middle East, and in particular Kuwait, has one of the highest rates of diabetes in the world. Generally, infections resistant to antibiotics among the diabetic population have been shown to be on the rise. This is the first study in Kuwait to compare the antibiotic resistance profiles and genotypic differences between resistant isolates of Enterobacteriaceae obtained from diabetic and non-diabetic patients. Material/Methods: In total, 65 isolates were collected from diabetic patients, consisting of 34 E. coli, 15 K. pneumoniae and 16 other Enterobacteriaceae species (including Salmonella spp., Serratia spp. and Proteus spp.). In our control group, a total of 49 isolates consisting of 37 E. coli, 7 K. pneumoniae and 5 other species (including Salmonella spp., Serratia spp. and Proteus spp.) were included. Isolates were identified at the species level, and antibiotic resistance profiles, including colistin, were determined using initially the Vitek system followed by double-dilution MIC and E-test assays. Multi-drug resistance (MDR) was defined as resistance to a minimum of three antibiotics from three different classes. PCR was performed to detect ESBL genes (blaCTX-M, blaTEM & blaSHV), fluoroquinolone resistance genes (qnrA, qnrB, qnrS & aac(6’)-lb-cr) and carbapenem resistance genes (blaOXA, blaVIM, blaGIM, blaKPC, blaIMP & blaNDM) in both groups. Pulsed-field gel electrophoresis (PFGE) was performed to compare the clonal relatedness of the E. coli and K. pneumoniae isolates. Results: Colistin resistance was determined in three isolates, with MICs of 32-128 mg/L. Significant differences in resistance to ampicillin (diabetic 93.8% vs control 72.5%, p < 0.002), Augmentin (80% vs 52.5%, p < 0.003), cefuroxime (69.2% vs 45%, p < 0.0014), ceftazidime (73.8% vs 42.5%, p < 0.001) and ciprofloxacin (67.6% vs 40%, p < 0.005) were determined.
Also, a significant difference in MDR rates between the two groups (diabetic 76.9% vs control 57.5%, p < 0.036) was found. All antibiotic resistance genes showed a higher prevalence in the diabetic group, except for blaCTX-M, which was higher in the control group. PFGE showed a high rate of diversity within each group of isolates. Conclusions: Our results suggest an alarming rate of antibiotic resistance, in particular colistin resistance (1.8%), among K. pneumoniae isolated from diabetic patients in Kuwait. MDR among Enterobacteriaceae infections also appears to be a worrying issue among the diabetic patients of Kuwait. More efforts are required to limit antibiotic resistance in Kuwait, especially among patients with diabetes mellitus.
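The MDR definition used in this study (resistance to a minimum of three antibiotics from three different classes) is mechanical enough to express in code. A minimal sketch follows; the antibiotic-to-class map is an illustrative subset chosen by us, not the study's full test panel:

```python
# Map each tested antibiotic to its class (illustrative subset).
ANTIBIOTIC_CLASS = {
    "ampicillin": "penicillins",
    "augmentin": "penicillins",      # amoxicillin/clavulanate combination
    "cefuroxime": "cephalosporins",
    "ceftazidime": "cephalosporins",
    "ciprofloxacin": "fluoroquinolones",
    "colistin": "polymyxins",
}

def is_mdr(resistances):
    """MDR: resistant to >= 3 antibiotics spanning >= 3 antibiotic classes."""
    classes = {ANTIBIOTIC_CLASS[ab] for ab in resistances}
    return len(resistances) >= 3 and len(classes) >= 3

isolate = ["ampicillin", "ceftazidime", "ciprofloxacin"]
print(is_mdr(isolate))
```

Note that an isolate resistant to three antibiotics from only two classes (e.g. ampicillin, Augmentin, cefuroxime) would not count as MDR under this rule.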

Keywords: antibiotic resistance, diabetes, enterobacreriacae, multi antibiotic resistance

Procedia PDF Downloads 360
1246 Evaluation of Australian Open Banking Regulation: Balancing Customer Data Privacy and Innovation

Authors: Suman Podder

Abstract:

As Australian ‘Open Banking’ allows customers to share their financial data with accredited Third-Party Providers (‘TPPs’), it is necessary to evaluate whether the regulators have achieved a balance between protecting customer data privacy and promoting data-related innovation. Recognising the need to increase customers’ influence on their own data, and the benefits of data-related innovation, the Australian Government introduced the ‘Consumer Data Right’ (‘CDR’) to the banking sector through the Open Banking regulation. Under Open Banking, TPPs can access customers’ banking data, which allows the TPPs to tailor their products and services to customer needs at a more competitive price. This facilitated access and use of customer data will promote innovation by providing opportunities for new products and business models to emerge and grow. However, the success of Open Banking depends on the willingness of customers to share their data, so the regulators have augmented the protection of data by introducing new privacy safeguards to instill confidence and trust in the system. The dilemma in policymaking is that, on the one hand, lenient data privacy laws will help the flow of information, but at the risk of individuals’ loss of privacy; on the other hand, stringent laws that adequately protect privacy may dissuade innovation. Using theoretical and doctrinal methods, this paper examines whether the privacy safeguards under Open Banking will add to the compliance burden of the participating financial institutions, resulting in the undesirable effect of stifling other policy objectives such as innovation. The contribution of this research is three-fold. In the emerging field of customer data sharing, this research is one of the few academic studies on the objectives and impact of Open Banking in the Australian context.
Additionally, Open Banking is still in the early stages of implementation, so this research traces the evolution of Open Banking through the policy debates regarding the desirability of customer data-sharing. Finally, the research not only juxtaposes customers’ data privacy with the equally important objective of promoting innovation, but also highlights the critical issues facing the data-sharing regime. This paper argues that while it is challenging to develop a regulatory framework that protects data privacy without impeding innovation and jeopardising as yet unknown opportunities, data privacy and innovation promote different aspects of customer welfare. This paper concludes that if the regulation is appropriately designed and implemented, the benefits of data-sharing will outweigh the cost of compliance with the CDR.

Keywords: consumer data right, innovation, open banking, privacy safeguards

Procedia PDF Downloads 139
1245 Airborne CO₂ Lidar Measurements for Atmospheric Carbon and Transport: America (ACT-America) Project and Active Sensing of CO₂ Emissions over Nights, Days, and Seasons 2017-2018 Field Campaigns

Authors: Joel F. Campbell, Bing Lin, Michael Obland, Susan Kooi, Tai-Fang Fan, Byron Meadows, Edward Browell, Wayne Erxleben, Doug McGregor, Jeremy Dobler, Sandip Pal, Christopher O'Dell, Ken Davis

Abstract:

The Active Sensing of CO₂ Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES) is a NASA Langley Research Center instrument, funded by NASA’s Science Mission Directorate, that seeks to advance technologies critical to measuring atmospheric column carbon dioxide (CO₂) mixing ratios in support of the NASA ASCENDS mission. The ACES instrument, an Intensity-Modulated Continuous-Wave (IM-CW) lidar, was designed for high-altitude aircraft operations and can be directly applied to space instrumentation to meet the ASCENDS mission requirements. The ACES design demonstrates advanced technologies critical for developing an airborne simulator and spaceborne instrument with lower size, mass, and power requirements and with improved performance. The Atmospheric Carbon and Transport – America (ACT-America) project is an Earth Venture Suborbital-2 (EVS-2) mission sponsored by the Earth Science Division of NASA’s Science Mission Directorate. A major objective is to enhance knowledge of the sources/sinks and transport of atmospheric CO₂ through the application of remote and in situ airborne measurements of CO₂ and other atmospheric properties on a range of spatial and temporal scales. ACT-America consists of five campaigns to measure regional carbon and evaluate transport under various meteorological conditions in three regional areas of the Continental United States. Regional CO₂ distributions of the lower atmosphere were observed from the C-130 aircraft by the Harris Corp. Multi-Frequency Fiber Laser Lidar (MFLL) and the ACES lidar. The airborne lidars provide unique data that complement the more traditional in situ sensors. This presentation shows the applications of CO₂ lidars in support of these science needs.

Keywords: CO₂ measurement, IMCW, CW lidar, laser spectroscopy

Procedia PDF Downloads 160
1244 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of these have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with an analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, the three polysemantic synonyms true, loyal and faithful were analyzed in terms of the differences and similarities in their semantic structure. A corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, reference was made to two corpora to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study there were no special requirements regarding the genre, mode or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g. word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g. true, true to) and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study.
Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders or ‘True,’ said Phoebe, ‘but I'd probably get to be a Union Official immediately’ were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable and convenient tool for obtaining data for further semantic study.
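The co-occurrence extraction described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' procedure: it runs over a small invented text sample rather than the BNC/COCA interfaces, and the `window` size and tokenization rule are assumptions.

```python
from collections import Counter
import re

def cooccurrences(text, target, window=3):
    """Count words appearing within `window` tokens of each occurrence of `target`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            lo, hi = max(0, i - window), i + window + 1
            # count neighbours inside the window, skipping the target itself
            counts.update(t for j, t in enumerate(tokens[lo:hi], lo) if j != i)
    return counts

sample = ("He stayed true to his word. A faithful friend is true in hard times. "
          "She was loyal and true to the cause.")
print(cooccurrences(sample, "true").most_common(3))
```

Real corpus tools (e.g. the BNC/COCA web interfaces) add lemmatization and frequency statistics on top of this basic windowed counting.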

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 313
1243 The Levels of Neurosteroid 7β-Hydroxy-Epiandrosterone in Men and Pregnant Women

Authors: J. Vitku, L. Kolatorova, T. Chlupacova, J. Heracek, M. Hill, M. Duskova, L. Starka

Abstract:

Background: 7β-hydroxy-epiandrosterone (7β-OH-EpiA) is an endogenous steroid that has been shown to exert neuroprotective and anti-inflammatory effects in vitro as well as in animal models. However, to the best of our knowledge, no information is available on the concentrations of this androgen metabolite in the human population. The aim of the study was to measure and compare levels of 7β-OH-EpiA in men and pregnant women in different biological fluids and to evaluate the relationship between 7β-OH-EpiA in men and their sperm quality. Methods: First, a sensitive isotope-dilution high performance liquid chromatography-mass spectrometry method for the measurement of 7β-OH-EpiA in different biological fluids was developed. Validation of the method met the requirements of FDA guidelines. Afterwards, 7β-OH-EpiA was analysed in the plasma and seminal plasma of 191 men with different degrees of infertility (healthy men, lightly infertile men, moderately infertile men, severely infertile men). Furthermore, the levels of 7β-OH-EpiA were measured in the plasma of 34 pregnant women in the 37th week of gestation and in the corresponding cord plasma, which reflects steroid levels in the fetus. Results: Concentrations of 7β-OH-EpiA in seminal plasma were significantly higher in severely infertile men in comparison with healthy men and lightly infertile men. The same trend was observed when blood plasma was evaluated. Furthermore, plasma 7β-OH-EpiA correlated negatively with sperm concentration (-0.215; p < 0.01) and total sperm count (-0.15; p < 0.05). Seminal 7β-OH-EpiA was negatively associated with motility (-0.26; p < 0.01), progressively motile sperm (-0.233; p < 0.01) and nonprogressively motile sperm (-0.188; p < 0.05). Plasma 7β-OH-EpiA levels in men were generally higher in comparison with pregnant women. Levels of 7β-OH-EpiA were under the lower limit of quantification (LLOQ) in the majority of samples from pregnant women and cord plasma.
Only 4 plasma samples from pregnant women and 7 cord blood plasma samples were above the LLOQ, and these were in the range of units of pg/ml. Conclusion: Based on the available information, this is the first study measuring 7β-OH-EpiA in human samples. 7β-OH-EpiA is associated with lower sperm quality, and its role in this field is certainly worth exploring thoroughly. Interestingly, levels of 7β-OH-EpiA in pregnant women were extremely low despite the fact that steroid levels, including androgens, are generally higher during pregnancy. Acknowledgements: This work was supported by the project MH CR 17-30528 A from the Czech Health Research Council, MH CZ - DRO (Institute of Endocrinology - EU, 00023761) and by the MEYS CR (OP RDE, Excellent research - ENDO.CZ).
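The negative correlations reported above (e.g. -0.215 between plasma 7β-OH-EpiA and sperm concentration) are of the kind computed with a rank correlation. The sketch below shows how such a coefficient is obtained with SciPy; the data are randomly generated stand-ins, not the study's measurements, and the units are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical illustrative data for 20 subjects: plasma 7b-OH-EpiA (pg/ml)
# and sperm concentration (million/ml), built with a negative trend.
steroid = rng.uniform(5, 50, 20)
sperm_conc = 120 - 1.5 * steroid + rng.normal(0, 10, 20)

# Spearman rank correlation between steroid level and sperm concentration
rho, p = stats.spearmanr(steroid, sperm_conc)
print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")
```

With real data one would also report Pearson correlations or partial correlations adjusted for age, as is common in such studies.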

Keywords: 7β-hydroxy-epiandrosterone, steroid, sperm quality, pregnancy

Procedia PDF Downloads 254
1242 Advancements in AI Training and Education for a Future-Ready Healthcare System

Authors: Shamie Kumar

Abstract:

Background: Radiologists and radiographers (RR) need to educate themselves and their colleagues to ensure that AI is integrated safely and meaningfully, in a way that always benefits patients. AI education and training are fundamental to the way RR work and interact with AI, so that they feel confident using it as part of their clinical practice and understand it. Methodology: This exploratory research outlines the current educational and training gaps for radiographers and radiologists in AI radiology diagnostics. It reviews the status, skills and challenges of educating and teaching, the use of artificial intelligence within daily clinical practice, why it is fundamental, and the justification for why learning about AI is essential for wider adoption. Results: Current knowledge among RR is very sparse and country dependent, and with radiologists being the majority of the end-users of AI, their targeted AI training and learning opportunities surpass those available to radiographers. Many papers suggest there is a lack of knowledge, understanding and training in AI in radiology amongst RR, and because of this they cannot fully comprehend how AI works, how it integrates, the benefits of using it, and its limitations. There is an indication that they wish to receive specific training; however, both professions need to actively engage in learning about AI and develop the skills that enable them to use it effectively. There is expected variability amongst the professions in their degree of commitment to AI, as most do not understand its value; this only adds to the need to train and educate RR. Currently, there is little AI teaching in either undergraduate or postgraduate study programs, and it is not readily available.
In addition, there are other training programs, courses, workshops and seminars available; most of these are short, single sessions rather than a continuation of learning, and cover a basic understanding of AI and peripheral topics such as ethics, legal issues and the potential of AI. There appears to be an obvious gap between the content such training programs offer and what RR need and want to learn. Because of this, there is a risk of ineffective learning outcomes, with attendees left without clarity or depth of understanding of the practicalities of using AI in a clinical environment. Conclusion: Education, training and courses need defined learning outcomes with relevant concepts, ensuring theory and practice are taught as a continuation of the learning process, based on use cases specific to a clinical working environment. Undergraduate and postgraduate courses should be developed robustly, ensuring they are delivered with expertise in the field; in addition, training and other programs should be delivered as continued professional development and aligned with accredited institutions for a degree of quality assurance.

Keywords: artificial intelligence, training, radiology, education, learning

Procedia PDF Downloads 85
1241 Engineering Analysis for Fire Safety Using Computational Fluid Dynamics (CFD)

Authors: Munirajulu M, Srikanth Modem

Abstract:

A large cricket stadium with the capacity to accommodate several thousand spectators has a seating arena consisting of a two-tier arrangement, with an upper and a lower bowl and an intermediate concourse podium level for pedestrian movement to access the bowls. The uniqueness of the stadium is that spectators have an unobstructed view of the field of play from all around the podium. The upper and lower bowls are connected by stairs. The stair landing is a precast concrete slab supported by cantilevered steel beams, which are fixed to the precast columns supporting the stadium structure. During a fire at podium level between two staircases, the fire resistance of these steel beams is critical to life safety. If a steel beam loses its strength due to lack of fire resistance, it will be too weak to support the stair slabs and may create a hazard while evacuating occupants from the upper bowl to the lower bowl. In this study, to ascertain the fire rating and life safety, a performance-based design using CFD analysis is used to evaluate the steel beams' fire resistance. A fire size of 3.5 MW (convective heat output of the fire) with a wind speed of 2.57 m/s is considered for the fire and smoke simulation. CFD results show that the smoke temperature around the staircase does not exceed 150°C for the fire duration considered. The surface temperature of the cantilevered steel beams is found to be less than or equal to 150°C. Since this temperature is much less than the critical failure temperature of steel (520°C), it is concluded that the design of the structural steel supports on the staircase is adequate and does not need additional fire protection such as a fire-resistant coating. CFD analysis provided an engineering basis for the performance-based design of the steel structural elements and an opportunity to optimize fire protection requirements.
Thus, performance-based design using CFD modeling and simulation of fire and smoke is an innovative way to evaluate fire rating requirements, ascertain life safety and optimize the design with regard to fire protection on structural steel elements.

Keywords: fire resistance, life safety, performance-based design, CFD analysis

Procedia PDF Downloads 190
1240 How Technology Can Help Teachers in Reflective Practice

Authors: Ambika Perisamy, Asyriawati binte Mohd Hamzah

Abstract:

The focus of this presentation is to discuss teacher professional development (TPD) through the use of technology. TPD is necessary to prepare teachers for the future challenges they will face throughout their careers and to develop new skills and good teaching practices. We also discuss current issues in embracing technology in the field of early childhood education and the impact on the professional development of teachers. Participants will also learn to apply teaching and learning practices through the use of technology. One major objective of this presentation is to coherently fuse practical, technological and theoretical content. The process begins by concretizing a set of preconceived ideas, which are then joined with theoretical justifications found in the literature. Technology can make observations fairer and more reliable, easier to implement, and preferable to teachers and principals. Technology will also help principals to improve classroom observations of teachers and ultimately improve teachers' continuous professional development. Video technology allows early childhood teachers to record lessons and keep the recorded video for reflection at any time. This also provides opportunities for them to share the video with their principals for professional dialogue and continuous professional development planning. A total of 10 early childhood teachers and 4 principals were involved in these efforts, which identified and analyzed the gaps in the quality of classroom observations and their correlation with developing teachers as reflective practitioners. The methodology involves active exploration with video technology recordings, conversations, interviews and authentic teacher-child interactions, which form the key thrust in improving teaching and learning practice. A qualitative analysis of photographs, videos and transcripts illustrating teachers' reflections, together with classroom observation checklists before and after the use of video technology, was adopted.
Arguably, although PD support can be substantial, if teachers cannot connect with or create meaning out of the opportunities made available to them, they may remain passive or uninvolved. Therefore, teachers must see the value of applying new ideas, such as technology, and approaches to practice while creating personal meaning out of professional development. These video recordings are transferable and can be shared and edited through social media, email and common storage between teachers and principals. To conclude, the importance of reflective practice among early childhood teachers is addressed, along with the concerns raised before and after the use of video technology, and teachers and principals shared their views on the feasibility, practicality and relevance of video technology.

Keywords: early childhood education, reflective, improve teaching and learning, technology

Procedia PDF Downloads 499
1239 GIS Technology for Environmentally Polluted Sites with an Innovative Process to Improve the Quality of and Assess the Environmental Impact Assessment (EIA)

Authors: Hamad Almebayedh, Chuxia Lin, Yu Wang

Abstract:

The environmental impact assessment (EIA) must be improved, assessed and quality-checked for human and environmental health and safety. Soil contamination is expanding, and site and soil remediation activities are proceeding around the world; in short, quality soil characterization leads to a quality EIA, which illuminates the level and extent of contamination and reveals the unknowns for the way forward to remediate, quantify, contain, minimize and eliminate the environmental damage. Spatial interpolation methods play a significant role in decision making, planning remediation strategies, environmental management and risk assessment, as they provide essential elements of site characterization, which need to be fed into the EIA. The innovative 3D soil mapping and soil characterization technology presented in this research paper reveals unknown information and the extent of contaminated soil in particular, and enhances soil characterization information in general, which is reflected in improving the information provided when developing the EIA for specific sites. The foremost aims of this research paper are to present a novel 3D mapping technology that quality- and cost-effectively characterizes and estimates the distribution of key soil characteristics in contaminated sites, and to develop an innovative process/procedure of 'assessment measures' for EIA quality and assessment. The contaminated site and field investigation was conducted with the innovative 3D mapping technology to characterize the composition of petroleum-hydrocarbon-contaminated soils in a decommissioned oilfield waste pit in Kuwait. The results show the depth and extent of the contamination, which has been entered into a developed assessment process and procedure for the EIA quality review checklist to enhance the EIA and drive remediation and risk assessment strategies.
We have concluded that to minimize the possible adverse environmental impacts on the investigated site in Kuwait, the soil-capping approach may be sufficient and may represent a cost-effective management option as the environmental risk from the contaminated soils is considered to be relatively low. This research paper adopts a multi-method approach involving reviewing the existing literature related to the research area, case studies, and computer simulation.
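The spatial interpolation mentioned above can take many forms; the abstract does not name a specific method, so the sketch below shows inverse-distance weighting (IDW), one common choice for mapping point measurements of soil contamination. The borehole locations and TPH values are hypothetical, chosen purely for illustration.

```python
import numpy as np

def idw(sample_xy, sample_vals, query_xy, power=2.0):
    """Inverse-distance-weighted estimate at each query point."""
    sample_xy = np.asarray(sample_xy, float)
    sample_vals = np.asarray(sample_vals, float)
    out = []
    for q in np.atleast_2d(query_xy):
        d = np.linalg.norm(sample_xy - q, axis=1)
        if np.any(d == 0):                       # query coincides with a sample
            out.append(sample_vals[d == 0][0])
            continue
        w = 1.0 / d**power                       # nearer samples weigh more
        out.append(np.sum(w * sample_vals) / np.sum(w))
    return np.array(out)

# Hypothetical petroleum-hydrocarbon readings (mg/kg) at four boreholes
pts = [(0, 0), (0, 10), (10, 0), (10, 10)]
vals = [1200, 300, 800, 150]
print(idw(pts, vals, [(5, 5)]))
```

Kriging or spline methods would be alternatives where a variogram of the site data is available; IDW is shown here only because it is the simplest to state.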

Keywords: quality EIA, spatial interpolation, soil characterization, contaminated site

Procedia PDF Downloads 86
1238 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers

Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang

Abstract:

Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the major way to find PM disease; it is not only labor intensive but also makes it almost impossible to monitor disease severity. To reduce the loss caused by PM disease and achieve faster automatic detection, this paper proposes an approach for detecting the disease based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing and comprises five main steps: image acquisition, pre-processing, segmentation, feature extraction and classification. Two strawberry fields were used in this study. Images of healthy leaves and of leaves infected with PM (Sphaerotheca macularis) disease were acquired under artificial cloud lighting conditions. Colour thresholding was used to segment all images before textural analysis. The colour co-occurrence matrix (CCM) was introduced for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from CCMs of National Television System Committee (NTSC) luminance and hue, saturation and intensity (HSI) images. The normalized feature data were used for training and validation, respectively, with the developed classifiers. The classifiers were evaluated with internal, external and cross-validation, and the best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier showed 98.33%, 85.33%, 87.33%, 93.33% and 95.0% accuracy on internal, external-I, external-II, 4-fold cross and 5-fold cross-validation, respectively, whereas the kNN results represented 90.0%, 72.00%, 74.66%, 89.33% and 90.3% classification accuracy, respectively.
The outcome of this study demonstrated that the SVM classified PM disease with the highest overall accuracy of 91.86% and a processing time of 1.1211 seconds. Overall, the results indicate that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
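The classification stage described above (normalized texture features fed to SVM and kNN, scored by cross-validation) can be sketched with scikit-learn. The 40-dimensional feature matrix below is random stand-in data, not the study's CCM features, and the kernel and neighbor count are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)

# Hypothetical stand-in for 40 CCM texture features per leaf image:
# class 0 = healthy, class 1 = powdery mildew (shifted feature means).
X = np.vstack([rng.normal(0.0, 1.0, (60, 40)),
               rng.normal(1.0, 1.0, (60, 40))])
y = np.repeat([0, 1], 60)

accs = {}
for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    model = make_pipeline(MinMaxScaler(), clf)   # normalize features first
    accs[name] = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: 5-fold CV accuracy = {accs[name]:.3f}")
```

Scaling inside the pipeline keeps the normalization fitted only on each training fold, which mirrors the internal/external validation split the study describes.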

Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors

Procedia PDF Downloads 120
1237 Capitalizing 'Ba' in Knowledge Creation among Medical Researchers in a Malaysian Higher Education Institution

Authors: Connie Edang, Siti Arpah Noordin, Shamila Mohamed Shuhidan

Abstract:

For the past few decades, there have been growing numbers of knowledge-based industries in Malaysia. As competitive edge has become so important nowadays, research and development (R&D) should be given the highest priority. Like other industries, HEIs are also contributors to the nation's development and wealth. Hence, to become a hub for creating a knowledge-based society, HEIs are responsible not only for producing skillful human capital but also for getting involved in R&D. With the importance of R&D in today's modern economy and the rise of science and technology, researchers have opportunities to explore this sector so as to place Malaysia as a provider in some key strategic industries, including the medical and health sciences field. Academic researchers/medical researchers possess unique tacit knowledge and skills in accordance with their experience and professional areas of expertise. In completing collaborative research work, there must be platforms to enable the conversion of their knowledge and hence support the creation of new knowledge. The objectives of this study are to: i) explore the knowledge creation activities of medical researchers in a Malaysian Higher Education Institution (HEI); ii) explore the driving forces for knowledge creation activities among the researchers; and iii) explore the interpretation by medical researchers of the establishment of 'ba' in the creation of knowledge. Based on the SECI model introduced by Nonaka and Takeuchi and the Japanese concept of 'ba', a qualitative study was conducted in which semi-structured interviews were used to gather the informants' viewpoints and insights based on their experience capitalizing on 'ba' to support their knowledge creation activities. A single case study was conducted at one of the HEIs located in Sabah. From this study, both face-to-face interaction and ICT-assisted tools are found to be significant in supporting the interaction of their knowledge.
ICT seems to ease their interaction with other research collaborators. However, this study revealed that interaction conducted in physical settings is still the option best preferred by the medical researchers, especially in situations where their knowledge is hard to externalize. Moreover, it revealed that motivational factors play important roles as driving forces affecting their knowledge creation activities. In addition, the medical researchers noted that mixed interaction brings value in terms of facilitating knowledge creation. Therefore, this study would benefit the institution by helping it optimize the utilization of good platforms so that knowledge can be transferred and made use of by others in appropriate ways.

Keywords: ‘ba’, knowledge creation dynamics, Malaysia, higher education institution, medical researchers

Procedia PDF Downloads 213
1236 Mapping Man-Induced Soil Degradation in Armenia's High Mountain Pastures through Remote Sensing Methods: A Case Study

Authors: A. Saghatelyan, Sh. Asmaryan, G. Tepanosyan, V. Muradyan

Abstract:

One major concern for Armenia has been soil degradation, which has emerged as a result of unsustainable management and use of grasslands and in turn largely impacts the environment, agriculture and, finally, human health. Hence, assessment of soil degradation is an essential and urgent objective, set out to measure its possible consequences and develop a potential management strategy. Recently, remote sensing (RS) technologies have become an essential tool for assessing pasture degradation. This research was done with the intention of measuring the precision of linear spectral unmixing (LSU) and NDVI-SMA methods in estimating soil surface components related to degradation (fractional vegetation cover (FVC), bare soil fractions, surface rock cover) and determining the appropriateness of these methods for mapping man-induced soil degradation in high mountain pastures. Taking into consideration the spatially complex and heterogeneous biogeophysical structure of the studied site, we used high-resolution multispectral QuickBird imagery of a pasture site in one of Armenia's rural communities, Nerkin Sasoonashen. The accuracy assessment was done by comparing the land cover abundance data derived through RS methods with the ground truth land cover abundance data. A significant regression was established between the ground truth FVC estimate and both the NDVI-LSU and LSU produced vegetation abundance data (R2=0.636, R2=0.625, respectively). For bare soil fractions, linear regression produced a general coefficient of determination R2=0.708. Because of the poor spectral resolution of the QuickBird imagery, LSU failed in the assessment of surface rock abundance (R2=0.015). This research has well documented that reduction in vegetation cover runs in parallel with an increase in man-induced soil degradation, whereas in the absence of man-induced soil degradation the bare soil fraction does not exceed a certain level.
The outcomes show that the proposed method of man-induced soil degradation assessment through FVC, bare soil fractions and field data adequately reflects the current status of soil degradation throughout the studied pasture site and may be employed as an alternative to more complicated models for soil degradation assessment.
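Linear spectral unmixing, as used above, models each pixel's spectrum as a linear mixture of endmember spectra (here vegetation, bare soil, rock) and solves for the fractional abundances. The sketch below uses least squares with a clip-and-renormalize step; the four-band endmember spectra are invented for illustration and are not QuickBird calibrations.

```python
import numpy as np

# Hypothetical endmember spectra (rows: vegetation, bare soil, rock)
# over four bands -- illustrative values only.
E = np.array([[0.05, 0.08, 0.45, 0.50],   # vegetation (red-edge jump)
              [0.20, 0.25, 0.30, 0.35],   # bare soil
              [0.15, 0.15, 0.16, 0.17]])  # rock

def unmix(pixel, endmembers):
    """Unconstrained least-squares LSU, then clip negatives and
    renormalize so fractions are non-negative and sum to one."""
    f, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
    f = np.clip(f, 0, None)
    return f / f.sum()

# A pixel mixed from 60% vegetation and 40% bare soil
pixel = 0.6 * E[0] + 0.4 * E[1]
frac = unmix(pixel, E)
print(frac)
```

Fully constrained LSU (enforcing the sum-to-one and non-negativity constraints inside the solver) is the more rigorous variant; the clip-and-renormalize step here is a simple approximation of it.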

Keywords: Armenia, linear spectral unmixing, remote sensing, soil degradation

Procedia PDF Downloads 326
1235 Effects of Group Cognitive Restructuring and Rational Emotive Behavioral Therapy on Psychological Distress of Awaiting-Trial Inmates in Correctional Centers in North-West, Nigeria

Authors: Muhammad Shafi’U Adamu

Abstract:

This study examined the effects of two groups of Cognitive Behavioral Therapies (CBT), namely Cognitive Restructuring (CR) and Rational Emotive Behavioral Therapy (REBT), on the psychological distress of awaiting-trial inmates in correctional centers in North-West Nigeria. The study had four specific objectives, four research questions and four null hypotheses. The study used a quasi-experimental design involving a pre-test and a post-test. The population comprised all 7,962 awaiting-trial inmates in correctional centers in North-West Nigeria. 131 awaiting-trial inmates from three intact correctional centers were selected using the census technique. The respondents were sampled and randomly assigned to 3 groups (CR, REBT and control). The Kessler Psychological Distress Scale (K10) was adapted for data collection in the study. The instrument was validated by experts and subjected to a pilot study, yielding a Cronbach's alpha reliability coefficient of 0.772. Each group received treatment for 8 consecutive weeks (60 minutes/week). Data collected from the field were subjected to descriptive statistics (mean, standard deviation and mean difference) to answer the research questions. Inferential statistics (ANOVA and the independent-sample t-test) were used to test the null hypotheses at the P ≤ 0.05 level of significance. Results in the study revealed that there was no significant difference among the pre-treatment mean scores of the experimental and control groups. Statistical evidence also showed a significant difference among the mean scores of the three groups, and the results of the post hoc multiple-comparison test indicated a post-treatment reduction of psychological distress in the awaiting-trial inmates. The documented output also showed a significant difference between the post-treatment psychological distress mean scores of male and female awaiting-trial inmates, but there was no such difference among those exposed to REBT.
The research recommends that a standardized, structured CBT counseling treatment be designed for correctional centers across Nigeria, and that CBT counseling techniques be used in the treatment of psychological distress in both correctional and clinical settings.
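The inferential tests named above (one-way ANOVA across the three groups, plus an independent-sample t-test) can be run with SciPy as follows. The post-treatment K10 scores below are randomly generated, hypothetical numbers chosen to mimic the reported pattern (lower distress in the treatment groups), not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical post-treatment K10 distress scores (10-50 scale), 40 per group.
cr   = rng.normal(18, 4, 40)   # cognitive restructuring
rebt = rng.normal(19, 4, 40)   # REBT
ctrl = rng.normal(30, 4, 40)   # control

# One-way ANOVA across the three groups
f_stat, p_anova = stats.f_oneway(cr, rebt, ctrl)
# Independent-sample t-test: CR vs control
t_stat, p_t = stats.ttest_ind(cr, ctrl)

print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"CR vs control t-test: t = {t_stat:.2f}, p = {p_t:.4f}")
```

A significant ANOVA would then be followed by a post hoc multiple-comparison test (e.g. Tukey's HSD) to locate which group pairs differ, as the abstract describes.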

Keywords: awaiting-trial inmates, cognitive restructuring, correctional centers, rational emotive behavioral therapy

Procedia PDF Downloads 74
1234 Concepts of Creation and Destruction as Cognitive Instruments in World View Study

Authors: Perizat Balkhimbekova

Abstract:

Evolutionary changes in the cognitive world view taking place in the last decades are followed by changes in the perception of key concepts related to a certain lingua-cultural sphere. Such concepts also reflect a person's attitude to essential processes in the sphere of concepts, e.g. opposite operations like creation and destruction. These changes in people's lives and thinking are displayed in a language world view. In order to uncover the content of mental structures and concepts, we should use language means as observable results of people's cognitive activity. The semantics of words, free phrases and idioms should be considered an authoritative source of information concerning concepts. The regularized set of concepts in people's consciousness forms the sphere of concepts. Cognitive linguistics widely discusses the sphere of concepts as its crucial category, defining it as the field of knowledge made up of concepts. It is considered that a sphere of concepts comprises various types of association and forms conceptual fields. As material for the given research, data from the Russian National Corpus (RNC) and the British National Corpus (BNC) were used. It is necessary to point out that data provided by computational studies are intrinsic and verifiable, so we have used them in order to get reliable results. The procedure of the study was based on such techniques as extracting the contexts containing the concepts of creation/destruction from the RNC and BNC; analyzing and interpreting those contexts on the basis of a cognitive approach; and finding the correspondence between the given concepts in the Russian and English world views. The key problem of our study is to find the correspondence between the elements of the world view represented by the opposite concepts of creation and destruction. Findings: The concept of "destruction" indicates a process which leads to the full or partial destruction of an object.
In other words, it is a loss of the object's primary essence: its structure, properties, distinctive signs and initial integrity. The concept of "creation", on the contrary, comprises positive characteristics and represents activity aimed at the improvement of a certain object, at the creation of ideal models of the world. On the other hand, destruction is represented much more widely in the RNC than creation (1254 contexts for the first concept compared to 192 for the second). Our hypothesis consists in the antinomy represented by the aforementioned concepts: being opposite in semantics, pragmatics and axiology, they are at the same time complementary and interrelated concepts.
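The 1254 vs. 192 asymmetry between the two concepts can be checked for statistical significance with a simple goodness-of-fit test, a step the abstract does not describe but which is a standard complement to such frequency comparisons. The sketch below tests the counts against the null hypothesis that the two concepts are equally frequent in the corpus sample.

```python
from scipy.stats import chisquare

# Observed context counts for 'destruction' vs 'creation' in the RNC sample
obs = [1254, 192]

# Chi-square goodness-of-fit against equal expected frequencies
chi2, p = chisquare(obs)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")
```

A log-likelihood ratio (G-test) would be the usual corpus-linguistics alternative; the chi-square version is shown only because SciPy provides it directly.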

Keywords: creation, destruction, concept, world view

Procedia PDF Downloads 343
1233 Sensory Ethnography and Interaction Design in Immersive Higher Education

Authors: Anna-Kaisa Sjolund

Abstract:

The doctoral thesis examines interaction design and sensory ethnography as tools to create immersive education environments. In recent years, there has been increasing interest and discussion among researchers and educators in immersive education, such as augmented reality tools and virtual glasses, and the possibilities of utilizing them in education at all levels. Using virtual devices as learning environments, it is possible to create multisensory learning environments. Sensory ethnography in this study refers to the way the senses are considered in their impact on information dynamics in immersive learning environments. The past decade has seen rapid development in virtual world research and virtual ethnography. Christine Hine's Virtual Ethnography offers an anthropological explanation of net behavior and communication change. Despite her groundbreaking work, time has changed users' communication styles and brought new ways of doing ethnographical research. Virtual reality, with all its new potential and its engagement of all the senses, has come to the fore. Film and image have played an important role in cultural research for centuries; only the focus has changed in different times and different fields of research. According to Karin Becker, the role of the image in our society is information flow, and she identified two meanings of what the research of visual culture is. Images and pictures are the artifacts of visual culture. Images can be viewed as a symbolic language that allows digital storytelling. Combining the sense of sight with the other senses, such as hearing, touch, taste, smell and balance, the use of a virtual learning environment offers students a way to absorb large amounts of information more easily. It also offers teachers different ways to produce study material. This article approaches the core question using sensory ethnography as a research tool.
Sensory ethnography is used to describe information dynamics in an immersive environment through interaction design. An immersive education environment is understood as a three-dimensional, interactive learning environment in which the audiovisual aspects are central but all senses can be taken into consideration. When designing learning environments, or any digital service, interaction design is always needed. The question of what interaction design is remains justified, because there is no simple or consistent idea of what interaction design is, how it can be used as a research method, or whether it is only a description of practical actions. When discussing immersive learning environments or their construction, consideration should be given to interaction design and sensory ethnography.

Keywords: immersive education, sensory ethnography, interaction design, information dynamics

Procedia PDF Downloads 136
1232 Transformation of the Institutionality of International Cooperation in Ecuador from 2007 to 2017: A Case of State Identity Affirmation through Role Performance

Authors: Natalia Carolina Encalada Castillo

Abstract:

As part of an intended radical policy change compared to former administrations in Ecuador, the transformation of the institutionality of international cooperation during the period of President Rafael Correa was considered a key element in the construction of the state of 'Good Living'. This intention led to several regulatory changes in the reception of cooperation for development, and even to the departure of some foreign cooperation agencies. Moreover, Ecuador launched an initiative to become a donor of cooperation to other developing countries through the ‘South-South Cooperation’ approach. All these changes were institutionalized through the Ecuadorian System of International Cooperation as a new framework to establish rules and policies guaranteeing sovereign management of foreign aid. This research project has therefore been guided by two questions: What were the factors that motivated the transformation of the institutionality of international cooperation in Ecuador from 2007 to 2017? And what were the implications of this transformation for the international role of the country? This paper seeks to answer these questions through Role Theory within a Constructivist meta-theoretical perspective, considering that in this case changes at the institutional level in the field of cooperation responded not only to material motivations but also to interests built on the basis of a specific state identity. The latter could only be affirmed through specific roles, such as ‘sovereign recipient of cooperation’ and ‘donor of international cooperation’. However, the performance of these roles was problematic, as they were not easily accepted by other actors in the international arena or at the domestic level. In terms of methodology, these dynamics are analyzed qualitatively, mainly through interpretive analysis of the discourse of high-level decision-makers from Ecuador and other cooperation actors. 
Complementing this, document-based research of relevant material as well as interviews were conducted. Finally, it is concluded that even if material factors such as infrastructure needs, trade and investment interests, and the reinforcement of state control and monitoring of cooperation flows motivated the institutional transformation of international cooperation in Ecuador, the essential basis of these changes was the search for a new identity for the country to project in the international arena. This identity has started to be built but remains unstable. It is therefore important to build on the achievements of the new international cooperation policies and review their weaknesses, so that the non-reimbursable cooperation funds received, as well as ‘South-South cooperation’ actions, contribute effectively to national objectives.

Keywords: Ecuador, international cooperation, Role Theory, state identity

Procedia PDF Downloads 211
1231 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can enable more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in the area of cancer liquid biopsy integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to substantial cost and time. To overcome these limitations, predicting OCRs directly from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To this end, local sequencing depth is fed to our algorithm, which predicts the most probable open chromatin regions from the whole genome sequencing data. Our method applies signal processing to the sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering. 
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As expected, OCRs are mostly located at the transcription start sites (TSS) of genes. We therefore compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, finding agreement of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variation. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about the distribution of OCRs, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, resulting in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used alongside other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
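The processing chain described above (depth normalization, Fourier-domain conversion, graph construction, graph-cut optimization, clustering) can be loosely sketched in code. The version below is a deliberately simplified stand-in, not the authors' implementation: it keeps the normalization and Fourier steps but replaces the graph construction and graph-cut clustering with a plain depth threshold, and all function names, parameters, and the toy labelling scheme are illustrative assumptions.

```python
import math
import cmath

def normalize(depth):
    """Scale read-depth bins to zero mean, unit variance (count normalization)."""
    n = len(depth)
    mu = sum(depth) / n
    sd = (sum((d - mu) ** 2 for d in depth) / n) ** 0.5 or 1.0
    return [(d - mu) / sd for d in depth]

def lowpass_dft(signal, keep=4):
    """Keep only the lowest `keep` DFT frequencies: a crude denoising step
    standing in for the paper's Fourier-domain conversion."""
    n = len(signal)
    spec = [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]
    for k in range(n):
        if min(k, n - k) >= keep:
            spec[k] = 0  # zero out high-frequency components
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def threshold_cluster(smooth, cut=0.0):
    """Toy replacement for the graph-cut clustering: label each bin OCR+ or
    OCR-. In cfDNA, nucleosome-free (open) regions tend to be depleted of
    protected fragments, so low normalized depth suggests OCR+."""
    return ['OCR+' if v < cut else 'OCR-' for v in smooth]

# Toy depth track with a coverage dip in the middle (a candidate OCR):
depth = [10] * 8 + [2] * 4 + [10] * 8
labels = threshold_cluster(lowpass_dft(normalize(depth)))
print(labels)
```

On this toy track the dip in coverage survives the low-pass step and is labelled OCR+; a faithful implementation would instead build a correlation graph over bins and solve the cut as a linear program, as the abstract describes.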

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 148
1230 Impact of Instrument Transformer Secondary Connections on Performance of Protection System: Experiences from Indian POWERGRID

Authors: Pankaj Kumar Jha, Mahendra Singh Hada, Brijendra Singh, Sandeep Yadav

Abstract:

Protective relays are commonly connected to the secondary windings of instrument transformers, i.e., current transformers (CTs) and/or capacitive voltage transformers (CVTs). The purpose of the CT and CVT is to provide galvanic isolation from high voltages and to reduce primary currents and voltages to nominal quantities recognized by the protective relays. Selecting the correct instrument transformers for an application is imperative: failing to do so may compromise the relay’s performance, as the output of the instrument transformer may no longer be an accurately scaled representation of the primary quantity. Having an accurately rated instrument transformer is of no use if these devices are not properly connected. The performance of a protective relay depends on its programmed settings and on the current and voltage inputs from the instrument transformer secondaries. This paper explains the fundamental concepts of connecting instrument transformers to protection relays and the effect of incorrect connections on the performance of protective relays. Multiple case studies of protection system mal-operations due to incorrect connections of instrument transformers are discussed in detail. Apart from connection issues, the paper also discusses the effect of multiple earthing of CT and CVT secondaries on the performance of the protection system. The case studies presented here help readers analyse such problems through real-world challenges in complex power system networks, and support the protection engineer in better analysis of disturbance records. CT and CVT connection errors can lead to undesired operation of protection systems. However, many of these operations can be avoided by adhering to industry standards and implementing tried-and-true field testing and commissioning practices. 
Understanding, through real-world case studies, the effect of a missing CVT neutral, multiple earthing of the CVT secondary, and multiple grounding of CT star points on the performance of the protection system will help the protection engineer better commission and maintain the protection system.
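As a minimal, hypothetical illustration of why connection polarity matters (this is not a description of any POWERGRID scheme, and the CT ratio and load values are invented): a differential element operates on the phasor sum of the CT secondary currents, which should cancel for a healthy through-load; a single reversed secondary makes the currents add instead, producing a spurious operate quantity.

```python
def differential_current(i1, i2):
    """Operate quantity of a simple differential element:
    magnitude of the sum of the two CT secondary currents."""
    return abs(i1 + i2)

ct_ratio = 1200   # hypothetical 1200:1 CT
load = 600.0      # primary amperes flowing through the protected zone

# Correct polarity: the two secondaries carry equal and opposite currents.
i_in = load / ct_ratio     # 0.5 A secondary, current entering the zone
i_out = -load / ct_ratio   # -0.5 A secondary, current leaving the zone
print(differential_current(i_in, i_out))   # 0.0 A: relay restrains

# One secondary wired with reversed polarity: the currents now add,
# producing a 1.0 A operate current on a perfectly healthy load.
print(differential_current(i_in, -i_out))  # 1.0 A: possible false trip
```

Real schemes add restraint (bias) characteristics and ratio-matching, but the sketch captures why a polarity error alone can defeat the cancellation the scheme relies on.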

Keywords: bus reactor, current transformer, capacitive voltage transformer, distance protection, differential protection, directional earth fault, disturbance report, instrument transformer, ICT, REF protection, shunt reactor, voltage selection relay, VT fuse failure

Procedia PDF Downloads 81
1229 Impact of Integrated Watershed Management Programme Based on Four Waters Concept: A Case Study of Sali Village, Rajasthan State of India

Authors: Garima Sharma, R. N. Sharma

Abstract:

An integrated watershed management programme based on the 'Four Waters Concept' was implemented in Sali village, Jaipur District, Rajasthan State of India. The village lies at latitude 26.7234486 North and longitude 75.023876 East. The 'Four Waters Concept' is evolved by integrating the four waters, viz. rain water, soil moisture, ground water and surface water. The methodology involves various water harvesting techniques to prevent the runoff of water through treatment of the catchment, proper utilization of available water harvesting structures, renovation of non-functional water harvesting structures and creation of new water harvesting structures. The case study included a questionnaire survey of farmers and continuous study of the village for two years. The total project area is 6153 ha, and the project cost is Rs. 92.25 million. The sanctioned area of the Sali micro-watershed is 2228 ha, with an outlay of Rs. 10.52 million. Watershed treatment activities such as water absorption trenches, continuous contour trenches, field bunding and check dams were undertaken on agricultural lands for soil and water conservation. These measures have helped prevent runoff and have increased the perennial availability of water in wells. According to the survey, the water level in open wells in the area has risen by approximately 5 metres since the introduction of water harvesting structures. The continuous availability of water in wells has increased the area under irrigation and helped in crop diversification. Watershed management activities have changed cropping patterns and crop productivity, and helped transform 567 ha of culturable waste land into arable land in the village. The farmers of the village have earned additional income from the increased crop production. The programme also ensured the availability of water during peak summer for the day-to-day activities of villagers. 
The outcomes indicate a positive impact of watershed management practices on both the water resource potential and the crop production of the area, suggesting that persistent efforts in this direction may lead to sustainability of the watershed.

Keywords: four water concept, groundwater potential, irrigation potential, watershed management

Procedia PDF Downloads 355
1228 Application of Ground-Penetrating Radar in Environmental Hazards

Authors: Kambiz Teimour Najad

Abstract:

The basic methodology of GPR involves the use of a transmitting antenna to send electromagnetic waves into the subsurface, which then bounce back to the surface and are detected by a receiving antenna. The transmitter and receiver antennas are typically placed on the ground surface and moved across the area of interest to create a profile of the subsurface. The GPR system consists of a control unit that powers the antennas and records the data, as well as a display unit that shows the results of the survey. The control unit sends a pulse of electromagnetic energy into the ground, which propagates through the soil or rock until it encounters a change in material or structure. When the electromagnetic wave encounters a buried object or structure, some of the energy is reflected back to the surface and detected by the receiving antenna. The GPR data is then processed using specialized software that analyzes the amplitude and travel time of the reflected waves. By interpreting the data, GPR can provide information on the depth, location, and nature of subsurface features and structures. GPR has several advantages over other geophysical survey methods, including its ability to provide high-resolution images of the subsurface and its non-invasive nature, which minimizes disruption to the site. However, the effectiveness of GPR depends on several factors, including the type of soil or rock, the depth of the features being investigated, and the frequency of the electromagnetic waves used. In environmental hazard assessments, GPR can be used to detect buried structures, such as underground storage tanks, pipelines, or utilities, which may pose a risk of contamination to the surrounding soil or groundwater. GPR can also be used to assess soil stability by identifying areas of subsurface voids or sinkholes, which can lead to the collapse of the surface. 
Additionally, GPR can be used to map the extent and movement of groundwater contamination, which is critical in designing effective remediation strategies. In summary, the GPR methodology in environmental hazard assessments involves the use of electromagnetic waves to create high-resolution images of the subsurface, which are then analyzed to provide information on the depth, location, and nature of subsurface features and structures. This information is critical in identifying and mitigating environmental hazards, and the non-invasive nature of GPR makes it a valuable tool in this field.
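The travel-time interpretation described above can be illustrated numerically. This is a hedged sketch under textbook assumptions (a uniform, low-loss medium whose velocity is v = c/√εr), not from the abstract itself; the function name and example values are illustrative.

```python
import math

C = 3.0e8  # speed of light in vacuum, m/s

def reflector_depth(two_way_time_ns, rel_permittivity):
    """Estimate the depth of a GPR reflector from the two-way travel time.

    The wave velocity in the medium is v = c / sqrt(eps_r); the pulse
    travels down and back, so depth = v * t / 2.
    """
    v = C / math.sqrt(rel_permittivity)   # propagation velocity in medium
    t = two_way_time_ns * 1e-9            # convert ns to s
    return v * t / 2.0

# e.g. a reflection arriving at 40 ns in dry sand (eps_r ~ 4):
# v = 1.5e8 m/s, so depth = 1.5e8 * 40e-9 / 2 = 3.0 m
print(reflector_depth(40, 4))  # 3.0
```

The same relation explains why permittivity (and hence soil moisture) must be estimated before depths can be assigned to features such as buried tanks or voids.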

Keywords: GPR, hazard, landslide, rock fall, contamination

Procedia PDF Downloads 80
1227 The Role of Dialogue in Shared Leadership and Team Innovative Behavior Relationship

Authors: Ander Pomposo

Abstract:

Purpose: The aim of this study was to investigate the impact of dialogue on the relationship between shared leadership and innovative behavior, and the importance of dialogue in innovation. The study contributes to the literature by giving theorists and researchers a better understanding of how to move forward in the study of moderator variables in the relationship between shared leadership and team outcomes such as innovation. Methodology: A systematic review of the literature, originally adopted from the medical sciences but also used in management and leadership studies, was conducted to synthesize research in a systematic, transparent and reproducible manner. A final sample of 48 empirical studies was synthesized. Findings: Shared leadership offers a better solution to team management challenges and goes beyond the classical, hierarchical, or vertical leadership models based on the individual-leader approach. One of the outcomes that emerges from shared leadership is team innovative behavior. To intensify the relationship between shared leadership and team innovative behavior, and to understand when it is most effective, the moderating effects of other variables in this relationship should be examined. The synthesis of the empirical studies revealed that dialogue is a moderator variable affecting the relationship between shared leadership and team innovative behavior when leadership is understood as a relational process. Dialogue is an activity between at least two speech partners trying to fulfill a collective goal, and a way of remaining open to people and ideas through interaction. Dialogue is productive when team members engage relationally with one another. When this happens, participants are more likely to take responsibility for the tasks in which they are involved and for the relationships they have with others. 
In this relational engagement, participants are likely to establish high-quality connections with a high degree of generativity. This study suggests that organizations should facilitate dialogue among team members in shared leadership, which has a positive impact on innovation and offers a more adaptive framework for the leadership needed in teams working on complex tasks. These results reveal the need for more research on the role that dialogue plays in contributing to important organizational outcomes such as innovation. Case studies describing both best practices and obstacles of dialogue in team innovative behavior are needed to gain more detailed insight into the field. It will be interesting to see how all these fields of research evolve and how dialogue practices are implemented in organizations that use team-based structures to deal with uncertainty, fast-changing environments, globalization and increasingly complex work.

Keywords: dialogue, innovation, leadership, shared leadership, team innovative behavior

Procedia PDF Downloads 179
1226 Experimental Proof of Concept for Piezoelectric Flow Harvesting for In-Pipe Metering Systems

Authors: Sherif Keddis, Rafik Mitry, Norbert Schwesinger

Abstract:

Intelligent networking of devices has rapidly been gaining importance over the past years, and with recent advances in the fields of microcontrollers, integrated circuits and wireless communication, low-power applications have emerged, enabling this trend even more. Connected devices provide a much larger database, thus enabling highly intelligent and accurate systems. Ensuring safe drinking water is one of the fields that require constant monitoring and can benefit from increased accuracy. Monitoring is mainly achieved either through complex measures, such as collecting samples from the points of use, or through metering systems typically distant from the points of use, which deliver less accurate assessments of the quality of water. Constant metering near the points of use is complicated by their inaccessibility, e.g. buried water pipes or locked spaces, which makes system maintenance extremely difficult and often unviable. The research presented here attempts to overcome this challenge by supplying such systems with enough energy through a flow harvester inside the pipe, thus eliminating maintenance requirements in terms of battery replacement or containment of leakage resulting from wiring such systems. The proposed flow harvester exploits the piezoelectric properties of polyvinylidene difluoride (PVDF) films to convert turbulence-induced oscillations into electrical energy. It is intended to be used in standard water pipes with diameters between 0.5 and 1 inch. The working principle of the harvester uses a ring-shaped bluff body inside the pipe to induce pressure fluctuations. Additionally, the bluff body houses electronic components such as storage, circuitry and an RF unit. Placing the piezoelectric films downstream of the bluff body causes them to oscillate, generating electrical charge. The PVDF film is placed as a multilayered wrap fixed to the pipe wall, leaving the top part to oscillate freely in the flow. 
The wrap, which allows for a larger active area, consists of two layers of 30 µm thick and 12 mm wide PVDF layered alternately with two centered 6 µm thick and 8 mm wide aluminum foil electrodes. The length of the layers depends on the number of windings and is part of the investigation. The harvester is sealed against liquid penetration by wrapping it in a ring-shaped LDPE film and welding the open ends. The PVDF wraps are fabricated by hand. After validating the working principle in a wind tunnel, experiments were conducted in water, placing the harvester inside a 1 inch pipe at a water velocity of 0.74 m/s. To find a suitable placement of the wrap inside the pipe, two forms of fixation were compared with regard to their power output. Further investigations were made into the number of windings required for efficient transduction. The best results were achieved with a wrap of 3 windings of the active layers, which delivers a constant power output of 0.53 µW into a 2.3 MΩ load at an effective voltage of 1.1 V. Considering the extremely low power requirements of sensor applications, these initial results are promising. For further investigation and optimization, machine designs are currently being developed to automate fabrication and tighten the tolerances of the prototypes.
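The reported electrical figures are mutually consistent, which can be checked with the basic relation P = V²/R. The sketch below assumes the 2.3 MΩ load is purely resistive and that the 1.1 V figure is an effective (RMS) value; the function name is illustrative.

```python
def harvested_power_uW(v_eff, load_ohms):
    """Average power dissipated in a resistive load, P = V^2 / R,
    returned in microwatts."""
    return v_eff ** 2 / load_ohms * 1e6

# Reported operating point: 1.1 V effective across a 2.3 MOhm load.
p = harvested_power_uW(1.1, 2.3e6)
print(round(p, 2))  # 0.53, matching the abstract's 0.53 microwatts
```

Power at this scale (sub-microwatt) is indeed within reach of duty-cycled low-power sensor nodes, which supports the abstract's closing assessment.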

Keywords: maintenance-free sensors, measurements at point of use, piezoelectric flow harvesting, universal micro generator, wireless metering systems

Procedia PDF Downloads 192