Search results for: T. Pedersen
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11

11 Xiao Qian’s Chinese-To-English Self-Translation in the 1940s

Authors: Xiangyu Yang

Abstract:

Xiao Qian (1910-1999) was a prolific literary translator between Chinese and English in both directions and an influential commentator on Chinese translation practices for nearly 70 years (1931-1998). During his stay in Britain from 1939 to 1946, Xiao self-translated and published a series of short stories, essays, and feature articles. Drawing on Pedersen's theoretical framework, the paper finds that Xiao flexibly adopted seven translation strategies (i.e., phonemic retention, specification, direct translation, generalization, substitution, omission, and official equivalent) to render expressions specific to Chinese culture, striving to balance adequacy and acceptability under the historical conditions of the wide gap between China and the West in the early twentieth century. The study also finds that Xiao's translation strategies were strongly shaped by his own translational purpose as well as by the literary systems, ideologies, and patronage in China and Britain in the 1940s.

Keywords: self-translation, extralinguistic cultural reference, Xiao Qian, Pedersen

Procedia PDF Downloads 66
10 Analysis of Temporal Factors Influencing Minimum Dwell Time Distributions

Authors: T. Pedersen, A. Lindfeldt

Abstract:

The minimum dwell time is an important part of railway timetable planning. Because of its stochastic behaviour, the minimum dwell time should be taken into account when creating resilient timetables. While there has been significant focus on how to determine and estimate dwell times, to our knowledge little research has been carried out on their temporal and running-direction variations. In this paper, we examine how the minimum dwell time varies with temporal factors such as the time of day, the day of the week, and the time of the year, and how it is affected by running direction and station type. The minimum dwell time is estimated from track occupation data. A method is proposed to ensure that only minimum dwell times, and not planned dwell times, are extracted from the track occupation data. The results show that, at an aggregated level, the average minimum dwell times in both running directions at a station are similar. When temporal factors are considered, however, there are significant variations. The minimum dwell time varies throughout the day, with peak hours having the longest dwell times. Minimum dwell times are also influenced by the day of the week; in particular, weekends have lower minimum dwell times than most other days. The findings show that timetable planning can be significantly improved by taking minimum dwell time variations into account.
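The core of the estimation step described above can be sketched as follows. This is an illustrative outline only, not the authors' method: the record timestamps, the planned-dwell threshold, and the filtering rule are all assumptions introduced for the example.

```python
from datetime import datetime
from statistics import mean

# Hypothetical track-occupation records for one station track:
# (arrival, departure) timestamps. Dwell time = departure - arrival.
records = [
    ("2023-03-06 07:42:10", "2023-03-06 07:42:55"),
    ("2023-03-06 08:01:05", "2023-03-06 08:02:00"),
    ("2023-03-06 13:15:30", "2023-03-06 13:16:05"),
]

# Assumed planned dwell; in practice this comes from the timetable.
PLANNED_DWELL_S = 60

FMT = "%Y-%m-%d %H:%M:%S"

def dwell_seconds(arr, dep):
    return (datetime.strptime(dep, FMT) - datetime.strptime(arr, FMT)).total_seconds()

# Keep only dwells below the planned dwell: these trains departed before
# their scheduled slot, so the observed dwell approximates the *minimum*
# dwell rather than a planned hold. Group by hour of day to expose the
# temporal variation discussed in the abstract.
by_hour = {}
for arr, dep in records:
    d = dwell_seconds(arr, dep)
    if d < PLANNED_DWELL_S:
        hour = datetime.strptime(arr, FMT).hour
        by_hour.setdefault(hour, []).append(d)

for hour, dwells in sorted(by_hour.items()):
    print(hour, round(mean(dwells), 1))
```

A real analysis would add the day-of-week and running-direction groupings and fit a distribution per group rather than report the mean.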

Keywords: minimum dwell time, operations quality, timetable planning, track occupation data

Procedia PDF Downloads 157
9 Investigation on the Structure of Temperature-Responsive N-isopropylacrylamide Microgels Containing a New Hydrophobic Crosslinker

Authors: G. Roshan Deen, J. S. Pedersen

Abstract:

Temperature-responsive poly(N-isopropylacrylamide) (PNIPAM) microgels crosslinked with a new hydrophobic chemical crosslinker were prepared by surfactant-mediated precipitation emulsion polymerization. The temperature-responsive properties of the microgels and the influence of the crosslinker on the swelling behaviour were studied systematically by light scattering and small-angle X-ray scattering (SAXS). The radius of gyration (Rg) and the hydrodynamic radius (Rh) of the microgels decreased with increasing temperature owing to the volume phase transition from a swollen to a collapsed state. The ratio Rg/Rh below the transition temperature was lower than that of hard spheres because of the lower crosslinking density of the microgels. The SAXS data were analysed with a model in which the microgels were treated as core-shell particles with a graded interface. At intermediate temperatures, the model included a central core and a more diffuse outer layer describing pendant polymer chains with a low crosslinking density. In the fully swollen state, the microgels were modelled as a single component with a broad graded surface; in the collapsed state, as homogeneous and relatively compact particles. The polymer volume fraction inside the microgels was also derived from the model and was found to increase with temperature as the microgels collapsed into compact particles. The polymer volume fraction in the core in the collapsed state was about 60%, higher than that of similar microgels crosslinked with hydrophilic, flexible crosslinkers.
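The Rg/Rh comparison used above has a simple quantitative basis: for a homogeneous hard sphere, Rg/Rh = sqrt(3/5) ≈ 0.775, and ratios below this indicate mass concentrated toward the centre with a diffuse corona. A minimal sketch, with purely illustrative radii (not values from the paper):

```python
import math

# Homogeneous hard sphere: Rg/Rh = sqrt(3/5) ~ 0.775. Lightly crosslinked
# microgels with a diffuse outer layer fall below this value.
HARD_SPHERE_RATIO = math.sqrt(3 / 5)

def shape_hint(rg_nm, rh_nm):
    """Classify a particle by its Rg/Rh ratio (illustrative heuristic)."""
    ratio = rg_nm / rh_nm
    if ratio < HARD_SPHERE_RATIO:
        return ratio, "diffuse shell / core-shell-like"
    return ratio, "compact, hard-sphere-like"

print(shape_hint(55.0, 80.0))  # swollen microgel: ratio = 0.6875, diffuse
print(shape_hint(39.0, 50.0))  # collapsed microgel: ratio = 0.78, compact
```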

Keywords: microgels, SAXS, hydrophobic crosslinker, light scattering

Procedia PDF Downloads 398
8 Effect of CYP2B6 c.516G>T and c.983T>C Single Nucleotide Polymorphisms on Plasma Nevirapine Levels in Zimbabwean HIV/AIDS Patients

Authors: Doreen Duri, Danai Zhou, Babil Stray-Pedersen, Collet Dandara

Abstract:

Given the high prevalence of HIV/AIDS in sub-Saharan Africa and the elusive search for a cure, understanding the pharmacogenetics of currently used drugs is critical in populations from the most affected regions. Compared to Asian and Caucasian populations, African population groups are more genetically diverse, making it difficult to extrapolate findings from one ethnic group to another. This study aimed to investigate the effect of the CYP2B6 c.516G>T and c.983T>C single nucleotide polymorphisms on plasma nevirapine levels in HIV-infected adult Zimbabwean patients. In a cross-sectional study, patients on nevirapine-containing HAART who had reached steady state (more than six weeks on treatment) were recruited. After patients provided consent, blood samples were collected and used to extract DNA for genetic analysis and to measure plasma nevirapine levels. Genotyping of the two single nucleotide polymorphisms, CYP2B6 c.516G>T and c.983T>C, was carried out using PCR-RFLP and SNaPshot, while LC-MS/MS was used to measure nevirapine concentration. CYP2B6 c.516G>T and c.983T>C significantly predicted plasma nevirapine concentration, with the c.516T and c.983T alleles associated with elevated plasma nevirapine concentrations. The variant allele frequencies observed in this group differed significantly from those reported in some African, Caucasian, and Asian populations. We conclude that the pharmacogenetics of nevirapine can be used to identify patients likely to develop nevirapine-associated side effects, as well as those likely to have plasma concentrations too low for viral suppression.
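The kind of association test described above is, at its core, a regression of drug concentration on variant-allele counts. A minimal sketch with invented numbers (none of these concentrations or effect sizes come from the study): each SNP is coded 0/1/2 for the count of the variant allele, and a positive coefficient means the variant allele is associated with higher plasma levels, as reported here for c.516T and c.983T.

```python
import numpy as np

# Design matrix columns: intercept, c.516G>T allele count, c.983T>C allele count
X = np.array([
    [1, 0, 0], [1, 1, 0], [1, 2, 0],
    [1, 0, 1], [1, 1, 1], [1, 2, 1],
], dtype=float)

# Hypothetical steady-state nevirapine concentrations (ug/mL) per genotype
y = np.array([4.0, 5.1, 6.3, 5.8, 6.9, 8.1])

# Ordinary least squares fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print([round(b, 2) for b in beta])  # [intercept, effect per 516T allele, effect per 983T allele]
```

A real analysis would adjust for covariates (age, weight, adherence) and test coefficient significance, but the coding of genotypes as allele counts is the standard additive model.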

Keywords: allele frequencies, genetically diverse, nevirapine, single nucleotide polymorphism

Procedia PDF Downloads 420
7 Embracing Complex Femininity: A Comparative Analysis of the Representation of Female Sexuality in John Webster and William Faulkner

Authors: Elisabeth Pedersen

Abstract:

Representations and interpretations of womanhood and female sexuality raise various questions about gender norms and their implications, which are pervasive and repetitive across societies. Literature is one form of media that provides the space to represent and interpret women, their bodies, and their sexualities, and it also reveals the power of language as an affective and affected force. As literature allows an opportunity to explore history and the representations of gender, power dynamics, and sexuality through historical contexts, this paper uses engaged theory in a comparative analysis of two works of literature: The Duchess of Malfi by John Webster and The Sound and the Fury by William Faulkner. These works span space and time, supporting the theory that repetitive tropes of womanhood and female sexuality in literature both influence and are influenced by the hegemonic social order throughout history. The paper analyzes how the dichotomy of male chivalry and honor versus female purity is disputed and questioned when a woman is portrayed as sexually emancipated, and explores the historical contexts in which these works were written to examine how socioeconomic events challenged the hegemonic social order. The analysis considers how stereotypical ideals of womanhood and manhood have damaging implications for women, as the structure of society grants more privilege and power to men than to women, creating a double standard in regard to sexuality, sexual expression, and the right to sexual desire. This comparative analysis reveals how strict gender norms are pervasive and have negative consequences. Re-reading stories through a critical lens, however, can provide an opportunity to challenge the repetitive tropes of female sexuality and thus lead to an embrace of the complexity of female sexuality and expression.

Keywords: femininity, literature, representation, sexuality

Procedia PDF Downloads 316
6 Workers’ Prevention from Occupational Chemical Exposures during Container Handling

Authors: Balázs Ádám, Randi Nørgaard Fløe Pedersen, Jørgen Riis Jepsen

Abstract:

Volatile chemicals that accumulate in and are released from freight containers constitute a significant health risk. Fumigation to prevent the spread of pests and off-gassing of freight are the main sources of hazardous chemicals. The aim of our study was to investigate the regulation and practice of container handling, with a focus on the preventive measures applied against chemical exposures in Denmark. A comprehensive systematic search of the scientific literature and the organizational domains of international and Danish regulatory bodies was performed to identify regulations related to safe work with transport containers. The practice of container work was investigated in a series of semi-structured interviews with managers and health and safety representatives of organizations that handle transport containers. Although several international and national regulations and local safety instructions relate to container handling, the information they provide is in many respects not specific or up to date enough to support safe practice. The interviewees estimate that containers with chemical exposure are frequent and consider that these exposures can potentially damage health, although recognizable health effects are rare. Knowledge of the chemicals involved is limited, and most of them cannot be measured with available devices. Typical preventive measures are passive ventilation and personal protective equipment, but their use is not consistent and may not provide adequate protection. Hazardous chemicals are frequently present in transport containers; however, managers, workers, and even occupational health professionals have limited knowledge of the problem. Detailed risk assessment and specific instructions on risk management are needed to provide safe conditions for work with containers.

Keywords: chemical exposure, fumigation, occupational health and safety regulation, transport container

Procedia PDF Downloads 352
5 Speckle-Based Phase Contrast Micro-Computed Tomography with Neural Network Reconstruction

Authors: Y. Zheng, M. Busi, A. F. Pedersen, M. A. Beltran, C. Gundlach

Abstract:

X-ray phase contrast imaging has been shown to yield better contrast than conventional attenuation-based X-ray imaging, especially for soft tissues in the medical imaging energy range, which can potentially lead to better diagnoses for patients. However, phase contrast imaging has mainly been performed with highly brilliant synchrotron radiation, as it requires highly coherent X-rays. Many research teams have demonstrated that it is also feasible with a laboratory source, bringing it one step closer to clinical use. Nevertheless, the need for fine gratings and high-precision stepping motors when using a laboratory source has prevented it from being widely adopted. Recently, a random phase object has been proposed as an analyzer; this method requires a far less demanding experimental setup. However, previous studies used a particular X-ray source (a liquid-metal-jet micro-focus source) or high-precision motors for stepping. We have been working on a much simpler setup requiring only a small modification of a commercial bench-top micro-CT (computed tomography) scanner: a piece of sandpaper is introduced as the phase analyzer in front of the X-ray source. This approach needs suitable algorithms for speckle tracking and 3D reconstruction. The precision and sensitivity of the speckle tracking algorithm determine the resolution of the system, while the 3D reconstruction algorithm affects the minimum number of projections required, and thus the temporal resolution. As phase contrast imaging methods usually require much longer exposure times than traditional absorption-based X-ray imaging, a dynamic phase contrast micro-CT with high temporal resolution is particularly challenging. Different reconstruction methods, including neural-network-based techniques, will be evaluated in this project to increase the temporal resolution of the phase contrast micro-CT.
A Monte Carlo ray-tracing simulation (McXtrace) was used to generate a large dataset to train the neural network, addressing the issue that neural networks require large amounts of training data to produce high-quality reconstructions.
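The core of a speckle-tracking step can be sketched as follows. This is an assumed, minimal formulation, not the authors' algorithm: the local shift of a speckle window between a reference image (no sample) and a sample image is found at the peak of their cross-correlation, and the resulting shift map is proportional to the refraction angle and hence to the phase gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
ref = rng.random((64, 64))             # synthetic speckle reference image
shift = (3, 5)                         # known displacement to recover
sample = np.roll(ref, shift, axis=(0, 1))  # "sample" image: shifted speckle

# FFT-based circular cross-correlation; the peak sits at the displacement.
corr = np.fft.ifft2(np.fft.fft2(sample) * np.conj(np.fft.fft2(ref))).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print(dy, dx)  # recovers the applied shift: 3 5
```

Real speckle data would use small overlapping windows with subpixel peak interpolation, and the circular correlation would be replaced by a windowed (zero-padded) one.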

Keywords: micro-ct, neural networks, reconstruction, speckle-based x-ray phase contrast

Procedia PDF Downloads 218
4 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging

Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen

Abstract:

Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, assessing these parameters requires human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range 900-1700 nm were used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests for which the protein value was determined by the FOSS Infratec NOVA, the industry gold standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven wheat varieties and a single rye variety. The task for the first dataset is protein regression; for the second, variety classification. Deep convolutional neural networks (CNNs) have the potential to exploit spatio-spectral correlations within a hyperspectral image to estimate qualitative and quantitative parameters simultaneously. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing required by classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures using 2D and 3D convolutions is conducted, and the results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested, including centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to form the foundation of an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
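Two of the chemometric preprocessing steps named above can be sketched on a synthetic spectrum (the band shape and noise level are illustrative, not from the datasets): standard normal variate (SNV) removes multiplicative scatter effects by standardising each spectrum individually, and a Savitzky-Golay filter smooths noise while preserving peak shape.

```python
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.linspace(900, 1700, 200)             # nm, matching the range above
spectrum = np.exp(-((wavelengths - 1200) / 80) ** 2)  # synthetic absorption band
spectrum += np.random.default_rng(1).normal(0, 0.02, spectrum.size)  # sensor noise

# SNV: centre and scale each spectrum by its own mean and standard deviation
snv = (spectrum - spectrum.mean()) / spectrum.std()

# Savitzky-Golay: 11-point window, 2nd-order polynomial fit per window
smooth = savgol_filter(snv, window_length=11, polyorder=2)
print(smooth.shape)  # (200,) - same length as the input spectrum
```

In a hyperspectral image, SNV is applied pixel-wise along the spectral axis, and SG filtering likewise runs along wavelengths, not spatial dimensions.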

Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques

Procedia PDF Downloads 65
3 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams

Authors: Mette Pedersen

Abstract:

This paper outlines the criteria that assessment specialists use when designing the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio recording) and a prompt, a statement of the topic phrased as an interrogative sentence. Owing to the richness of its source materials and the amount of time test takers are given to prepare and write their responses (55 minutes in total), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes furthest in eliciting test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to exist among the sources, and the thinking behind the wording of prompts. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The study concludes that test taker performance improves significantly when the sources express directly opposing viewpoints, and also when the interrogative prompt is phrased as a yes/no question. Finally, an analysis of the linguistic difficulty and complexity of the printed sources reveals that test taker performance does not decrease when the complexity of the article in the 'Persuasive Essay' increases. This text complexity analysis is performed with the help of the ETS TextEvaluator tool and the Complexity Scale for Information Texts (Scale), two tools that, in combination, provide a rubric and fully automated technology for evaluating nonfiction and informational texts in English translation.

Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing

Procedia PDF Downloads 105
2 Investigation of Cavitation in a Centrifugal Pump Using Synchronized Pump Head Measurements, Vibration Measurements and High-Speed Image Recording

Authors: Simon Caba, Raja Abou Ackl, Svend Rasmussen, Nicholas E. Pedersen

Abstract:

Directly monitoring cavitation in a pump during operation is challenging because of the lack of visual access needed to validate the presence of cavitation and its form of appearance. In this work, experimental investigations are carried out in an inline single-stage centrifugal pump with optical access, providing an opportunity to enhance the value of CFD tools and standard cavitation measurements. Experiments are conducted using two impellers running in the same volute at 3000 rpm and the same flow rate. One impeller is optimized for lower NPSH₃% by its blade design, whereas the other is manufactured using a standard casting method. Cavitation is detected by pump performance measurements, vibration measurements, and high-speed image recordings. The head drop and the pump casing vibration caused by cavitation are correlated with the visual appearance of the cavitation. The vibration data are recorded in the axial direction of the impeller using accelerometers sampling at 131 kHz. The frequency-domain data (up to 20 kHz), the time-domain data, and the root mean square values are analyzed. The high-speed recordings, focused on the impeller suction side, are taken at 10,240 fps to provide insight into the flow patterns and the cavitation behavior in the rotating impeller. The videos are synchronized with the vibration time signals by a trigger signal. A clear correlation between cloud collapses and abrupt peaks in the vibration signal is observed. The vibration peaks clearly indicate cavitation, especially at higher NPSHA values where the hydraulic performance is not yet affected. Below a certain NPSHA value, cavitation starts in the inlet bend of the pump; above this value, it occurs exclusively on the impeller blades.
The impeller optimized for NPSH₃% does show a lower NPSH₃% than the standard impeller, but its head drop starts at a higher NPSHA value and is more gradual. Instabilities in the head drop curve of the optimized impeller were observed, in addition to a higher vibration level. Furthermore, the cavitation clouds on the suction side appear more unsteady with the optimized impeller. The shape and location of the cavitation are compared to 3D fluid flow simulations, and the simulation results are in good agreement with the experimental investigations. In conclusion, these investigations attempt to give a more holistic view of the appearance of cavitation by comparing the head drop, vibration spectral data, vibration time signals, image recordings, and simulation results. The data indicate that a criterion for cavitation detection could be derived from the time-domain vibration measurements, which requires further investigation. Spectral data are usually used to analyze cavitation, but these investigations indicate that the time domain could be more appropriate for some applications.
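One plausible form of the time-domain criterion suggested above can be sketched as follows. This is illustrative only, not the authors' method: the synthetic signal, the injected burst, and the threshold rule are assumptions. A short-window RMS of the accelerometer signal is computed, and windows whose RMS exceeds a multiple of the signal's baseline are flagged as candidate cavitation-cloud collapse events.

```python
import numpy as np

FS = 131_000  # sample rate used in the study (131 kHz)
rng = np.random.default_rng(2)
signal = rng.normal(0, 1.0, FS)       # 1 s of synthetic baseline vibration
signal[65_000:65_200] += 12.0         # injected burst mimicking a cloud collapse

# Short-window RMS: reshape into non-overlapping windows of 1024 samples
win = 1024
n_win = signal.size // win
rms = np.sqrt((signal[: n_win * win] ** 2).reshape(n_win, win).mean(axis=1))

# Flag windows whose RMS exceeds 3x the median (robust baseline estimate)
threshold = 3 * np.median(rms)
events = np.nonzero(rms > threshold)[0]
print(events * win / FS)  # times (s) of flagged windows; the burst is near 0.5 s
```

In practice the window length, threshold multiplier, and baseline estimator would be tuned against the synchronized image recordings.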

Keywords: cavitation, centrifugal pump, head drop, high-speed image recordings, pump vibration

Procedia PDF Downloads 147
1 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game

Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin

Abstract:

Despite the frequently criticized disadvantages of traditional paper-and-pencil assessment, it remains the most frequently used method in our schools. Although such assessments provide acceptable measurement, they cannot capture all the aspects and the richness of learning and knowledge. Many assessments used in schools also decontextualize assessment from learning: they focus on a learner's standing on a particular topic but not on how student learning changes over time. For these reasons, many scholars advocate using simulations and games (S&G) as assessment tools with significant potential to overcome the problems of traditional methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method to test students' learning at a single time point. To investigate the potential of educational games as assessment and teaching tools, this study presents the implementation and validation of an automated embedded assessment (AEA) that can continuously monitor student learning in the game and assess performance without interrupting learning. The experiment was conducted in an undergraduate engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this study is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game, and to present the implementation steps. To address this question, the study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment measures the targeted latent constructs.
Finally, the scores of the assessment were compared with an external measure (a validated test of student learning in digital circuit design) to evaluate the convergent validity of the assessment. The results of the confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly on the latent factors in the model. In the second analysis, a multiple regression analysis was used to test whether the external measure significantly predicts students' performance in the game. The regression indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2, 96) = 27.42, p < .001). Students' posttest scores significantly predicted game performance (β = .60, p < .001). The statistical results show that the AEA can distinctly measure three major components of the digital circuit design course. This study aims to help researchers understand how to design an AEA and showcases an implementation, providing an example methodology for validating this type of assessment.
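The convergent-validity check described above reduces to a regression of game performance on the external measure, with R² as the share of variance explained. A toy illustration with invented scores (not the study's data):

```python
import numpy as np

# Hypothetical paired scores: external posttest vs. in-game AEA performance
posttest = np.array([55, 60, 65, 70, 75, 80, 85, 90], dtype=float)
game_score = np.array([40, 48, 50, 62, 60, 72, 75, 83], dtype=float)

# Ordinary least squares: game_score ~ intercept + posttest
X = np.column_stack([np.ones_like(posttest), posttest])
beta, *_ = np.linalg.lstsq(X, game_score, rcond=None)
pred = X @ beta

# R^2 = 1 - SS_res / SS_tot: proportion of variance the posttest explains
ss_res = ((game_score - pred) ** 2).sum()
ss_tot = ((game_score - game_score.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))
```

The study's actual model used two predictors; the single-predictor version above shows the computation that underlies the reported R² and F statistic.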

Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design

Procedia PDF Downloads 395