Search results for: English first additional language learners
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7069

1339 Chinese Travelers’ Outbound Intentions to Visit Short- and Long-Haul Destinations: The Impact of Cultural Distance

Authors: Lei Qin

Abstract:

Culture has long been recognized as a factor influencing travelers’ decisions, which explains why travelers in different countries make distinct choices. Cultural distance is a concept describing how much difference there is between travelers’ home culture and that of the destination, but research distinguishing short- and long-haul travel destinations is limited. This study addressed that gap by examining the impact of cultural distance on Chinese travelers’ intentions to visit short-haul and long-haul destinations, respectively. Six cultural distance measurements were used: five calculated from secondary databases (Kogut & Singh, developed Kogut & Singh, the Euclidean distance index (EDI), the World Values Survey index (WVS), and the social axioms measurement (SAM)) and perceived cultural distance (PCD) collected from a primary survey. Across the six measurements, cultural distance has opposite impacts on Chinese outbound travelers’ intentions for short-haul and long-haul travel. For short-haul travel, travelers’ intentions can be positively influenced by cultural distance; a possible reason is that travelers’ novelty-seeking satisfaction outweighs the strangeness experienced in overseas regions. For long-haul travel, travelers’ intentions can be negatively influenced by cultural distance; a possible explanation is travelers’ uncertainty, risk, and language concerns regarding farther destinations.
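
The Kogut & Singh index mentioned among the measurements is a standard composite: the mean, across cultural dimensions, of squared score differences normalized by each dimension’s cross-country variance. A minimal sketch follows; the dimension scores and variances below are hypothetical placeholders, not the study’s data.

```python
import numpy as np

def kogut_singh(home, dest, variances):
    """Kogut & Singh-style composite cultural distance: mean over
    dimensions of squared score differences, each normalized by that
    dimension's cross-country variance."""
    home, dest, variances = map(np.asarray, (home, dest, variances))
    return float(np.mean((dest - home) ** 2 / variances))

# Hypothetical Hofstede-style dimension scores (illustrative only).
china = [80, 20, 66, 30]          # e.g. power distance, individualism, ...
destination = [64, 20, 34, 64]    # a hypothetical short-haul destination
variances = [400, 500, 380, 450]  # assumed cross-country variance per dimension
print(round(kogut_singh(china, destination, variances), 3))  # → 1.476
```

A larger value indicates a greater composite cultural gap between the home country and the destination.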

Keywords: cultural distance, intention, outbound travel, short-long haul

Procedia PDF Downloads 199
1338 A Critical Examination of the Relationship between the Media and the Political Agenda in the Social Deviance Portrayal of Disabled People

Authors: Cara Williams

Abstract:

This paper considers the media’s role in formulating a dominant social deviance paradigm and medicalised portrayal of disabled people and examines how those representations of impairment reinforce the personal tragedy view that underpins the social value given to the category of disability. According to a materialist perspective, the personal tragedy medical model approach condemns disabled people to live an inferior 'life apart', socially excluded and prevented from living as fully participating citizens on an equal basis with non-disabled people. Commonly, disabled people are portrayed as persons who need to be cured in order to achieve a better 'quality of life'; otherwise, stories center on deviance, criminality, or scrounging. Media representations have consistently used negative language and images that reinforce the personal tragedy 'deficient' view of disability. The systematic misrepresentation within film, literature, TV, and other art forms has validated a process about what it means to be 'normal' and how 'difference' and 'identity' are interpreted. The impact of these stereotyped disabling images on disabled people is a barrier not experienced by many other oppressed minority groups. Applying a materialist analysis, this paper contends that the impact on audiences’ perceptions of impaired bodies and minds, and the harmful effects on disabled people, can be linked with agenda-setting theory: the relationship between the media and the political agenda.

Keywords: media, disabled people, political agenda, personal tragedy

Procedia PDF Downloads 138
1337 Causation and Criminal Responsibility

Authors: László Schmidt

Abstract:

“Post hoc ergo propter hoc” means after it, therefore because of it. In other words: if event Y followed event X, then event Y must have been caused by event X. The question of causation has long been a central theme in philosophical thought, and many different theories have been put forward. However, causality is an essentially contested concept (ECC), as it has no universally accepted definition and is used differently in everyday, scientific, and legal thinking. In the field of law, the question of causality arises mainly in the context of establishing legal liability: in criminal law and in the rules of civil law on liability for damages arising either from breach of contract or from tort. The study presents some philosophical theories of causality and examines how these theories correlate with legal causality. It is quite interesting when philosophical abstractions meet the pragmatic demands of jurisprudence. In Hungarian criminal judicial practice, the principle of equivalence of conditions is the generally accepted and applicable standard of causation, under which all necessary conditions are considered equivalent and thus each a cause. The idea is that without the trigger, the subsequent outcome would not have occurred; all the conditions that led to the subsequent outcome are equivalent. Where the trigger that led to the result is accompanied by an additional intervening cause, including an accidental one independent of the perpetrator, the causal link is not broken; at most, it becomes looser. The importance of the intervening causes in the outcome should be given due weight in the imposition of the sentence. According to court practice, if the offender’s conduct sets in motion the causal process that led to the result, his criminal liability is not excluded and the causal process is not interrupted even if other factors, such as the victim's illness, may have contributed to it.
The concausa does not break the chain of causation, i.e., the existence of a causal link establishes the criminal liability of the offender. Courts have also held that an act is a cause of the result if the act cannot be omitted without the result also being omitted. This essentially assumes a hypothetical elimination procedure: the act must be omitted in thought, and it must then be examined whether the result would still occur or would likewise be omitted. On the substantive side, the essential condition for establishing the offence is that the result must be demonstrably connected with the activity committed. The requirement that facts be assessed beyond reasonable doubt must also apply to the causal link: that is to say, uncertainty about the causal link between the conduct and the result of the offence precludes the perpetrator from being held liable for the result. Sometimes, however, the courts do not specify in the reasons for their judgments what standard of causation they apply, i.e., on what basis they establish the existence of (legal) causation.

Keywords: causation, Hungarian criminal law, responsibility, philosophy of law

Procedia PDF Downloads 32
1336 Neuron Efficiency in Fluid Dynamics and Prediction of Groundwater Reservoirs' Properties Using Pattern Recognition

Authors: J. K. Adedeji, S. T. Ijatuyi

Abstract:

The application of a neural network using pattern recognition to study fluid dynamics and predict the properties of groundwater reservoirs is presented in this research. Manual geophysical survey methods have proven inadequate in basement environments; hence, intelligent computing such as neural-network prediction is needed. A non-linear neural network with an XOR (exclusive OR) output of 8-bit configuration was used in this research to predict the nature of groundwater reservoirs and the fluid dynamics of a typical basement crystalline rock. The control variables are the apparent resistivity of the weathered layer (p1), the fractured layer (p2), and the depth (h), while the dependent variable is the flow parameter (F=λ). The algorithm used in training the neural network was back-propagation, coded in the C++ language with 300 epoch runs. The trained network was able to map out the flow channels and detect how they behave to form viable storage within the strata. The neural network model showed that an important variable, gr (gravitational resistance), can be deduced from the elevation and the apparent resistivity (pa). The model results from SPSS showed that the coefficients a, b, and c are statistically significant, with reduced standard error at 5%.
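
The abstract's exact network and data are not public, so as a rough illustration only, a minimal two-layer network trained by back-propagation on the XOR problem it alludes to might look like the following (the hidden-layer size, learning rate, and epoch count are assumptions, not the paper's settings):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(hidden=8, lr=0.5, epochs=20000, seed=0):
    """Toy 2-layer network trained with back-propagation on XOR."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0, 1, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)             # forward pass, hidden layer
        out = sigmoid(h @ W2 + b2)           # forward pass, output layer
        d_out = (out - y) * out * (1 - out)  # output delta (squared-error loss)
        d_h = (d_out @ W2.T) * h * (1 - h)   # back-propagated hidden delta
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);  b1 -= lr * d_h.sum(axis=0)
    return out.round().astype(int).ravel()

print(train_xor())  # should recover the XOR truth table outputs
```

With these assumed settings the network converges to the XOR mapping 0, 1, 1, 0 for the four input pairs.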

Keywords: gravitational resistance, neural network, non-linear, pattern recognition

Procedia PDF Downloads 209
1335 Reviewers’ Perception of the Studio Jury System: How They View Its Value in Architecture and Design Education

Authors: Diane M. Bender

Abstract:

In architecture and design education, students learn and understand their discipline through lecture courses and within studios. A studio is where the instructor works closely with students to help them understand design by doing design work. The final jury is the culmination of the studio learning experience. Its value and significance are rarely questioned. Students present their work before their peers, instructors, and invited reviewers, known as jurors. These jurors are recognized experts who add a breadth of feedback to students, mostly in the form of a verbal critique of the work. Since the design review or jury has been a common element of studio education for centuries, jurors themselves have been instructed in this format. Therefore, they understand its value from both a student and a juror perspective. To better understand how these reviewers see the value of a studio review, a survey was distributed to reviewers at a multi-disciplinary design school within the United States. Five design disciplines were involved in this case study: architecture, graphic design, industrial design, interior design, and landscape architecture. Respondents (n=108) provided written comments about their perceived value of the studio review system. The average respondent was male (64%), between 40-49 years of age, and had attained a master’s degree. Qualitative analysis with thematic coding revealed several themes. Reviewers view the final jury as important because it provides a variety of perspectives from unbiased external practitioners and prepares students for similar presentation challenges they will experience in professional practice. They also see it as a way to validate the assessment and evaluation of students by faculty. In addition, they see a personal benefit for themselves and their firm: the ability to network with fellow jurors, professors, and students (i.e., future colleagues).
Respondents also provided additional feedback about the jury system and studio education in general. Typical responses included a desire for earlier engagement with students; a better explanation from the instructor about the project parameters, rubrics/grading, and guidelines for juror involvement; a way to balance giving encouraging feedback versus overly critical comments; and providing training for jurors prior to reviews. While this study focused on the studio review, the findings are equally applicable to other disciplines. Suggestions will be provided on how to improve the preparation of guests in the learning process and how their interaction can positively influence student engagement.

Keywords: assessment, design, jury, studio

Procedia PDF Downloads 59
1334 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks

Authors: Naghmeh Moradpoor Sheykhkanloo

Abstract:

Structured Query Language Injection (SQLI) attack is a code injection technique in which malicious SQL statements are inserted into a given SQL database by simply using a web browser. Losing data, disclosing confidential information, or even changing the value of data are the severe damages that an SQLI attack can cause to a given database. The SQLI attack has also been rated as the number-one attack among the top ten web application threats by the Open Web Application Security Project (OWASP). OWASP is an open community dedicated to enabling organisations to conceive, develop, acquire, operate, and maintain applications that can be trusted. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLI attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, which generates thousands of malicious and benign URLs; a URL classifier, which 1) classifies each generated URL as either benign or malicious and 2) classifies the malicious URLs into different SQLI attack categories; and an NN model, which 1) detects whether a given URL is malicious or benign and 2) identifies the type of SQLI attack for each malicious URL. The model is first trained and then evaluated by employing thousands of benign and malicious URLs. The results of the experiments are presented in order to demonstrate the effectiveness of the proposed approach.
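
As an illustration of the URL-classification stage, the sketch below counts SQLI-suggestive tokens as crude features and trains a single-neuron logistic classifier on a few synthetic URLs. The paper's actual feature set, network architecture, and URL corpus are not described in the abstract, so everything here is an assumption.

```python
import numpy as np

# Assumed token list; real SQLI detectors use far richer features.
TOKENS = ["union", "select", "or 1=1", "--", "'", '"', ";", "drop"]

def features(url):
    """Crude per-URL features: counts of SQLI-suggestive tokens."""
    u = url.lower()
    return np.array([u.count(t) for t in TOKENS], dtype=float)

def train(urls, labels, lr=0.5, epochs=500):
    """Single-neuron (logistic) stand-in for the paper's NN classifier."""
    X = np.array([features(u) for u in urls])
    y = np.array(labels, dtype=float)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(malicious)
        w -= lr * X.T @ (p - y) / len(y)        # gradient of the log-loss
        b -= lr * (p - y).mean()
    return w, b

urls = ["http://shop.example/item?id=5",
        "http://shop.example/search?q=books",
        "http://shop.example/item?id=5' or 1=1 --",
        "http://shop.example/search?q=1 union select *"]
w, b = train(urls, [0, 0, 1, 1])

def is_malicious(url, w, b):
    return bool(features(url) @ w + b > 0)
```

After training on the four synthetic URLs, the classifier flags unseen URLs containing the same injection tokens while passing plain query strings.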

Keywords: neural networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection

Procedia PDF Downloads 463
1333 The Structure of Financial Regulation: The Regulators' Perspective

Authors: Mohamed Aljarallah, Mohamed Nurullah, George Saridakis

Abstract:

This paper investigates how structural change of the financial regulatory bodies affects financial supervision and how regulators can design such a structure taking into account the central bank, the conduct-of-business regulator, and the prudential regulator; it also considers the structure of the international regulatory bodies and the barriers encountered. Five questions are addressed: Should conduct-of-business and prudential regulation be separated? Should financial supervision and financial stability be separated? Should financial supervision sit under the central bank? To what extent should politicians intervene in changing the regulatory and supervisory structure? What should the regulatory and supervisory structure be when there is a financial conglomerate? A semi-structured interview design is applied. The research sample comprises financial regulators and supervisors from emerged and emerging countries. Moreover, participants must hold senior positions in their organisations, and they come from different authorities around the world. For instance, one of the participants comes from the Bank for International Settlements, others come from the European Central Bank, and another comes from the Hong Kong Monetary Authority. Such variety aims to fulfil the aims and objectives of the research and cover the research questions. The analysis process starts with transcription of the interviews, uses NVivo software for coding, and applies thematic analysis to generate the main themes. The major findings of the study are as follows. First, organisational structure changes quite frequently if mandates are not clear. Second, measuring structural change is difficult, which makes the whole process unclear.
Third, effective coordination and communication are what regulators look for when they change the structure, and that requires openness, trust, and incentives. In addition, issues that appear during a crisis tend to be the reason why the structure changes. The development of the market also sometimes causes a change in the regulatory structure, and some structural change occurs simply because of international trends, fashion, or other countries' experiences. Furthermore, when top management changes, the structure tends to change. The structure may also change due to political change, or because politicians try to show they are doing something. Finally, fear of being blamed can be a driver of structural change. In conclusion, this research provides insight from senior regulators and supervisors from fifty different countries to build a clear understanding of why the regulatory structure keeps changing from time to time, through a qualitative approach, namely semi-structured interviews.

Keywords: financial regulation bodies, financial regulatory structure, global financial regulation, financial crisis

Procedia PDF Downloads 137
1332 Bionaut™: A Minimally Invasive Microsurgical Platform to Treat Non-Communicating Hydrocephalus in Dandy-Walker Malformation

Authors: Suehyun Cho, Darrell Harrington, Florent Cros, Olin Palmer, John Caputo, Michael Kardosh, Eran Oren, William Loudon, Alex Kiselyov, Michael Shpigelmacher

Abstract:

The Dandy-Walker malformation (DWM) represents a clinical syndrome manifesting as a combination of posterior fossa cyst, hypoplasia of the cerebellar vermis, and obstructive hydrocephalus. Anatomic hallmarks include hypoplasia of the cerebellar vermis, enlargement of the posterior fossa, and cystic dilatation of the fourth ventricle. Current treatments of DWM, including shunting of the cerebral spinal fluid ventricular system and endoscopic third ventriculostomy (ETV), are frequently clinically insufficient, require additional surgical interventions, and carry risks of infections and neurological deficits. Bionaut Labs develops an alternative way to treat Dandy-Walker Malformation (DWM) associated with non-communicating hydrocephalus. We utilize our discreet microsurgical Bionaut™ particles that are controlled externally and remotely to perform safe, accurate, effective fenestration of the Dandy-Walker cyst, specifically in the posterior fossa of the brain, to directly normalize intracranial pressure. Bionaut™ allows for complex non-linear trajectories not feasible by any conventional surgical techniques. The microsurgical particle safely reaches targets in the lower occipital section of the brain. Bionaut™ offers a minimally invasive surgical alternative to highly involved posterior craniotomy or shunts via direct fenestration of the fourth ventricular cyst at the locus defined by the individual anatomy. Our approach offers significant advantages over the current standards of care in patients exhibiting anatomical challenge(s) as a manifestation of DWM, and therefore, is intended to replace conventional therapeutic strategies. Current progress, including platform optimization, Bionaut™ control, and real-time imaging and in vivo safety studies of the Bionauts™ in large animals, specifically the spine and the brain of ovine models, will be discussed.

Keywords: Bionaut™, cerebral spinal fluid, CSF, cyst, Dandy-Walker, fenestration, hydrocephalus, micro-robot

Procedia PDF Downloads 216
1331 Deep Injection Wells for Flood Prevention and Groundwater Management

Authors: Mohammad R. Jafari, Francois G. Bernardeau

Abstract:

With its arid climate, Qatar experiences low annual rainfall, intense storms, and high evaporation rates. However, the fast-paced rate of infrastructure development in the capital city of Doha has led to recurring instances of surface-water flooding as well as rising groundwater levels. The Public Works Authority (PWA/ASHGHAL) has implemented an approach to collect flood water and discharge it into a) positive gravity systems; b) an Emergency Flooding Area (EFA) for evaporation, infiltration, or storage off-site using tankers; and c) deep injection wells. As part of the flood prevention scheme, 21 deep injection wells have been constructed to discharge collected surface water and lower the groundwater table in Doha city. These injection wells serve as an alternative in localities that possess neither positive gravity systems nor downstream networks that can accommodate additional loads. The wells are 400 m deep and are constructed in complex karstic subsurface conditions with large cavities. The injection well system discharges collected groundwater and storm surface runoff into the permeable Umm Er Radhuma Formation, an aquifer present throughout the Persian Gulf region. The Umm Er Radhuma Formation contains saline water that is not used for water supply. The injection zone is separated by an impervious gypsum formation, which acts as a barrier between the upper and lower aquifers. State-of-the-art drilling, grouting, and geophysical techniques were implemented in the construction of the wells to ensure that the shallow aquifer would not be contaminated or impacted by injected water. Injection and pumping tests were performed to evaluate injection well functionality (injectability). The results of these tests indicated that the majority of the wells can accept an injection rate of 200 to 300 m3/h (56 to 83 l/s) under gravity, with an average value of 250 m3/h (70 l/s), compared to the design value of 50 l/s.
This paper presents the design and construction process and issues associated with these injection wells, the injection/pumping tests performed to determine the capacity and effectiveness of the wells, the detailed design of the collection and conveyance systems feeding the injection wells, and the operation and maintenance process. The system is now complete and in operation; injection wells have thus proven an effective option for flood control.
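
For reference, the flow-rate unit conversions quoted in the abstract can be checked directly, since 1 m3/h = 1000/3600 l/s:

```python
def m3h_to_ls(q):
    """Cubic metres per hour to litres per second (1 m3 = 1000 l, 1 h = 3600 s)."""
    return q * 1000.0 / 3600.0

def ls_to_m3h(q):
    """Litres per second to cubic metres per hour."""
    return q * 3600.0 / 1000.0

# Figures quoted in the abstract: 200-300 m3/h achieved vs a 50 l/s design value.
print(round(m3h_to_ls(200), 1), round(m3h_to_ls(300), 1))  # → 55.6 83.3
print(ls_to_m3h(50))                                       # → 180.0
```

So the measured injectability of 200-300 m3/h corresponds to roughly 56-83 l/s, comfortably above the 50 l/s (180 m3/h) design value.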

Keywords: deep injection well, flood prevention scheme, geophysical tests, pumping and injection tests, wellhead assembly

Procedia PDF Downloads 114
1330 Characterization of WNK2 Role on Glioma Cells Vesicular Traffic

Authors: Viviane A. O. Silva, Angela M. Costa, Glaucia N. M. Hajj, Ana Preto, Aline Tansini, Martin Roffé, Peter Jordan, Rui M. Reis

Abstract:

Autophagy is a recycling and degradative system suggested to be a major cell death pathway in cancer cells. The autophagy pathway is interconnected with the endocytosis pathways, sharing the same ultimate lysosomal destination. Lysosomes are crucial regulators of cell homeostasis, responsible for downregulating receptor signalling and turnover. It seems highly likely that derailed endocytosis makes major contributions to several hallmarks of cancer. WNK2, a member of the WNK (with-no-lysine [K]) subfamily of protein kinases, has been found to be downregulated by promoter hypermethylation and has been proposed to act as a specific tumour-suppressor gene in brain tumors. Although some contradictory studies have indicated WNK2 as an autophagy modulator, its role in cancer cell death is largely unknown. There is also growing evidence for additional roles of WNK kinases in vesicular traffic. Aim: To evaluate the role of WNK2 in autophagy and endocytosis in the glioma context. Methods: Wild-type (wt) A172 cells (WNK2 promoter-methylated) and A172 cells transfected either with an empty vector (Ev) or with a WNK2 expression vector were used to assess the cellular basal capacities to promote autophagy, through western blot and flow cytometry analysis. Additionally, we evaluated the effect of WNK2 on general endocytosis trafficking routes by immunofluorescence. Results: Re-expression of ectopic WNK2 did not alter the expression levels of the autophagy-related protein light chain 3 (LC3-II), nor did it alter mTOR signaling, when compared with Ev or wt A172 cells. However, restoration of WNK2 resulted in a marked increase (from 8% to 92.4%) in the formation of acidic vesicular organelles (AVOs). Moreover, our results also suggest that WNK2 expression delays the uptake and internalization of cholera toxin B and transferrin ligands. Conclusions: The restoration of WNK2 interferes with vesicular traffic during the endocytosis pathway and increases AVO formation.
These results also suggest a role for WNK2 in growth factor receptor turnover related to cell growth and homeostasis, and further implicate WNK2 silencing in the genesis of gliomas.

Keywords: autophagy, endocytosis, glioma, WNK2

Procedia PDF Downloads 368
1329 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Primary Distant Metastases Growth

Authors: Ella Tyuryumina, Alexey Neznanov

Abstract:

Finding algorithms to predict the growth of tumors has piqued the interest of researchers ever since the early days of cancer research. A number of studies have been carried out in an attempt to obtain reliable data on the natural history of breast cancer growth. Mathematical modeling can play a very important role in the prognosis of the tumor process in breast cancer. However, existing mathematical models describe primary tumor growth and metastases growth separately. Consequently, we propose a mathematical growth model for the primary tumor and primary metastases, which may help improve the predictive accuracy of breast cancer progression, using an original mathematical model referred to as CoM-IV and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and primary metastases; 2) developing an adequate and precise CoM-IV which reflects the relations between the primary tumor (PT) and metastases (MTS); 3) analyzing the scope of application of CoM-IV; 4) implementing the model as a software tool. CoM-IV is based on an exponential tumor growth model, consists of a system of determinate nonlinear and linear equations, and corresponds to the TNM classification. It allows the calculation of different growth periods of the primary tumor and primary metastases: 1) the ‘non-visible period’ for the primary tumor; 2) the ‘non-visible period’ for primary metastases; 3) the ‘visible period’ for primary metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; 3) is the first predictor to make a forecast using only current patient data, whereas the others rely on additional statistical data.
Thus, the CoM-IV model and predictive software: a) detect the different growth periods of the primary tumor and primary metastases; b) forecast the period in which primary metastases appear; c) have higher average prediction accuracy than the other tools; d) can improve forecasts of breast cancer survival and facilitate optimization of diagnostic tests. CoM-IV calculates the number of doublings for the ‘non-visible’ and ‘visible’ growth periods of primary metastases, and the tumor volume doubling time (in days) for those periods. CoM-IV makes it possible, for the first time, to predict the whole natural history of primary tumor and primary metastases growth at each stage (pT1, pT2, pT3, pT4) relying only on primary tumor sizes. In summary: a) CoM-IV correctly describes primary tumor and primary distant metastases growth of stage IV (T1-4N0-3M1) disease with (N1-3) or without (N0) regional lymph node metastases; b) it facilitates understanding of the period of appearance and manifestation of primary metastases.
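
Under the exponential growth assumption the model is built on, the number of volume doublings and the volume doubling time follow directly from tumor sizes: V(t) = V0 * 2**(t / DT). A minimal sketch with purely illustrative numbers (not taken from the paper):

```python
import math

def doublings(d0_mm, d1_mm):
    """Number of volume doublings for a roughly spherical tumor growing
    from diameter d0 to d1 (volume scales with the cube of diameter)."""
    return math.log2((d1_mm / d0_mm) ** 3)

def doubling_time_days(d0_mm, d1_mm, elapsed_days):
    """Tumor volume doubling time DT under the exponential model
    V(t) = V0 * 2**(t / DT)."""
    return elapsed_days / doublings(d0_mm, d1_mm)

# Illustrative example: a tumor growing from 5 mm to 20 mm in 600 days.
n = doublings(5, 20)
print(n, doubling_time_days(5, 20, 600))  # → 6.0 100.0
```

Here a fourfold increase in diameter is a 64-fold increase in volume, i.e. six volume doublings, giving a doubling time of 100 days for the assumed interval.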

Keywords: breast cancer, exponential growth model, mathematical modelling, primary metastases, primary tumor, survival

Procedia PDF Downloads 331
1328 Teachers' Handbook: A Key to Imparting Teaching in Multilingual Classrooms at Kalinga Institute of Social Sciences (KISS)

Authors: Sushree Sangita Mohanty

Abstract:

The pedagogic system used to work with indigenous groups, who have widely differing socio-economic, socio-cultural, and multilingual conditions as well as differing cognitive capabilities, makes the education situation complex. As a result, educating indigenous people becomes merely the dissemination of facts and information, while advancement in knowledge and possibility remains hidden. This gap creates complexities due to the language barrier, and teachers from a conventional background of teaching practices are unable to understand or connect with the students in the schools. This paper presents the research work of the Mother Tongue Based Multilingual Education (MTB-MLE) project, which has developed a creative pedagogic endeavor for the students of the Kalinga Institute of Social Sciences (KISS) to facilitate multilingual education (MLE) teaching. KISS is home to 25,000 indigenous children. The students enrolled are from 62 different indigenous communities who speak around 24 different languages with geographical articulation. The handbook's contents include concepts, understanding languages, similarities among languages, the need for the mother tongue in teaching and learning, skill development (listening, speaking, reading, writing), teacher activities for teaching in multilingual schools, the process of teaching, a training format for multilingual teaching, and procedures for basic data collection regarding multilingual schools and classroom handling.

Keywords: indigenous, multi-lingual, pedagogic, teachers, teaching practices

Procedia PDF Downloads 284
1327 A Systematic Review of Sensory Processing Patterns of Children with Autism Spectrum Disorders

Authors: Ala’a F. Jaber, Bara’ah A. Bsharat, Noor T. Ismael

Abstract:

Background: Sensory processing is a fundamental skill needed for the successful performance of daily living activities. These skills are impaired as part of the neurodevelopmental issues in children with autism spectrum disorder (ASD). This systematic review aimed to summarize the evidence on the differences in sensory processing and motor characteristics between children with ASD and children with typical development (TD). Method: This systematic review followed the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). The search terms included sensory-, motor-, condition-, and child-related terms or phrases. The electronic search utilized the Academic Search Ultimate, CINAHL Plus with Full Text, ERIC, MEDLINE, MEDLINE Complete, Psychology and Behavioral Sciences Collection, and SocINDEX with Full Text databases. The hand search involved looking for potential studies in the references of related studies. The inclusion criteria were: studies published in English between 2009 and 2020 that included children aged 3-18 years with a confirmed ASD diagnosis according to the DSM-V criteria, included a control group of typical children, included outcome measures related to sensory processing and/or motor functions, and were available in full text. The review of included studies followed the Oxford Centre for Evidence-Based Medicine guidelines, the Guidelines for Critical Review Form of Quantitative Studies, and the American Occupational Therapy Association's guidelines for conducting systematic reviews. Results: Eighty-eight full-text studies on the differences between children with ASD and children with TD in terms of sensory processing and motor characteristics were reviewed, of which eighteen articles were included in the quantitative synthesis. The results reveal that children with ASD had more extreme sensory processing patterns than children with TD, such as hyper-responsiveness and hypo-responsiveness to sensory stimuli.
Also, children with ASD had limited gross and fine motor abilities and lower strength, endurance, balance, eye-hand coordination, movement velocity, cadence, and dexterity, with a higher rate of gait abnormalities than children with TD. Conclusion: This systematic review provides preliminary evidence suggesting that motor functioning should be addressed in the evaluation and intervention for children with ASD, and that sensory processing should be supported among children with TD. Future research should investigate how performance of and engagement in daily life activities are affected by sensory processing and motor skills.

Keywords: sensory processing, occupational therapy, children, motor skills

Procedia PDF Downloads 125
1326 Immunocytochemical Stability of Antigens in Cytological Samples Stored in In-house Liquid-Based Medium

Authors: Anamarija Kuhar, Veronika Kloboves Prevodnik, Nataša Nolde, Ulrika Klopčič

Abstract:

The decision to perform immunocytochemistry (ICC) is usually made on the basis of the findings in Giemsa- and/or Papanicolaou-stained smears. More demanding diagnostic cases require additional cytological preparations. It is therefore convenient to suspend cytological samples in a liquid-based medium (LBM) that preserves antigen and morphological properties. However, how long these properties remain preserved in the medium is usually unknown. Eventually, cell morphology becomes impaired and altered, and antigen properties may be lost or become diffuse. In this study, the influence of the length of storage of cytological samples in an in-house liquid-based medium on antigen properties and cell morphology is evaluated. The question is how long cytological samples can be stored in this medium so that the results of immunocytochemical reactions remain reliable and can be safely used in routine cytopathological diagnostics. The stability of the 6 ICC markers most frequently used in everyday routine work was tested: Cytokeratin AE1/AE3, Calretinin, Epithelial specific antigen Ep-CAM (MOC-31), CD 45, Oestrogen receptor (ER), and Melanoma triple cocktail were tested on methanol-fixed cytospins prepared from fresh fine-needle aspiration biopsies, effusion samples, and disintegrated lymph nodes suspended in the in-house cell medium. Cytospins were prepared on the day of sampling as well as on the second, fourth, fifth, and eighth day after sample collection. Next, they were fixed in methanol and immunocytochemically stained. Finally, the percentage of positively stained cells, reaction intensity, counterstaining, and cell morphology were assessed using two assessment methods: the internal assessment and the UK NEQAS ICC scheme assessment.
Results show that the antigen properties of Cytokeratin AE1/AE3, MOC-31, CD45, ER, and Melanoma triple cocktail were preserved even after 8 days of storage in the in-house LBM, while the antigen properties of Calretinin remained unchanged for only 4 days. The key parameters for assessing antigen detection are the proportion of cells with a positive reaction and the intensity of staining. Well-preserved cell morphology is highly important for reliable interpretation of the ICC reaction. It would therefore be valuable to perform a similar analysis for other ICC markers to determine how long their antigen and morphological properties are preserved in LBM.

Keywords: cytology samples, cytospins, immunocytochemistry, liquid-based cytology

Procedia PDF Downloads 137
1325 Surge in U.S. Citizens' Expatriation: Testing Structural Equation Modeling to Explain the Underlying Policy Rationale

Authors: Marco Sewald

Abstract:

Compared to the past, the number of Americans renouncing U.S. citizenship has risen. Even though these numbers are small compared to immigration figures, U.S. citizen expatriations have historically been much lower, making the uptick worrisome. In addition, the published lists and numbers from the U.S. government seem incomplete, with many cases not counted. Different branches of the U.S. government report different numbers, and no one seems to know exactly how big the real number is, even though the IRS and the FBI both track and/or publish numbers of Americans who renounce. Since there is no single explanation, anecdotal evidence suggests this uptick is caused by global tax law and the increased compliance burdens imposed by U.S. lawmakers on U.S. citizens abroad. Within a research project, the question arose why a constantly growing number of U.S. citizens are expatriating; the answers are believed to help explain the underlying governmental policy rationale leading to such activities. Since it is impossible to locate former U.S. citizens to conduct a survey on their reasons, and the U.S. government does not comment on the reasons given within the expatriation process, the chosen methodology is Structural Equation Modeling (SEM), in a first step re-using existing surveys conducted by different researchers within the population of U.S. citizens residing abroad during recent years: surveys questioning the personal situation in the context of tax, compliance, citizenship, and the likelihood of repatriating to the U.S. In general, SEM allows: (1) representing, estimating, and validating a theoretical model with linear (unidirectional or not) relationships; (2) modeling causal relationships between multiple predictors (exogenous) and multiple dependent variables (endogenous); (3) including unobservable latent variables; (4) modeling measurement error: the degree to which observable variables describe latent variables.
Moreover, SEM is appealing since the results can be represented either by matrix equations or graphically. Results: the observed variables (items) of the construct are caused by various latent variables. The given surveys delivered a high correlation, and it is therefore impossible to identify the distinct effect of each indicator on the latent variable, which was one desired result. Since every SEM comprises two parts, (1) a measurement model (outer model) and (2) a structural model (inner model), it seems necessary to extend the given data by conducting additional research and surveys to validate the outer model and obtain the desired results.
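The measurement-model (outer-model) idea described above, in which several observed survey items reflect one unobservable latent variable plus measurement error, can be sketched with simulated data. The loadings, item count, and the use of scikit-learn's FactorAnalysis are illustrative assumptions for this sketch, not part of the study's methodology:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500
# One hypothetical latent variable (e.g. perceived compliance burden)
latent = rng.normal(size=(n, 1))
loadings = np.array([[0.9, 0.8, 0.7]])          # how strongly each item reflects the factor
noise = rng.normal(scale=0.5, size=(n, 3))      # measurement error on each item
items = latent @ loadings + noise               # three observed survey items

# Fit a one-factor measurement model and inspect the recovered loadings
fa = FactorAnalysis(n_components=1, random_state=0)
fa.fit(items)
est = np.abs(fa.components_).ravel()            # the sign of a factor is arbitrary
print(est)                                      # should roughly recover 0.9, 0.8, 0.7
```

If the items are highly collinear, as the abstract reports for the re-used surveys, the loadings become hard to separate, which is precisely why additional data collection is proposed.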

Keywords: expatriation of U.S. citizens, SEM, structural equation modeling, validating

Procedia PDF Downloads 216
1324 Introducing a Video-Based E-Learning Module to Improve Disaster Preparedness at a Tertiary Hospital in Oman

Authors: Ahmed Al Khamisi

Abstract:

The Disaster Preparedness Standard (DPS) is one of the elements evaluated by Accreditation Canada International (ACI). ACI emphasizes training and educating all staff, including service providers and senior leaders, on emergency and disaster preparedness at orientation and annually thereafter. A lack of awareness and a deficit of knowledge about the DPS have been noticed among healthcare providers in a tertiary hospital where ACI standards were implemented. Therefore, this paper aims to introduce a video-based e-learning (VB-EL) module that explains the hospital’s disaster plan in simple language and will be easily accessible to all healthcare providers through the hospital’s website. The healthcare disaster preparedness coordinator in the targeted hospital will be responsible for ensuring that the VB-EL is ready by 25 April 2019. The module will be developed based on the Kirkpatrick evaluation method. VB-EL combines different data forms such as images, motion, sound, and text in a complementary fashion, which suits the diverse learning styles and individual learning pace of healthcare providers. Moreover, the module can be adjusted more easily than other tools to control the information that healthcare providers receive. It will enable healthcare providers to stop, rewind, fast-forward, and replay content as many times as needed. Some anticipated limitations in the development of this module include the challenge of preparing VB-EL content and resistance from healthcare providers.

Keywords: Accreditation Canada International, Disaster Preparedness Standard, Kirkpatrick evaluation method, video-based e-learning

Procedia PDF Downloads 144
1323 Post-Traumatic Stress Disorder and Problem Alcohol Use in Women: Systematic Analysis

Authors: Neringa Bagdonaite

Abstract:

Study Aims: The current study aimed to systematically analyse research in the area of female post-traumatic stress disorder (PTSD) and alcohol abuse, to review these results critically on the basis of theoretical models, and to answer the following questions: (I) What is the reciprocal relationship between PTSD and problem alcohol use among females? (II) What are the moderating/mediating factors of this relationship? Methods: The computer bibliographic databases Ebsco, Scopus, Springer, Web of Science, Medline, and Science Direct were searched for scientific articles. The sample for the systematic analysis consisted of peer-reviewed articles, written in English, addressing mixed-gender and female PTSD and alcohol abuse, published from January 2012 to May 2017. Results: A total of 1011 articles related to the searched keywords were found in the databases, of which 29 met the selection criteria and were analysed. The results of longitudinal studies indicate that (I) various trauma, especially interpersonal trauma exposure in childhood, is linked with an increased risk of revictimization in later life and problem alcohol use; (II) revictimization in adolescence or adulthood, rather than victimization in childhood, has a greater impact on the onset and progression of problematic alcohol use in adulthood. Cross-sectional and epidemiological studies also support significant relationships between female PTSD and problem alcohol use. With regard to the negative impact of alcohol use on PTSD symptoms, results are still controversial: some evidence suggests that alcohol does not exacerbate symptoms of PTSD over time, while other findings indicate that problem alcohol use worsens PTSD symptoms and is linked to the chronicity of both disorders, especially among women with previous alcohol use problems.
Analysis of moderating/mediating factors of PTSD and problem alcohol use revealed that stronger motives/expectancies, specifically distress-coping motives for alcohol use, significantly moderate the relationship between PTSD and problematic alcohol use, whereas negative affective states mediate the relationship between symptoms of PTSD and alcohol use, but only among women who have already developed alcohol use problems. Conclusions: Interpersonal trauma experience, especially in childhood, and its reappearance later in life are linked with PTSD symptoms and problem drinking among women. Moreover, problem alcohol use can be both a cause and a consequence of trauma and PTSD, and if used for coping, it increases the likelihood of chronicity of both disorders. In order to treat both disorders effectively, it is worthwhile taking into account this dynamic interplay of women's PTSD symptoms and problem drinking.

Keywords: female, trauma, post-traumatic stress disorder, problem alcohol use, systemic analysis

Procedia PDF Downloads 180
1322 Comparison of EMG Normalization Techniques Recommended for Back Muscles Used in Ergonomics Research

Authors: Saif Al-Qaisi, Alif Saba

Abstract:

Normalization of electromyography (EMG) data in ergonomics research is a prerequisite for interpreting the data. Normalizing accounts for variability in the data due to differences in participants’ physical characteristics, electrode placement protocols, time of day, and other nuisance factors. Typically, normalized data are reported as a percentage of the muscle’s isometric maximum voluntary contraction (%MVC). Various MVC techniques have been recommended in the literature for normalizing EMG activity of back muscles. This research tests and compares the MVC techniques recommended in the literature for three back muscles commonly used in ergonomics research: the lumbar erector spinae (LES), latissimus dorsi (LD), and thoracic erector spinae (TES). Six healthy males from a university population participated in this research. Five different MVC exercises were compared for each muscle using the Trigno wireless EMG system (Delsys Inc.). Since the LES and TES share similar functions in controlling trunk movements, their MVC exercises were the same: trunk extension at -60°, trunk extension at 0°, trunk extension while standing, hip extension, and the arch test. The MVC exercises identified in the literature for the LD were chest-supported shoulder extension, prone shoulder extension, lat pull-down, internal shoulder rotation, and abducted shoulder flexion. The maximum EMG signal was recorded during each MVC trial, and the averages were then computed across participants. A one-way analysis of variance (ANOVA) was utilized to determine the effect of MVC technique on muscle activity. Post-hoc analyses were performed using the Tukey test. The MVC technique effect was statistically significant for each of the muscles (p < 0.05); however, a larger sample of participants would be needed to detect significant differences in the Tukey tests.
The arch test was associated with the highest EMG average at the LES, and also it resulted in the maximum EMG activity more often than the other techniques (three out of six participants). For the TES, trunk extension at 0° was associated with the largest EMG average, and it resulted in the maximum EMG activity the most often (three out of six participants). For the LD, participants obtained their maximum EMG either from chest-supported shoulder extension (three out of six participants) or prone shoulder extension (three out of six participants). Chest-supported shoulder extension, however, had a larger average than prone shoulder extension (0.263 and 0.240, respectively). Although all the aforementioned techniques were superior in their averages, they did not always result in the maximum EMG activity. If an accurate estimate of the true MVC is desired, more than one technique may have to be performed. This research provides additional MVC techniques for each muscle that may elicit the maximum EMG activity.
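The analysis pipeline described above, a one-way ANOVA on normalized EMG followed by Tukey post-hoc comparisons of MVC techniques, can be sketched as follows. The EMG values, group means, and the sample size of six are simulated for illustration only and are not the study's data:

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Simulated peak EMG (arbitrary normalized units) for three MVC techniques, n = 6 each
arch      = rng.normal(0.26, 0.03, 6)   # arch test
ext_0     = rng.normal(0.24, 0.03, 6)   # trunk extension at 0 degrees
ext_stand = rng.normal(0.18, 0.03, 6)   # trunk extension while standing

# Omnibus test: does MVC technique affect peak EMG?
f_stat, p_value = f_oneway(arch, ext_0, ext_stand)

# Tukey HSD post-hoc: which pairs of techniques differ?
values = np.concatenate([arch, ext_0, ext_stand])
groups = ["arch"] * 6 + ["ext0"] * 6 + ["stand"] * 6
tukey = pairwise_tukeyhsd(values, groups, alpha=0.05)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
print(tukey.summary())
```

With only six participants per group, small mean differences often fail the pairwise Tukey test even when the omnibus ANOVA is significant, which mirrors the sample-size limitation the abstract reports.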

Keywords: electromyography, maximum voluntary contraction, normalization, physical ergonomics

Procedia PDF Downloads 189
1321 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data are used. Usually, however, the proportion of complete records is rather small, which means most of the information is neglected; moreover, the complete records may themselves form a strongly distorted subset. In addition, the reason that data are missing might itself contain information, which is ignored with that approach. An interesting issue is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete data (the baseline), and how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After having found the optimal parameter set for each algorithm, the missing values are being imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates which are based on imputed data sets do not differ significantly from each other; however, the demand estimate that is derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative for the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though it involves additional procedures such as data imputation.
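The evaluation loop described above, artificially masking known values, imputing them, and comparing against the truth, can be sketched with a nearest-neighbour imputer. The variables, the room-price relationship, and the choice of KNN imputation (one of many clustering-style imputers) are assumptions for illustration, not the study's actual algorithms or data:

```python
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(2)
n = 200
rooms = rng.integers(1, 6, size=n).astype(float)
# Hypothetical relationship: desired price budget grows with room count
price = 1.0 + 0.6 * rooms + rng.normal(scale=0.3, size=n)
X = np.column_stack([rooms, price])

# Mask ~30% of the price entries to mimic incomplete search subscriptions
X_missing = X.copy()
mask = rng.random(n) < 0.3
X_missing[mask, 1] = np.nan

# Impute missing prices from the 5 most similar subscriptions
imputer = KNNImputer(n_neighbors=5)
X_imputed = imputer.fit_transform(X_missing)

# Score the imputation against the held-out true values
rmse = np.sqrt(np.mean((X_imputed[mask, 1] - X[mask, 1]) ** 2))
print(f"imputation RMSE: {rmse:.3f}")
```

Repeating this over several algorithms and parameter settings, as the abstract describes, yields the optimal parameter set per algorithm before the final imputation is run.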

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 280
1320 A Memristive Device with Intrinsic Rectification Behavior and Performance of Crossbar Arrays

Authors: Yansong Gao, Damith C. Ranasinghe, Said F. Al-Sarawi, Omid Kavehei, Derek Abbott

Abstract:

A passive crossbar array is in principle the simplest functional electrical circuit and, together with a memristive device at each cross-point, holds great promise for future high-density, non-volatile memories. However, the greatest problem of crossbar arrays is sneak path current. In this paper, we investigate a type of memristive device with intrinsic rectification behavior to address sneak path currents. First, a SPICE behavioral model of the memristive device, written in the Verilog-A language, is presented and fitted to experimental data published in the literature. Next, systematic performance simulations of a crossbar array that uses the self-rectifying memristive device as the storage element at each cross-point are conducted, covering read margin and power consumption with respect to different crossbar sizes, interconnect resistance, HRS/LRS ratio (High Resistance State / Low Resistance State), rectification ratio, and different read schemes. Subsequently, trade-offs among read margin, power consumption, and read schemes are analyzed to provide guidelines for circuit design. Finally, a performance comparison between memristive devices with and without intrinsic rectification behavior is given to show the value of this intrinsic rectification behavior.
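A first-order feel for why intrinsic rectification improves read margin can be had from a lumped resistor model of the classical worst-case sneak path: all unselected cells in LRS, three cells per path, one of them reverse-biased. This toy model is not the paper's SPICE/Verilog-A model; the resistance values and the lumping approximation are illustrative assumptions:

```python
def read_margin(n, r_lrs, r_hrs, r_sense, rect_ratio, v_read=1.0):
    """Worst-case read margin of an n x n crossbar with a lumped sneak-path model.

    Each sneak path traverses three unselected LRS cells, one of which is
    reverse-biased and therefore scaled by the rectification ratio; roughly
    (n - 1) such paths act in parallel (a common first-order approximation).
    """
    r_path = (2.0 + rect_ratio) * r_lrs
    r_sneak = r_path / (n - 1)

    def v_out(r_cell):
        # Selected cell in parallel with the sneak network, sensed over r_sense
        r_eff = 1.0 / (1.0 / r_cell + 1.0 / r_sneak)
        return v_read * r_sense / (r_sense + r_eff)

    # Margin = sense-voltage difference between reading LRS and HRS
    return v_out(r_lrs) - v_out(r_hrs)

for n in (16, 64, 256):
    m_rect = read_margin(n, 1e4, 1e6, 1e4, rect_ratio=1e3)
    m_plain = read_margin(n, 1e4, 1e6, 1e4, rect_ratio=1.0)
    print(f"{n:>4}x{n}: rectified {m_rect:.3f} V, non-rectified {m_plain:.3f} V")
```

Even this crude model reproduces the qualitative trends the full simulations quantify: without rectification the margin collapses as the array grows, while a high rectification ratio keeps the sneak network resistive and the margin usable.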

Keywords: memristive device, memristor, crossbar, RRAM, read margin, power consumption

Procedia PDF Downloads 434
1319 Antigen Stasis can Predispose Primary Ciliary Dyskinesia (PCD) Patients to Asthma

Authors: Nadzeya Marozkina, Joe Zein, Benjamin Gaston

Abstract:

Introduction: We have observed that many patients with Primary Ciliary Dyskinesia (PCD) benefit from asthma medications. In healthy airways, ciliary function is normal: antigens and irritants are rapidly cleared, and NO enters the gas phase normally to be exhaled. In PCD airways, however, antigens such as Dermatophagoides are not as well cleared. This defect leads to oxidative stress, marked by increased DUOX1 expression and decreased superoxide dismutase [SOD] activity (manuscript under revision). H₂O₂, present at high concentrations in the PCD airway, injures the airway, and NO is oxidized rather than exhaled, forming cytotoxic peroxynitrous acid. Thus, antigen stasis on the PCD airway epithelium leads to airway injury and may predispose PCD patients to asthma. Indeed, recent population genetics suggest that PCD genes may be associated with asthma. We therefore hypothesized that PCD patients would be predisposed to asthma. Methods: We analyzed our database of 18 million individual electronic medical records (EMRs) in the Indiana Network for Patient Care research database (INPCR). There is no ICD-10 code for PCD itself; code Q34.8 is most commonly used clinically. To validate analysis of this code, we queried patients who had ICD-10 codes for both bronchiectasis and situs inversus totalis in INPCR. We also studied a validation cohort using the IBM Explorys® database (over 80 million individuals). Analyses were adjusted for age, sex, and race using a 1 PCD : 3 controls matching method in INPCR and multivariable logistic regression in the IBM Explorys® database. Results: The prevalence of asthma ICD-10 codes in subjects with code Q34.8 was 67% vs. 19% in controls (P < 0.0001) (Regenstrief Institute). Similarly, in IBM Explorys®, the OR [95% CI] for having asthma if a patient also had ICD-10 code Q34.8, relative to controls, was 4.04 [3.99; 4.09].
For situs inversus alone, the OR [95% CI] was 4.42 [4.14; 4.71]; for bronchiectasis alone, 10.68 [10.56; 10.79]; and for bronchiectasis and situs inversus together, 28.80 [23.17; 35.81]. Conclusions: PCD causes antigen stasis in the human airway (under review), likely predisposing to asthma in addition to oxidative and nitrosative stress and airway injury. Here we show that, by several different population-based metrics and using two large databases, patients with PCD appear to have between a three- and 28-fold increased risk of having asthma. These data suggest that additional studies should be undertaken to understand the role of ciliary dysfunction in the pathogenesis and genetics of asthma. Decreased antigen clearance caused by ciliary dysfunction may be a risk factor for asthma development.
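Odds ratios with confidence intervals of the kind reported above can be computed from a 2×2 exposure-outcome table with the standard Woolf (log) method. The counts below are illustrative placeholders only, not the study's patient data, which are not reported here:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf log method) from a 2x2 table:
       a = exposed cases, b = exposed non-cases,
       c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts shaped like the 67% vs. 19% prevalences in the abstract
or_, lo, hi = odds_ratio_ci(67, 33, 19, 81)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note how wide the interval is at these small illustrative counts; the narrow intervals in the abstract reflect the millions of records behind them.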

Keywords: antigen, PCD, asthma, nitric oxide

Procedia PDF Downloads 97
1318 Land Cover Mapping Using Sentinel-2, Landsat-8 Satellite Images, and Google Earth Engine: A Study Case of the Beterou Catchment

Authors: Ella Sèdé Maforikan

Abstract:

Accurate land cover mapping is essential for effective environmental monitoring and natural resources management. This study focuses on assessing the classification performance of two satellite datasets and evaluating the impact of different input feature combinations on classification accuracy in the Beterou catchment, situated in the northern part of Benin. Landsat-8 and Sentinel-2 images from June 1, 2020, to March 31, 2021, were utilized. Employing the Random Forest (RF) algorithm on Google Earth Engine (GEE), a supervised classification categorized the land into five classes: forest, savannas, cropland, settlement, and water bodies. GEE was chosen due to its high-performance computing capabilities, mitigating computational burdens associated with traditional land cover classification methods. By eliminating the need for individual satellite image downloads and providing access to an extensive archive of remote sensing data, GEE facilitated efficient model training on remote sensing data. The study achieved commendable overall accuracy (OA), ranging from 84% to 85%, even without incorporating spectral indices and terrain metrics into the model. Notably, the inclusion of additional input sources, specifically terrain features like slope and elevation, enhanced classification accuracy. The highest accuracy was achieved with Sentinel-2 (OA = 91%, Kappa = 0.88), slightly surpassing Landsat-8 (OA = 90%, Kappa = 0.87). This underscores the significance of combining diverse input sources for optimal accuracy in land cover mapping. The methodology presented herein not only enables the creation of precise, expeditious land cover maps but also demonstrates the prowess of cloud computing through GEE for large-scale land cover mapping with remarkable accuracy. The study emphasizes the synergy of different input sources to achieve superior accuracy. 
As a future recommendation, the application of Light Detection and Ranging (LiDAR) technology is proposed to enhance vegetation type differentiation in the Beterou catchment. Additionally, a cross-comparison between Sentinel-2 and Landsat-8 for assessing long-term land cover changes is suggested.
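A local analogue of the classification workflow above, a Random Forest on spectral and terrain features scored by overall accuracy and Cohen's kappa, can be sketched with scikit-learn on synthetic pixels. The band values, class structure, and sample counts are invented for illustration and do not reproduce the Beterou catchment data or the GEE implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(3)
classes = ["forest", "savanna", "cropland", "settlement", "water"]
# Toy per-class means for (red reflectance, NIR reflectance, elevation in m)
means = np.array([[0.05, 0.45, 350], [0.10, 0.35, 340], [0.15, 0.30, 330],
                  [0.25, 0.25, 335], [0.03, 0.02, 320]])
X = np.vstack([m + rng.normal(scale=[0.02, 0.03, 10], size=(200, 3)) for m in means])
y = np.repeat(np.arange(5), 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
oa = accuracy_score(y_te, pred)
kappa = cohen_kappa_score(y_te, pred)
print(f"OA = {oa:.2f}, Kappa = {kappa:.2f}")
```

Dropping the elevation column and refitting shows the kind of accuracy gain from terrain features that the study reports, albeit on toy data.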

Keywords: land cover mapping, Google Earth Engine, random forest, Beterou catchment

Procedia PDF Downloads 58
1317 Talking to Ex-Islamic State Fighters inside Iraqi Prisons: An Arab Woman’s Perspective on Radicalization and Deradicalization

Authors: Suha Hassen

Abstract:

This research aims to untangle the complexity of conducting face-to-face interviews with 80 ex-Islamic State fighters, encompassing three groups: local Iraqis, Arabs from the Middle East, and international fighters from around the globe. Each interview lasted approximately two hours and was conducted in both Arabic and English, focusing on the motivations behind joining the Islamic State and the pathways and mechanisms facilitating their involvement. The phenomenon of individuals joining violent Islamist extremist and jihadist organizations is multifaceted, drawing substantial attention within terrorism and security studies. Organizations such as the Islamic State, Hezbollah, Hamas, and Al-Qaeda pose formidable threats to international peace and stability, employing various terrorist tactics for radicalization and recruitment. However, significant gaps remain in current studies, including a lack of firsthand accounts, an inadequate understanding of original narratives (religious and linguistic) due to abstraction and misinterpretation of motivations, and a lack of Arab women's perspectives from the region. This study addresses these gaps by exploring the cultural, religious, and historical complexities that shape the narratives of ex-ISIS fighters. The paper will showcase three distinct cases: one French prisoner, one Moroccan fighter, and a local Iraqi, illustrating the diverse motivations and experiences that contribute to joining and leaving extremist groups. The findings provide valuable insights into the nuanced dynamics of radicalization, emphasizing the need for gender-sensitive approaches in counter-terrorism strategies and deradicalization programs. Importantly, this research has practical implications for counter-narrative policies and early-stage prevention of radicalization. By understanding the narratives used by ex-fighters, policymakers can develop targeted counter-narratives that disrupt recruitment efforts. 
Additionally, insights into the mechanisms of radicalization can inform early intervention programs, helping to identify and support at-risk individuals before they become entrenched in extremist ideologies. Ultimately, this research enhances our understanding of the individual experiences of ex-ISIS fighters and calls for a reevaluation of the narratives surrounding women’s roles in extremism and recovery.

Keywords: Arab women in extremism, counter-narrative policy, ex-ISIS fighters in Iraq, radicalization

Procedia PDF Downloads 7
1316 In vitro and in vivo Infectivity of Coxiella burnetii Strains from French Livestock

Authors: Joulié Aurélien, Jourdain Elsa, Bailly Xavier, Gasqui Patrick, Yang Elise, Leblond Agnès, Rousset Elodie, Sidi-Boumedine Karim

Abstract:

Q fever is a worldwide zoonosis caused by the gram-negative obligate intracellular bacterium Coxiella burnetii. Following the recent outbreaks in the Netherlands, a hypervirulent clone was found to be the cause of severe human cases of Q fever. In livestock, the main clinical manifestation of Q fever is abortion. Although abortion rates differ between ruminant species, the virulence of C. burnetii remains understudied, especially in enzootic areas. In this study, the infectious potential of three C. burnetii isolates collected from French farms of small ruminants was compared to the reference strain Nine Mile (in phase II and in an intermediate phase) using an in vivo (CD1 mouse) model. Mice were challenged with 10⁵ live bacteria, as discriminated by propidium monoazide qPCR targeting the icd gene. After footpad inoculation, the spleen and popliteal lymph node were harvested at 10 days post-inoculation (p.i.). Strain invasiveness in the spleen and popliteal nodes was assessed by qPCR assays targeting the icd gene. Preliminary results showed that the avirulent strains (in phase II) failed to pass the popliteal barrier and thus to colonize the spleen. This model allowed a significant differentiation of strain invasiveness in the biological host and therefore the identification of distinct virulence profiles. In view of these results, we plan to go further by testing fifteen additional C. burnetii isolates from French farms of sheep, goats, and cattle using the above-mentioned in vivo model. All 15 strains display distant MLVA (multiple-locus variable-number tandem repeat analysis) genotypic profiles. Five of the fifteen isolates will also be tested in vitro on ovine and bovine macrophage cells. Cells and supernatants will be harvested on days 1, 2, 3, and 6 p.i. to assess the in vitro multiplication kinetics of the strains.
In conclusion, our findings might help the implementation of surveillance of virulent strains and ultimately allow adapting prophylaxis measures in livestock farms.

Keywords: Q fever, invasiveness, ruminant, virulence

Procedia PDF Downloads 357
1315 Towards Modern Approaches of Intelligence Measurement for Clinical and Educational Practices

Authors: Alena Kulikova, Tatjana Kanonire

Abstract:

Intelligence research is one of the oldest fields of psychology, and many factors have made research on intelligence, defined as reasoning and problem solving [1, 2], an acute and urgent topic. It has been repeatedly shown that intelligence is a predictor of academic, professional, and social achievement in adulthood (for example, [3]); moreover, intelligence predicts these achievements better than any other trait or ability [4]. At the individual level, a comprehensive assessment of intelligence is a necessary criterion for the diagnosis of various mental conditions; for example, it is a necessary condition for psychological, medical, and pedagogical commissions when deciding on educational needs and the most appropriate educational programs for school children. Assessment of intelligence is crucial in clinical psychodiagnostics and requires high-quality measurement tools. It is therefore not surprising that the development of intelligence tests is an essential part of psychological science and practice. Many modern intelligence tests have a long history and have been used for decades, for example, the Stanford-Binet test or the Wechsler test. However, the vast majority of these tests are based on the classic linear test structure, in which all respondents receive all tasks (see, for example, the critical review in [5]). This understanding of the testing procedure is a legacy of the pre-computer era, in which paper-and-pencil testing was the only diagnostic procedure available [6]; it has significant limitations that affect the reliability of the data obtained [7] and increases time costs. Another problem with measuring IQ is that classical linearly structured tests do not fully allow measurement of a respondent's intellectual progress [8], which is undoubtedly a critical limitation. Advances in modern psychometrics make it possible to avoid the limitations of existing tools.
However, as in any rapidly developing field, psychometrics does not at the moment offer ready-made, straightforward solutions and requires additional research. In our presentation, we discuss the strengths and weaknesses of current approaches to intelligence measurement and highlight "points of growth" for creating a test in accordance with modern psychometrics: whether it is possible to create an instrument that uses the achievements of modern psychometrics while remaining valid and practically oriented, and what the possible limitations of such an instrument would be. The theoretical framework and study design for creating and validating an original Russian comprehensive computerized test measuring intellectual development in school-age children will be presented.
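The contrast between the classic linear structure and the adaptive designs named in the keywords can be illustrated with a minimal computerized-adaptive-testing loop under the Rasch (1PL) model: administer the unused item whose difficulty is closest to the current ability estimate, then re-estimate ability by maximum likelihood. The item bank, grid estimator, and fixed-length stopping rule are simplifying assumptions for this sketch, not the authors' instrument:

```python
import numpy as np

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def adaptive_test(true_theta, item_bank, n_items=20, rng=None):
    """Minimal CAT loop: item selection by difficulty match, grid-search ML ability estimate."""
    rng = rng or np.random.default_rng(0)
    grid = np.linspace(-4, 4, 161)
    used, responses, theta = [], [], 0.0
    for _ in range(n_items):
        free = [i for i in range(len(item_bank)) if i not in used]
        i = min(free, key=lambda j: abs(item_bank[j] - theta))  # 1PL information peaks at b = theta
        used.append(i)
        responses.append(rng.random() < rasch_p(true_theta, item_bank[i]))
        # Re-estimate ability: log-likelihood of all answers so far over the grid
        b = item_bank[used]
        p = rasch_p(grid[:, None], b[None, :])
        ll = np.where(responses, np.log(p), np.log(1 - p)).sum(axis=1)
        theta = grid[np.argmax(ll)]
    return theta

bank = np.linspace(-3, 3, 60)  # hypothetical item difficulties
est = adaptive_test(true_theta=1.0, item_bank=bank)
print(f"estimated ability: {est:.2f}")
```

Because each examinee answers only items near their own level, an adaptive test of 20 items can match the precision of a much longer linear test, which is exactly the time-cost argument made above.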

Keywords: intelligence, psychometrics, psychological measurement, computerized adaptive testing, multistage testing

Procedia PDF Downloads 74
1314 Machiavellian Language at Work: The Signs of Machiavellianism in Work-Related Interviews

Authors: Gyongyver Csapo, Andrea Czibor

Abstract:

Machiavellianism is a personality trait based on the exploitation and deception of others. Machiavellian individuals are motivated to gain and maintain power with the help of their strategic thinking, manipulation tactics, and interpersonal skills. Consequently, Machiavellianism is treated as a personality trait that can affect an individual’s career and work-related behavior. The aim of our research is to provide a narrative psychological approach to Machiavellianism in order to get a more comprehensive picture of the attitudes, values, and work-related behaviors of Machiavellian individuals. In this study, semi-structured interviews were conducted with employees (N=275) about their work-related experiences. Additionally, participants completed questionnaires about their turnover intention and perceived stress. The interviews were examined with narrative psychological content analysis and thematic analysis. Based on the thematic analysis, mentions of two topics (recognition at work and control) were associated with Machiavellianism. Scientific narrative psychological content analysis showed a negative association between Machiavellianism and positive emotions. Turnover intention and the magnitude of perceived work-related stress showed significant positive correlations with Machiavellianism. In this study, qualitative and quantitative methodologies were combined in order to gain a deeper insight into Machiavellianism from an organizational psychological perspective. Our research can contribute to a better understanding of this personality trait and provides an excellent basis for further investigations.

Keywords: machiavellianism, narrative psychology, turnover intention, work-related stress

Procedia PDF Downloads 128
1313 Storm-Runoff Simulation Approaches for External Natural Catchments of Urban Sewer Systems

Authors: Joachim F. Sartor

Abstract:

According to German guidelines, external natural catchments are larger sub-catchments without significant portions of impervious area, which possess a surface drainage system and empty into a sewer network. Basically, such catchments should be disconnected from sewer networks, particularly from combined systems. If this is not possible due to local conditions, their flow hydrographs have to be considered in the design of sewer systems, because their impact may be significant. Since sufficient measurements of storm-runoff events for such catchments, and hence verified simulation methods to analyze their design flows, are lacking, German standards give only general advice and demand special consideration in such cases. Compared to urban sub-catchments, external natural catchments exhibit greatly different flow characteristics: with increasing area, their hydrological behavior approximates that of rural catchments, e.g. sub-surface flow may prevail and lag times are comparably long. The literature offers few observed peak flow values and only simple (mostly empirical) approaches for Central Europe; most of them are at least helpful for cross-checking results achieved by simulation without calibration. Using storm-runoff data from five monitored rural watersheds in the west of Germany with catchment areas between 0.33 and 1.07 km², the author investigated, by multiple-event simulation, three different approaches to determine rainfall excess: the modified SCS variable runoff-coefficient methods of Lutz and Zaiß, as well as the soil moisture model of Ostrowski. Selection criteria for storm events from continuous precipitation data were taken from the recommendations of M 165, and the runoff concentration method (parallel cascades of linear reservoirs) from a DWA working report to which the author had contributed. In general, the two runoff-coefficient methods showed results of sufficient accuracy for most practical purposes.
The soil moisture model showed no significant better results, at least not to such a degree that it would justify the additional data collection that its parameter determination requires. Particularly typical convective summer events after long dry periods, that are often decisive for sewer networks (not so much for rivers), showed discrepancies between simulated and measured flow hydrographs.
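The two modelling steps named in the abstract can be illustrated with a minimal sketch: the classic SCS curve-number relation for rainfall excess (the Lutz and Zaiß variants modify the runoff coefficient but share this basic structure), followed by a cascade of linear reservoirs for runoff concentration. All parameter values below are illustrative, not calibrated to the monitored watersheds.

```python
def scs_rainfall_excess(p_mm, cn):
    """Cumulative rainfall excess Q [mm] for cumulative rainfall P [mm],
    using the basic SCS curve-number relation (metric form)."""
    s = 25400.0 / cn - 254.0          # potential maximum retention [mm]
    ia = 0.2 * s                      # initial abstraction [mm]
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

def linear_reservoir_cascade(inflow, n, k, dt=1.0):
    """Route an inflow series through n identical linear reservoirs with
    storage coefficient k (same time unit as dt); returns the outflow series."""
    series = inflow
    for _ in range(n):
        out, storage = [], 0.0
        for q_in in series:
            storage += (q_in - storage / k) * dt   # explicit step of dS/dt = q_in - S/k
            out.append(storage / k)
        series = out
    return series

excess = scs_rainfall_excess(50.0, 75)                 # 50 mm storm, CN = 75
hydrograph = linear_reservoir_cascade([excess, 0, 0, 0, 0], n=2, k=2.0)
```

The cascade attenuates and delays the excess pulse, mimicking the long lag times of near-rural catchments; a calibrated model would use measured hydrographs to fit n and k.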

Keywords: external natural catchments, sewer network design, storm-runoff modelling, urban drainage

Procedia PDF Downloads 148
1312 Choice Analysis of Ground Access to São Paulo/Guarulhos International Airport Using Adaptive Choice-Based Conjoint Analysis (ACBC)

Authors: Carolina Silva Ansélmo

Abstract:

Airports are demand-generating poles that affect the flow of traffic around them. The airport access system must be fast, convenient, and adequately planned, considering its potential users. An airport with good ground access conditions can offer users a more satisfactory access experience. When several transport options are available, service providers must understand users' preferences and the expected quality of service. The present study focuses on airport access in a comparative scenario between bus, private vehicle, subway, taxi, and urban mobility (ride-hailing) applications for São Paulo/Guarulhos International Airport. The objectives are (i) to identify the factors that influence the choice, (ii) to measure willingness to pay (WTP), and (iii) to estimate the market share of each mode. The method applied was the Adaptive Choice-Based Conjoint Analysis (ACBC) technique, using Sawtooth Software. Conjoint analysis, rooted in utility theory, is a survey technique that quantifies the utility a customer perceives when choosing among alternatives. Assessing user preferences provides insight into their priorities among product or service attributes. An additional advantage of conjoint analysis is that it requires a smaller sample size than other methods. Furthermore, ACBC provides valuable insight into consumers' preferences, willingness to pay, and market dynamics, aiding strategic decision-making on customer experience, pricing, and market segmentation. In the present research, the ACBC questionnaire covered the following variables: (i) access time to the boarding point, (ii) comfort in the vehicle, (iii) number of travelers in the party, (iv) price, (v) supply availability, and (vi) type of vehicle. The questionnaire collected 213 valid responses for the scenario of access from the São Paulo city center to São Paulo/Guarulhos International Airport.
The results show that price and the number of travelers are the most relevant attributes for the sample when choosing airport access. The largest estimated market share goes to urban mobility applications, followed by buses, private vehicles, taxis, and subways.
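A conjoint study of this kind typically turns estimated part-worth utilities into market shares via a share-of-preference (logit) rule, and derives WTP by scaling a utility difference by the price coefficient. The sketch below uses hypothetical utilities (the actual estimates come from the ACBC survey in Sawtooth Software and are not reported here) purely to show the mechanics.

```python
import math

# Hypothetical aggregate part-worth utilities for the five access modes.
# Real ACBC output is per-respondent; these values only illustrate the
# ordering reported in the abstract.
utilities = {
    "mobility app": 1.2,
    "bus": 0.8,
    "private vehicle": 0.5,
    "taxi": 0.3,
    "subway": 0.1,
}

def logit_shares(u):
    """Share-of-preference: exponentiate utilities and normalize to 1."""
    denom = sum(math.exp(v) for v in u.values())
    return {mode: math.exp(v) / denom for mode, v in u.items()}

def willingness_to_pay(delta_utility, price_coefficient):
    """WTP for an attribute improvement: utility gain divided by the
    absolute marginal (dis)utility of price."""
    return delta_utility / abs(price_coefficient)

shares = logit_shares(utilities)
```

Because the logit transform is monotone, the share ranking always matches the utility ranking, which is why the mode with the highest part-worth (here the mobility app) captures the largest simulated share.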

Keywords: adaptive choice-based conjoint analysis, ground access to airport, market share, willingness to pay

Procedia PDF Downloads 71
1311 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or each segment. This approach allows the user to browse elemental distribution maps of all elements detectable by energy dispersive spectroscopy, and the existing data can be re-evaluated for the presence of previously unconsidered elements without repeating the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN: the dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets, because servers offer larger storage capacity than local drives and allow multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also introduces a newly extended open data format that allows other applications to extract, process, and report AM data, offering the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models.
The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
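Downstream tools consuming such a tabular grain-by-grain export would typically aggregate it into summary statistics such as modal mineralogy. The sketch below post-processes a tiny mock export; the column names and values are hypothetical, since the abstract does not specify the schema of the TIMA open format.

```python
import csv
import io

# Mock grain-by-grain export with hypothetical columns; a real TIMA
# export defines its own schema and many more properties per grain.
export = io.StringIO(
    "grain_id,mineral,area_um2\n"
    "1,quartz,120.0\n"
    "2,pyrite,30.0\n"
    "3,quartz,50.0\n"
)

# Aggregate area per mineral, then normalize to area fractions
# (a simple modal mineralogy estimate).
modal, total = {}, 0.0
for row in csv.DictReader(export):
    area = float(row["area_um2"])
    modal[row["mineral"]] = modal.get(row["mineral"], 0.0) + area
    total += area
modal = {mineral: area / total for mineral, area in modal.items()}
```

The same pattern extends to feeding a database or dashboard: each grain row becomes a record, and aggregations like this one become the plant-performance or geometallurgical indicators mentioned above.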

Keywords: TESCAN TIMA, electron microscopy, SEM, automated mineralogy, database, open format, archiving, big data

Procedia PDF Downloads 105
1310 Music in the Early Stages of Life: Considerations from Working with Groups of Mothers and Babies

Authors: Ana Paula Melchiors Stahlschmidt

Abstract:

This paper discusses the role of music as a ludic activity and as a constituent element of the voice in the construction and consolidation of the relationship between a baby and his/her mother or caretaker, evaluating its implications for the baby's psychic structure and constitution as a subject. The work was based on research developed as part of the author's doctoral activities, carried out within a project of the Music Department of the Federal University of Rio Grande do Sul (UFRGS) whose objective was the development of musical activities with groups of babies from 0 to 24 months old and their caretakers. Observations, video recordings of the meetings, audio testimonies, and evaluation tools applied to group participants were used as research instruments. Information was collected on the participation of 195 babies, 8 of whom were examined in greater depth through interviews with their mothers or caretakers. These interviews were analyzed based on the frameworks of French discourse analysis, psychoanalysis, developmental psychology, and music education. The results of the research were complemented by later experiences that the author developed with similar groups in a private clinic setting. The information collected allowed the observation of the ludic and structural functions of musical activities when developed in a structured environment, as well as the importance of the musicality of the mother's voice for the psychical structuring of the baby, allowing his/her entry into language and his/her constitution as a subject.

Keywords: music and babies, maternal voice, psychoanalysis and music, psychology and music

Procedia PDF Downloads 448