Search results for: Clayton Haske
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15


15 Bivariate Time-to-Event Analysis with Copula-Based Cox Regression

Authors: Duhania O. Mahara, Santi W. Purnami, Aulia N. Fitria, Merissa N. Z. Wirontono, Revina Musfiroh, Shofi Andari, Sagiran Sagiran, Estiana Khoirunnisa, Wahyudi Widada

Abstract:

For assessing interventions in numerous disease areas, the use of multiple time-to-event outcomes is common. An individual might experience two different events, giving bivariate time-to-event data; the events may be correlated because they come from the same subject, and they are also influenced by individual characteristics. The bivariate time-to-event case can be analyzed with a copula-based bivariate Cox survival model, using the Clayton and Frank copulas to capture both the dependence structure between the events and the covariate effects. Applying this method to model recurrent infection events of hemodialysis insertion in chronic kidney disease (CKD) patients, we find from the AIC and BIC values that the Clayton copula model is the best model, with Kendall's tau τ = 0.02.
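
To make the copula step concrete, here is a minimal Python sketch, not taken from the paper, that draws correlated event-time pairs from a Clayton copula via conditional inversion, using the standard relation θ = 2τ/(1 − τ); the hazard rates and seed are hypothetical:

```python
import numpy as np

def sample_clayton(n, tau, seed=None):
    """Draw n pairs (u, v) from a Clayton copula with a given Kendall's tau."""
    rng = np.random.default_rng(seed)
    theta = 2 * tau / (1 - tau)              # Clayton: tau = theta / (theta + 2)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    # Conditional inversion: solve C(v | u) = w for v
    v = (u ** (-theta) * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return u, v

u, v = sample_clayton(1000, tau=0.02, seed=1)
t1 = -np.log(u) / 0.1    # event-1 times under an exponential margin (hazard 0.1)
t2 = -np.log(v) / 0.2    # event-2 times under an exponential margin (hazard 0.2)
```

With the reported τ = 0.02, θ ≈ 0.04, so the simulated event-time pairs are only weakly dependent.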

Keywords: bivariate cox, bivariate event, copula function, survival copula

Procedia PDF Downloads 82
14 Stroke Rehabilitation via Electroencephalogram Sensors and an Articulated Robot

Authors: Winncy Du, Jeremy Nguyen, Harpinder Dhillon, Reinardus Justin Halim, Clayton Haske, Trent Hughes, Marissa Ortiz, Rozy Saini

Abstract:

Stroke often causes death or cerebrovascular (CV) brain damage. Most patients with CV brain damage lose motor control of their limbs. This paper focuses on developing a reliable, safe, and non-invasive EEG-based robot-assisted stroke rehabilitation system to help stroke survivors rapidly restore motor control of their limbs. An electroencephalogram (EEG) recording device (EPOC Headset) was used to detect a patient's brain activity. The EEG signals were then processed, classified, and interpreted into motion intentions, which were converted to a series of robot motion commands. A six-axis articulated robot (AdeptSix 300) was employed to provide the intended motions based on these commands. To enable the EEG device, the computer, and the robot to communicate with each other, an Arduino microcontroller was used to translate the program's output into a series of output-pin states (HIGH or LOW). These "hardware" commands were then sent to a 24 V relay to trigger the robot's motion. A lookup table mapping various motion intentions to their associated EEG signal patterns was created (through training) and installed in the microcontroller. Thus, the motion intention can be determined directly by comparing the EEG patterns obtained from the patient with the lookup table's EEG patterns, and the corresponding motion commands are sent to the robot to provide the intended motion without going through feature extraction and interpretation each time (a time-consuming process). For safety's sake, an extender was designed and attached to the robot's end effector to ensure the patient remains beyond the robot's workspace. A gripper was also designed to hold the patient's limb. Test results show that the rehabilitation system can accurately interpret the patient's motion intention and move the patient's arm to the intended position.
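
As a rough illustration of the lookup-table matching described above, the Python sketch below compares a live EEG feature vector against stored templates by nearest distance; the template vectors, feature dimension, and command names are all invented for illustration:

```python
import numpy as np

# Hypothetical lookup table: trained EEG feature templates -> motion commands
TEMPLATES = {
    "raise_arm": np.array([0.8, 0.1, 0.3]),
    "lower_arm": np.array([0.2, 0.9, 0.4]),
    "rest":      np.array([0.1, 0.1, 0.1]),
}

def classify(features):
    """Return the command whose stored template is nearest to the live features."""
    return min(TEMPLATES, key=lambda cmd: np.linalg.norm(features - TEMPLATES[cmd]))

print(classify(np.array([0.75, 0.15, 0.35])))   # -> raise_arm
# A real controller would then set the output pins / 24 V relay for this command.
```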

Keywords: brain waves, EEG sensor, motion control, robot-assistant stroke rehabilitation

Procedia PDF Downloads 383
13 Innovating and Disrupting Higher Education: The Evolution of Massive Open Online Courses

Authors: Nabil Sultan

Abstract:

A great deal has been written about Massive Open Online Courses (MOOCs) since 2012 (considered by some as the year of the MOOC). The emergence of MOOCs generated great interest among academics and technology experts as well as ordinary people. Some authors perceived MOOCs as the next big thing that would disrupt education. Others saw them as another fad that would go away once it had run its course (as most fads do). But MOOCs did not turn out to be a fad: they are still around and, most importantly, have evolved into something that is beginning to look like a viable business model. This paper explores this phenomenon within the theoretical frameworks of disruptive innovation and jobs to be done as developed by Clayton Christensen and his colleagues, and considers its implications for the future of higher education (HE).

Keywords: MOOCs, disruptive innovations, higher education, jobs theory

Procedia PDF Downloads 270
12 An Investigation into the Isolation and Bandwidth Characteristics of X-Band Chireix Power Amplifier Combiners

Authors: Daniel P. Clayton, Edward A. Ball

Abstract:

This paper describes an investigation into the isolation characteristics and bandwidth performance of RF combiners used in Chireix power amplifier (PA) architectures designed for the X-band frequency range. The combiner designs investigated are the typical Chireix and Wilkinson configurations, including a simulation of the Wilkinson combiner using manufacturer's data for the isolation resistor. The less common approach of forming the combiner from a branch-line coupler was also simulated, as was the effect of adding an additional stage. The paper presents the findings of this investigation and compares the bandwidth performance and isolation characteristics of the designs to determine their suitability.
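
For orientation, the sketch below computes the textbook starting values for the equal-split Wilkinson combiner mentioned above (quarter-wave arms of √2·Z0 and a 2·Z0 isolation resistor); the design frequency and substrate permittivity are assumed, not taken from the paper:

```python
import math

Z0 = 50.0        # system impedance (ohms)
f0 = 10e9        # assumed mid X-band design frequency (Hz)
eps_eff = 2.2    # hypothetical effective permittivity of the board

z_arm = math.sqrt(2) * Z0                        # quarter-wave arm impedance
r_iso = 2 * Z0                                   # isolation resistor
arm_len = 3e8 / (f0 * math.sqrt(eps_eff)) / 4    # physical arm length (m)

print(f"arm Z = {z_arm:.1f} ohm, R_iso = {r_iso:.0f} ohm, "
      f"arm length = {arm_len * 1e3:.2f} mm")
```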

Keywords: bandwidth, Chireix, couplers, outphasing, power amplifiers, Wilkinson, X-Band

Procedia PDF Downloads 257
11 Psychometric Properties and Factor Structure of the College Readiness Questionnaire

Authors: Muna Al-Kalbani, Thuwayba Al Barwani, Otherine Neisler, Hussain Alkharusi, David Clayton, Humaira Al-Sulaimani, Mohammad Khan, Hamad Al-Yahmadi

Abstract:

This study describes the psychometric properties and factor structure of the University Readiness Survey (URS). Survey data were collected from a sample of 2,652 students at Sultan Qaboos University. Exploratory factor analysis identified ten significant factors underlying the structure. Confirmatory factor analysis showed a good fit to the data, with indices for the revised model of χ²(df = 1669) = 6093.4, CFI = 0.900, GFI = 0.926, PCLOSE = 1.00, and RMSEA = 0.030, each of which met its threshold. The overall Cronbach's alpha was 0.899, indicating that the instrument scores were reliable. The results imply that the URS is a valid measure of the college readiness pattern among Sultan Qaboos University students, and the Arabic version could be used by university counselors to identify students' readiness factors. Nevertheless, further validation of the URS is recommended.
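
As a reminder of how the reported reliability figure is computed, here is a minimal Python sketch of Cronbach's alpha on toy data; the scores are illustrative placeholders, not the URS responses:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of survey scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

scores = np.array([[4, 5, 4, 5],   # toy 5-respondent x 4-item matrix
                   [2, 2, 3, 2],
                   [3, 3, 3, 4],
                   [5, 4, 5, 5],
                   [1, 2, 1, 2]])
print(round(cronbach_alpha(scores), 3))
```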

Keywords: college readiness, confirmatory factor analysis, reliability, validity

Procedia PDF Downloads 226
10 Local Radial Basis Functions for Helmholtz Equation in Seismic Inversion

Authors: Hebert Montegranario, Mauricio Londoño

Abstract:

Solutions of the Helmholtz equation are essential in seismic imaging methods such as full waveform inversion, which requires solving the wave equation many times. Traditional methods like the Finite Element Method (FEM) or Finite Differences (FD) produce sparse matrices but may suffer the so-called pollution effect in the numerical solution of the Helmholtz equation for large values of the wave number. On the other hand, global radial basis functions have better accuracy but produce full matrices that become unstable. In this research we combine the virtues of both approaches to find numerical solutions of the Helmholtz equation, by applying a meshless method that produces sparse matrices through local radial basis functions. We solve the equation with absorbing boundary conditions of the Clayton-Engquist and PML (Perfectly Matched Layers) types and compare with results in the standard literature, showing promising performance in tackling both the pollution effect and matrix instability.
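
A minimal sketch of the idea, reduced to one dimension with Dirichlet ends rather than the paper's absorbing boundaries, is shown below; the wavenumber, shape parameter, and stencil size are illustrative choices. Each node's Helmholtz operator u'' + k²u is approximated by weights computed on a small local stencil of Gaussian RBFs, so the assembled system stays sparse:

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

k, eps, n, m = 20.0, 40.0, 201, 5         # wavenumber, RBF shape, nodes, stencil size
x = np.linspace(0.0, 1.0, n)

phi = lambda r: np.exp(-(eps * r) ** 2)   # Gaussian RBF
# (d^2/dx^2 + k^2) applied to phi, as a function of the offset d = x_i - x_j
lphi = lambda d: (4 * eps**4 * d**2 - 2 * eps**2 + k**2) * np.exp(-(eps * d) ** 2)

A = lil_matrix((n, n))
for i in range(1, n - 1):
    idx = np.argsort(np.abs(x - x[i]))[:m]           # m nearest nodes -> sparse row
    xs = x[idx]
    G = phi(np.abs(xs[:, None] - xs[None, :]))       # local interpolation matrix
    A[i, idx] = np.linalg.solve(G, lphi(x[i] - xs))  # RBF-FD weights for node i
A[0, 0] = A[n - 1, n - 1] = 1.0                      # Dirichlet boundary rows

b = np.zeros(n)
b[n - 1] = np.sin(k)                                 # u(0) = 0, u(1) = sin(k)
u = spsolve(A.tocsr(), b)                            # approximates u(x) = sin(k x)
```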

Keywords: Helmholtz equation, meshless methods, seismic imaging, wavefield inversion

Procedia PDF Downloads 547
9 The Low-Cost Design and 3D Printing of Structural Knee Orthotics for Athletic Knee Injury Patients

Authors: Alexander Hendricks, Sean Nevin, Clayton Wikoff, Melissa Dougherty, Jacob Orlita, Rafiqul Noorani

Abstract:

Knee orthotics play an important role in the recovery of those with knee injuries, especially athletes. However, structural knee orthotics are often very expensive, ranging between $300 and $800. The primary question this project set out to answer was: can 3D-printed orthotics represent a viable and cost-effective alternative to present structural knee orthotics? The primary objective was to design a knee orthotic for athletes with knee injuries at a low cost (under $100) and evaluate its effectiveness. The initial design was done in SolidWorks, a computer-aided design (CAD) software package available at Loyola Marymount University. After this design was completed, finite element analysis (FEA) was utilized to understand how normal stresses placed upon the knee affected the orthotic. The knee orthotic was then adjusted and redesigned to meet a specified factor of safety of 3.25, based on the data gathered during FEA and literature sources. Once the FEA was completed and the orthotic redesigned accordingly, the next step was to 3D-print the first design of the knee brace. Subsequently, physical therapy movement trials were used to evaluate physical performance. Using the data from these movement trials, the CAD design of the brace was refined to meet the design requirements. The final goal of this research is to explore the possibility of replacing high-cost, outsourced knee orthotics with a readily available low-cost alternative.
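
For reference, the factor-of-safety check driving the redesign loop reduces to a one-line computation; the material strength and peak stress below are placeholders, not the study's FEA results:

```python
# Hypothetical post-processing of the FEA output: verify the 3.25 factor of safety
yield_strength_mpa = 50.0   # assumed yield strength of the printed material (MPa)
max_von_mises_mpa = 14.2    # assumed peak von Mises stress from the FEA run (MPa)

fos = yield_strength_mpa / max_von_mises_mpa
print(f"FoS = {fos:.2f}", "(meets 3.25 target)" if fos >= 3.25 else "(redesign)")
```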

Keywords: 3D printing, knee orthotics, finite element analysis, design for additive manufacturing

Procedia PDF Downloads 181
8 Re-Envisioning Modernity: Transformations of Postwar Suburban Landscapes

Authors: Shannon Clayton

Abstract:

In an effort to explore the potential transformation of North American postwar suburbs, this M.Arch thesis actively engages the ongoing critique of modernism from the mid-20th century to the present. Contemporary urban design practice has emerged out of the reaction to orthodox modernism. Typically, new suburban development falls into one of two strategies: an attempt to replicate a pre-war fabric that never existed, or a reliance on high density to create instant urbanism. In both cases, the critical role of architecture has been grossly undervalued. Ironically, it is the denial of suburbia's inherent modernity that has prevented genuine place-making. As history demonstrates, modernism is not antithetical to architecture and place. In the postwar years, a critical discussion emerged among architects that sought to evolve modernism beyond functionalism. This was demonstrated through critical discussions of image, experience, and monumentality, as well as increased interest in civic space and investigations into mat urbanism and the megastructure. The undercurrent within these explorations was a belief that the scale and complexity of modern development could become an opportunity to create urbanism, rather than squander it. This critical discourse has continued through architectural work in the Netherlands and Denmark since the early 1990s, where visual variety, human scale, and public interaction have been given high priority. This thesis applies principles from this ongoing dialogue and identifies hidden potential within existing North American suburban networks. As a result, the project re-evaluates the legacy of the master plan from a contemporary perspective.

Keywords: urbanism, modernism, suburbia, place-making

Procedia PDF Downloads 252
7 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

If risk management of company-owned assets, risk assessment of real estate portfolios, and risk identification for an entire region are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. This research focuses on a Sagami Trough earthquake tsunami, which could have a significant effect on the Japanese capital region, and proposes a method for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at the two sites were simulated using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depth (five cases) for each of 10 sources along the Sagami Trough. For each source, the frequency distribution of the tsunami inundation depth was evaluated using the response surface method. Then, Monte Carlo simulation was conducted, and the frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough; these are the marginal distributions. Kendall's tau for the tsunami inundation simulations at the two sites was 0.83. Based on this value, the Gaussian copula, t-copula, Clayton copula, and Gumbel copula were generated (n = 10,000). The simultaneous distributions of the damage rate were then evaluated using the marginal distributions and the copulas. When the correlation of tsunami inundation depth between the two sites was accounted for, the expected value hardly changed compared with the uncorrelated case, but the ninety-ninth percentile of the damage rate was approximately 2%, and the maximum value approximately 6%, when using the Gumbel copula.
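
To illustrate the copula-generation step, the Python sketch below draws Gumbel-copula pairs with the reported Kendall's τ = 0.83 using the Marshall-Olkin algorithm; the sample size matches the abstract's n = 10,000, the seed is arbitrary, and the mapping of the uniforms through each site's marginal inundation-depth distribution is left out:

```python
import numpy as np

def sample_gumbel(n, tau, seed=None):
    """Marshall-Olkin sampling of a bivariate Gumbel copula with Kendall's tau."""
    rng = np.random.default_rng(seed)
    theta = 1.0 / (1.0 - tau)                 # Gumbel: tau = 1 - 1/theta
    alpha = 1.0 / theta
    # Positive alpha-stable variate via the Chambers-Mallows-Stuck formula
    v = rng.uniform(0.0, np.pi, size=n)
    w = rng.exponential(size=n)
    s = (np.sin(alpha * v) / np.sin(v) ** (1.0 / alpha)) \
        * (np.sin((1.0 - alpha) * v) / w) ** ((1.0 - alpha) / alpha)
    e = rng.exponential(size=(2, n))
    return np.exp(-(e / s) ** (1.0 / theta))  # generator psi(t) = exp(-t**(1/theta))

u1, u2 = sample_gumbel(10_000, tau=0.83, seed=0)
# u1, u2 have uniform margins and upper-tail dependence; the study would map them
# through each site's inundation-depth distribution before evaluating damage rates.
```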

Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 143
6 Enabling Translanguaging in the EFL Classroom, Affordances of Learning and Reflections

Authors: Nada Alghali

Abstract:

Translanguaging pedagogy suggests a new perspective in language education relating to multilingualism: multilingual learners have one linguistic repertoire, not two or more separate language systems (García and Wei, 2014). When learners translanguage, they are able to draw on all their language features in a flexible and integrated way (Otheguy, García, & Reid, 2015). In the foreign language classroom, however, a target-language-only pedagogy is still widely advocated. This study attempts to enable learners in the English as a foreign language (EFL) classroom to draw on their full linguistic repertoire through collaborative reading lessons. In observations prior to this study, in a classroom where an English-only policy prevailed, learners still used their first language in group discussions yet were at times constrained by the teacher's language policies. By strategically enabling translanguaging in reading lessons (Celic and Seltzer, 2011), this study reveals that learners showed creative ways of using language for learning and reflected positively on the experience. This case study enabled translanguaging for two groups in classrooms at two different proficiency levels, whose learners were in their first year of learning English as a foreign language at a university in Saudi Arabia. Learners in the two groups were observed over six weeks and were asked to reflect on their learning every week. The same learners were also interviewed at the end of the translanguaging weeks after completing a modified model of learning reflection (Ash and Clayton, 2009). This study positions translanguaging as collaborative and agentive within a sociocultural framework of learning, treating translanguaging as both a resource for learning and a process of learning. Translanguaging learning episodes are elicited from classroom observations, artefacts, interviews, reflections, and focus groups, and are analysed qualitatively following sociocultural discourse analysis (Fairclough & Wodak, 1997; Mercer, 2004). Initial outcomes suggest functions of translanguaging in collaborative reading tasks and yield recommendations for a collaborative translanguaging pedagogy in the EFL classroom.

Keywords: translanguaging, EFL, sociocultural theory, discourse analysis

Procedia PDF Downloads 180
5 Modelling Volatility Spillovers and Cross Hedging among Major Agricultural Commodity Futures

Authors: Roengchai Tansuchat, Woraphon Yamaka, Paravee Maneejuk

Abstract:

In recent years, the global financial crisis, economic instability, and large fluctuations in agricultural commodity prices have led to increased concern about volatility transmission among commodities. The problem is further exacerbated when a commodity's volatility is driven by fluctuations in other commodity prices, making hedging decisions both costly and ineffective. This paper therefore analyses the volatility spillover effects among major agricultural commodities, including corn, soybeans, wheat, and rice, to help commodity suppliers hedge their portfolios and manage their risk and co-volatility. We provide a switching-regime approach to analysing volatility spillovers under different economic conditions, namely economic upturns and downturns. In particular, we investigate the relationships and volatility transmission between these commodities under these different conditions. We propose a copula-based multivariate Markov-switching GARCH model with two regimes that depend on economic conditions, and perform a simulation study to check the accuracy of the proposed model. The correlation term in the cross-hedge ratio is obtained from six copula families: two elliptical copulas (Gaussian and Student-t) and four Archimedean copulas (Clayton, Gumbel, Frank, and Joe). We use one-step maximum likelihood estimation to estimate our models and compare the performance of these copulas using the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). In the application to agricultural commodities, weekly data from 4 January 2005 to 1 September 2016, covering 612 observations, are used. The empirical results indicate that the volatility spillover effects among cereal futures differ in response to different economic conditions. In addition, the hedge-effectiveness results suggest the optimal cross-hedge strategies under different economic conditions, especially economic upturns and downturns.
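
As a simplified illustration of the cross-hedging objective (not the paper's Markov-switching GARCH-copula estimator, where the correlation would come from the fitted copula), the sketch below computes a minimum-variance hedge ratio on synthetic weekly returns of the same length as the study's sample:

```python
import numpy as np

rng = np.random.default_rng(0)
spot = rng.normal(0.0, 0.02, 612)                   # hypothetical weekly returns
futures = 0.7 * spot + rng.normal(0.0, 0.015, 612)  # correlated futures returns

rho = np.corrcoef(spot, futures)[0, 1]
h = rho * spot.std(ddof=1) / futures.std(ddof=1)    # minimum-variance hedge ratio
hedged = spot - h * futures
print(f"h* = {h:.3f}, variance reduction = {1 - hedged.var() / spot.var():.1%}")
```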

Keywords: agricultural commodity futures, cereal, cross-hedge, spillover effect, switching regime approach

Procedia PDF Downloads 202
4 Inequalities in Gastrointestinal Infections between UK Ethnic Groups: A Systematic Review and Narrative Synthesis

Authors: Iram Zahair, Tanith Rose, Oyinlola Oyebode, Stephen Clayton, Iman Ghosh, Michelle Maden, Ben Barr

Abstract:

Background: Gastrointestinal infections exert a significant public health burden on UK healthcare services and the community. However, there are conflicting findings on where ethnic inequalities are likely to persist. This systematic review aimed to identify studies that ascertain differences in the incidence and prevalence of gastrointestinal infections within and between UK ethnic groups, and to explore possible explanations for the heterogeneity observed within the literature. Methods: Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidance, a systematic review methodology was used. Medline, Web of Science, CINAHL Plus, and grey literature were searched from 1980 to 2021 for studies reporting an association between ethnicity and gastrointestinal infections in UK population samples. Two reviewers independently screened the articles and conducted quality appraisals; data extraction was undertaken by one reviewer and verified by two reviewers (PROSPERO CRD42021240714). A narrative synthesis was undertaken to synthesise the study findings. Results: The searches identified 8,134 studies, of which 13 met the inclusion criteria. Twelve of the 13 studies found a difference in the prevalence of gastrointestinal infections between ethnic groups. In 12 studies, UK ethnic minorities, predominantly men and children of Asian ethnicity, had a higher risk of infection than the white British majority; the Pakistani ethnic group had a higher risk of infection in three of the 13 studies. Studies reported that age and sex confounded the relationship between ethnicity and gastrointestinal infections, while country of birth, socioeconomic status, and the geographical location of ethnic groups mediated this association and largely explained the heterogeneity observed across the studies. Harvest plots supported the textual synthesis. Conclusion: This systematic review highlights the lack of extensive UK quantitative evidence examining the association between ethnicity and gastrointestinal infections. Insights into this association can inform policy actions to mitigate the inequalities identified within and between UK ethnic groups.

Keywords: ethnic and racial populations, public health, public health policy, systematic review

Procedia PDF Downloads 108
3 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and, potentially, subsequent risk assessment of different types of structures. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques, such as Artificial Neural Networks, Random Forests, and Support Vector Machines, as the statistical method in ground motion prediction. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events of magnitude 3 to 5.8, recorded over hypocentral distances of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. This database was chosen because of the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for them. Accuracy in predicting intensity measures, generalization capability for future data, and usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method; in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available.
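
A minimal version of the random-forest alternative might look like the Python sketch below; the synthetic attenuation relation and all of its coefficients are invented stand-ins for the Oklahoma/Kansas/Texas database, and the feature set is an assumption:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
mag = rng.uniform(3.0, 5.8, n)        # magnitude range matching the database
dist = rng.uniform(4.0, 500.0, n)     # hypocentral distance (km)
vs30 = rng.uniform(200.0, 800.0, n)   # site shear-wave velocity (m/s), assumed
ln_pga = (1.2 * mag - 1.5 * np.log(dist) - 0.002 * dist
          - 0.3 * np.log(vs30 / 500.0) + rng.normal(0.0, 0.6, n))  # toy attenuation

X = np.column_stack([mag, dist, vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(f"held-out R^2: {model.score(X_te, y_te):.2f}")
```

Unlike the conventional approach, no functional form for magnitude scaling or distance attenuation is specified; the forest learns it from the records.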

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 122
2 Machine Learning Techniques in Seismic Risk Assessment of Structures

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of the seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used to forecast ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques, such as Artificial Neural Networks, Random Forests, and Support Vector Machines, as the statistical method in ground motion prediction. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method; in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined limit states, and therefore control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as Artificial Neural Networks, Random Forests, and Support Vector Machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate predictions in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analyses.
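
For contrast with the machine learning surrogates, the sketch below shows the conventional PSDM regression in log-log space, ln(demand) = ln(a) + b·ln(IM), on synthetic data, together with the resulting lognormal fragility evaluation; the coefficients, dispersion, and 2% drift limit state are all hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
im = rng.lognormal(mean=-1.0, sigma=0.8, size=400)             # e.g. PGA in g
drift = 0.02 * im ** 0.9 * rng.lognormal(0.0, 0.35, size=400)  # peak drift ratio

slope, intercept, *_ = stats.linregress(np.log(im), np.log(drift))
resid = np.log(drift) - (intercept + slope * np.log(im))
beta = resid.std(ddof=2)                                       # demand dispersion

# Fragility: P(drift > 2% limit state | PGA = 1 g) under the lognormal PSDM
p = 1.0 - stats.norm.cdf((np.log(0.02) - intercept - slope * np.log(1.0)) / beta)
print(f"b = {slope:.2f}, ln a = {intercept:.2f}, beta = {beta:.2f}, P = {p:.2f}")
```

In the paper's approach, the regression step is replaced by ANN, random forest, or SVM models trained on the same demand parameters.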

Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine

Procedia PDF Downloads 106
1 Co-Movement between Financial Assets: An Empirical Study on Effects of the Depreciation of Yen on Asia Markets

Authors: Yih-Wenn Laih

Abstract:

In recent times, the dependence and co-movement among international financial markets have become stronger than in the past, as evidenced by commentary in the news media and the financial sections of newspapers. Studying the co-movement between returns in financial markets is an important issue for portfolio management and risk management. An understanding of co-movement helps investors identify opportunities for international portfolio management in terms of asset allocation and pricing. Since the election of Prime Minister Shinzo Abe in November 2012, the yen has weakened against the US dollar from the 80 level to the 120 level. The policies, known as "Abenomics," are meant to encourage private investment through a more aggressive mix of monetary and fiscal policy. Given the close economic relations and competition among Asian markets, it is interesting to discover how the co-movement relations between the stock market of Japan and five major Asian stock markets (China, Hong Kong, Korea, Singapore, and Taiwan) were affected by the depreciation of the yen. Specifically, we measure the co-movement between the stock market of Japan and each of the five Asian stock markets in terms of rank correlation coefficients. To compute the coefficients, the return series of each stock market is first fitted by a skewed-t GARCH (generalized autoregressive conditional heteroscedasticity) model. Second, to measure the dependence structure between matched stock markets, we employ the symmetrized Joe-Clayton (SJC) copula to calculate the probability density function of paired skewed-t distributions. The joint probability density function is then utilized as the scoring scheme to optimize the sequence alignment by a dynamic programming method. Finally, we compute the rank correlation coefficients (Kendall's τ and Spearman's ρ) between matched stock markets based on their aligned sequences. We collect empirical data on six stock indexes from the Taiwan Economic Journal. The data are sampled at a daily frequency, covering the period from January 1, 2013 to July 31, 2015. The empirical distributions of returns indicate fatter tails than the normal distribution; therefore, the skewed-t distribution and the SJC copula are appropriate for characterizing the data. According to the computed Kendall's τ, Korea has the strongest co-movement relation with Japan, followed by Taiwan, China, and Singapore; the weakest is Hong Kong. On the other hand, the Spearman's ρ reveals that the strength of co-movement with Japan, in decreasing order, is Korea, China, Taiwan, Singapore, and Hong Kong. We thus explore the effects of "Abenomics" on Asian stock markets by measuring the co-movement relation between Japan and five major Asian stock markets in terms of rank correlation coefficients, with the matched markets aligned by a hybrid method consisting of GARCH, copula, and sequence alignment. The empirical experiments indicate that Korea has the strongest co-movement relation with Japan, that the co-movements of China and Taiwan are stronger than that of Singapore, and that the Hong Kong market has the weakest co-movement relation with Japan.
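
The final rank-correlation step can be reproduced with a few lines of Python; the heavy-tailed synthetic return pair below is a placeholder for an actual aligned pair of market return sequences:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
common = rng.standard_t(df=5, size=650)                      # shared market factor
japan = 0.8 * common + 0.6 * rng.standard_t(df=5, size=650)  # fat-tailed returns
korea = 0.8 * common + 0.6 * rng.standard_t(df=5, size=650)

tau, _ = stats.kendalltau(japan, korea)
rho, _ = stats.spearmanr(japan, korea)
print(f"Kendall's tau = {tau:.3f}, Spearman's rho = {rho:.3f}")
```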

Keywords: co-movement, depreciation of Yen, rank correlation, stock market

Procedia PDF Downloads 231