Search results for: fusion feature
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1934

464 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the most fundamental components of many application domains built on video data systems, and it has recently attracted considerable interest from practitioners and academics in different areas. Although video event detection has been the subject of extensive study, comparatively little existing work has considered multi-modal data and the related efficiency issues. During soccer matches, many doubtful situations arise that cannot easily be judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high speed of events. Bayesian networks provide a structure for dealing with this uncertainty by combining a simple graphical structure with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos using object-based features. The proposed work uses the t-cherry junction tree, a very recent development in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable relational model. This approach has several advantages: first, the t-cherry tree gives the best approximation within the class of junction trees; second, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and robustness of the proposed work on a comprehensive data set comprising numerous soccer videos captured at different venues.
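
As a rough illustration of the kind of probabilistic reasoning described above (not the authors' t-cherry junction tree construction), the sketch below builds a tiny Bayesian network over hypothetical object-based features and queries the probability of a "goal" event with pgmpy; the variables, probabilities, and structure are invented for illustration only.

```python
# Minimal Bayesian-network sketch for soccer event detection (illustrative only;
# variables, CPDs, and structure are hypothetical, not the paper's t-cherry model).
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Object-based evidence nodes -> latent "Goal" event node
model = BayesianNetwork([("BallInBox", "Goal"), ("CrowdNoise", "Goal")])

cpd_ball = TabularCPD("BallInBox", 2, [[0.8], [0.2]])      # P(ball in penalty box)
cpd_noise = TabularCPD("CrowdNoise", 2, [[0.7], [0.3]])    # P(crowd noise spike)
cpd_goal = TabularCPD(
    "Goal", 2,
    # P(Goal | BallInBox, CrowdNoise) for the four parent combinations
    [[0.99, 0.9, 0.7, 0.2],    # Goal = 0
     [0.01, 0.1, 0.3, 0.8]],   # Goal = 1
    evidence=["BallInBox", "CrowdNoise"], evidence_card=[2, 2],
)
model.add_cpds(cpd_ball, cpd_noise, cpd_goal)

# Exact inference by variable elimination (pgmpy also offers clique-tree belief
# propagation; the t-cherry variant described in the abstract is not reproduced here)
infer = VariableElimination(model)
print(infer.query(["Goal"], evidence={"BallInBox": 1, "CrowdNoise": 1}))
```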

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 300
463 MRI Quality Control Using Texture Analysis and Spatial Metrics

Authors: Kumar Kanudkuri, A. Sandhya

Abstract:

Typically, in an MRI clinical setting, several protocols are run, each indicated for a specific anatomy and disease condition. However, these protocols, or parameters within them, can change over time due to changes in the recommendations of physician groups, updates to the software, or the availability of new technologies. Most of the time, the changes are made by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) considerations. It is therefore important to give proper guidelines to MRI technologists so that they do not change parameters that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for Quality Control (QC) in order to guarantee that the primary objectives of MRI are met. Visual evaluation of quality depends on the operator/reviewer and can vary between operators, as well as for the same operator at different times. Overcoming these constraints is essential for a more impartial evaluation of quality, which makes quantitative estimation of image quality (IQ) metrics for MRI quality control very important. To address this problem, we propose a robust, open-source, and automated MRI image quality control tool. The designed and developed automatic analysis tool measures MRI image quality (IQ) metrics such as Signal to Noise Ratio (SNR), Signal to Noise Ratio Uniformity (SNRU), Visual Information Fidelity (VIF), Feature Similarity (FSIM), Gray Level Co-occurrence Matrix (GLCM), slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution, and it provided good accuracy in the assessment. A standardized quality report is generated that incorporates the metrics that impact diagnostic quality.
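
A minimal sketch of one of the listed metrics: estimating SNR from a uniform signal region and a background air region of a phantom slice with NumPy. The ROI positions and the Rayleigh-noise correction factor are illustrative assumptions, not the tool's actual implementation.

```python
import numpy as np

def snr_from_phantom_slice(img: np.ndarray) -> float:
    """Estimate SNR = mean(signal ROI) / corrected std(background ROI).

    ROI positions are hypothetical; a real QC tool would locate the ACR
    phantom and place the ROIs automatically."""
    signal_roi = img[100:140, 100:140]        # uniform region inside the phantom
    background_roi = img[0:30, 0:30]          # air region outside the phantom
    noise_sd = background_roi.std()
    # Optional correction for Rayleigh-distributed background noise in
    # magnitude MR images (commonly cited factor of roughly 0.66).
    noise_sd_corrected = noise_sd / 0.66
    return float(signal_roi.mean() / noise_sd_corrected)

if __name__ == "__main__":
    demo = np.random.default_rng(0).normal(200.0, 5.0, size=(256, 256))
    demo[0:30, 0:30] = np.abs(np.random.default_rng(1).normal(0.0, 5.0, size=(30, 30)))
    print(f"SNR ~ {snr_from_phantom_slice(demo):.1f}")
```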

Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy

Procedia PDF Downloads 139
462 Analyzing the Influence of Hydrometeorological Extremes, Geological Setting, and Social Demographics on Public Health

Authors: Irfan Ahmad Afip

Abstract:

The main research objective is to accurately identify the likely severity of a leptospirosis outbreak in a given area based on input features fed into a multivariate regression model. The research question is whether the possibility of an outbreak in a specific area is influenced by features such as social demographics and hydrometeorological extremes. If the occurrence of an outbreak is subject to these features, then the epidemic severity of an area will differ depending on its environmental setting, because the features influence both the possibility and the severity of an outbreak. Specifically, the research objectives were three-fold, namely: (a) to identify the relevant multivariate features and visualize the patterns in the data, (b) to develop a multivariate regression model based on the selected features and determine the possibility of a leptospirosis outbreak in an area, and (c) to compare the predictive ability of the multivariate regression model with that of machine learning algorithms. Several secondary data features were collected for locations in the state of Negeri Sembilan, Malaysia, on the basis that they could be relevant for determining outbreak severity in the area. The relevant features then become inputs to a multivariate regression model; a linear regression model is a simple and quick solution for creating prognostic capabilities, and a multivariate regression model has proven to have more precise prognostic capabilities than univariate models. The expected outcome of this research is to establish a correlation between social demographic and hydrometeorological features and the bacteria that cause leptospirosis; it will also contribute to understanding the underlying relationship between the pathogen and the ecosystem. The relationship established can help the health department or urban planners to inspect and prepare for future outcomes in event detection and system health monitoring.
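
A minimal sketch of the multivariate regression step described above: predicting outbreak severity from social-demographic and hydrometeorological features. The column names, data file, and train/test split are hypothetical placeholders, not the study's actual dataset.

```python
# Hypothetical sketch: multivariate linear regression for outbreak severity.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("negeri_sembilan_features.csv")   # assumed file layout (illustrative)
X = df[["rainfall_extreme_days", "population_density",
        "flood_prone_ratio", "median_income"]]      # assumed feature columns
y = df["outbreak_severity"]                          # assumed target column

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
model = LinearRegression().fit(X_train, y_train)

print("coefficients:", dict(zip(X.columns, model.coef_)))
print("R^2 on held-out areas:", r2_score(y_test, model.predict(X_test)))
```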

Keywords: geographical information system, hydrometeorological, leptospirosis, multivariate regression

Procedia PDF Downloads 96
461 Virtue, Truth, Freedom, and the History of Philosophy

Authors: Ashley DelCorno

Abstract:

G. E. M. Anscombe’s 1958 essay Modern Moral Philosophy and the tradition of virtue ethics that followed has given rise to the restoration (or, more plainly, the resurrection) of Aristotle as something of an authority figure. Alasdair MacIntyre and Martha Nussbaum are proponents, for example, not just of Aristotle’s relevancy but also of his apparent implicit authority. That said, it’s not clear that the schema imagined by virtue ethicists accurately describes moral life or that it does not inadvertently work to impoverish genuine decision-making. If the label ‘virtue’ is categorically denied to some groups (while arbitrarily afforded to others), it can only turn on itself, thus rendering ridiculous its own premise. Likewise, as an inescapable feature of virtue ethics, Aristotelian binaries like ‘virtue/vice’ and ‘voluntary/involuntary’ offer up false dichotomies that may seriously compromise an agent’s ability to conceptualize choices that are truly free and rooted in meaningful criteria. Here, this topic is analyzed through a feminist lens predicated on the known paradoxes of patriarchy. The work of feminist theorists Jacqui Alexander, Katharine Angel, Simone de Beauvoir, bell hooks, Audre Lorde, Imani Perry, and Amia Srinivasan serves as important guideposts, and the argument here is built from a key tenet of black feminist thought regarding scarcity and possibility. Above all, it’s clear that though the philosophical tradition of virtue ethics presents itself as recovering the place of agency in ethics, its premises possess crippling limitations toward the achievement of this goal. These include, most notably, virtue ethics’ binding analysis of history, as well as its axiomatic attachment to obligatory clauses, its problematic reading-in of Aristotle, and its arbitrary commitment to predetermined and competitively patriarchal ideas of what counts as a virtue.

Keywords: feminist history, the limits of utopic imagination, curatorial creation, truth, virtue, freedom

Procedia PDF Downloads 70
460 The Suitability of Agile Practices in Healthcare Industry with Regard to Healthcare Regulations

Authors: Mahmood Alsaadi, Alexei Lisitsa

Abstract:

Nowadays, medical devices rely completely on software, whether as standalone software or as embedded software; therefore, organizations that develop medical device software can benefit from adopting agile practices. Using agile practices in healthcare software development would bring benefits such as producing a high-quality product at low cost and in a short period. However, medical device software development companies have faced challenges in adopting agile practices. These are due to the gaps that exist between agile practices and the requirements of healthcare regulations, such as documentation, traceability, and formality. This research paper conducts a study to investigate the adoption rate of agile practices in medical device software development and extracts and outlines the requirements of healthcare regulations such as the Food and Drug Administration (FDA), the Health Insurance Portability and Accountability Act (HIPAA), and the Medical Device Directive (MDD) that affect the software development life cycle directly or indirectly. Moreover, this paper evaluates the suitability of using agile practices in healthcare industries by analyzing the most popular agile practices, such as eXtreme Programming (XP), Scrum, and Feature-Driven Development (FDD), from the healthcare industry's point of view and in comparison with the requirements of healthcare regulations. Finally, the authors propose an agile mixture model that consists of different practices from different agile methods. As a result, the adoption rate of agile practices in healthcare industries is still low, and agile practices should be enhanced with regard to the requirements of healthcare regulations in order to be used in healthcare software development organizations. The proposed agile mixture model may therefore assist in minimizing the gaps between healthcare regulations and agile practices and increase the adoption rate in the healthcare industry. As this paper is part of an ongoing project, an evaluation of the agile mixture model will be conducted in the near future.

Keywords: adoption of agile, agile gaps, agile mixture model, agile practices, healthcare regulations

Procedia PDF Downloads 218
459 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection

Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine

Abstract:

Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation; however, the published results do not show satisfactory classification accuracy. This work aimed at resolving this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (the AF Termination Challenge Database, MIT-BIH AF, the Normal Sinus Rhythm RR Interval Database, and the MIT-BIH Normal Sinus Rhythm Database) were used for assessment. All time series were segmented into 1-min RR interval windows, and four specific features were then calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find the important features for discriminating between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms do, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection method holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
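
An illustrative sketch of the pipeline shape described above: four RR-interval features per 1-minute window, PCA for feature reduction, and a minimal LVQ1 classifier. The exact features, LVQ configuration, and data handling used in the study are not specified here; the demo data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA

def rr_features(rr):                       # rr: RR intervals (s) in one 1-min window
    diff = np.diff(rr)
    return np.array([
        rr.mean(),                         # mean RR
        rr.std(),                          # SDNN
        np.sqrt(np.mean(diff ** 2)),       # RMSSD
        np.mean(np.abs(diff) > 0.05),      # pNN50-like irregularity measure
    ])

def train_lvq1(X, y, n_epochs=30, lr=0.05):
    labels = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in labels])
    for _ in range(n_epochs):
        for x, t in zip(X, y):
            k = np.argmin(np.linalg.norm(protos - x, axis=1))
            sign = 1.0 if labels[k] == t else -1.0
            protos[k] += sign * lr * (x - protos[k])   # attract / repel prototype
    return protos, labels

def predict_lvq1(protos, labels, X):
    d = np.linalg.norm(X[:, None, :] - protos[None], axis=2)
    return labels[np.argmin(d, axis=1)]

# Toy feature matrix: 0 = normal sinus rhythm, 1 = AF (synthetic, for the demo only)
rng = np.random.default_rng(0)
X_windows = rng.normal(size=(200, 4))
y = (rng.random(200) > 0.5).astype(int)
X_windows[y == 1] += 1.0

Z = PCA(n_components=2).fit_transform(X_windows)
protos, labels = train_lvq1(Z, y)
print("training accuracy on toy data:", (predict_lvq1(protos, labels, Z) == y).mean())
```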

Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine

Procedia PDF Downloads 246
458 Multi-Impairment Compensation Based Deep Neural Networks for 16-QAM Coherent Optical Orthogonal Frequency Division Multiplexing System

Authors: Ying Han, Yuanxiang Chen, Yongtao Huang, Jia Fu, Kaile Li, Shangjing Lin, Jianguo Yu

Abstract:

In long-haul and high-speed optical transmission systems, the orthogonal frequency division multiplexing (OFDM) signal suffers various linear and non-linear impairments. In recent years, researchers have proposed compensation schemes for specific impairments, and the effects are remarkable. However, applying different impairment compensation algorithms has caused an increase in transmission delay. With the widespread application of deep neural networks (DNN) in communication, multi-impairment compensation based on a DNN is a promising scheme. In this paper, we propose and apply a DNN to compensate for multiple impairments of a 16-QAM coherent optical OFDM signal, thereby improving the performance of the transmission system. The trained DNN models are applied in the offline digital signal processing (DSP) module of the transmission system. The models can optimize the constellation mapping of signals at the transmitter and compensate for multiple impairments of the decoded OFDM signal at the receiver. Furthermore, the models reduce the peak-to-average power ratio (PAPR) of the transmitted OFDM signal and the bit error rate (BER) of the received signal. We verify the effectiveness of the proposed scheme for the 16-QAM coherent optical OFDM signal and demonstrate and analyze the transmission performance in different transmission scenarios. The experimental results show that the PAPR and BER of the transmission system are significantly reduced after using the trained DNN. This shows that a DNN with a specific loss function and network structure can optimize the transmitted signal, learn the channel features, and effectively compensate for multiple impairments in fiber transmission.
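
A conceptual sketch only: a small fully connected network that maps received (distorted) 16-QAM symbols back towards the transmitted constellation points. The network size, loss, training data, and distortion model below are placeholders, not the architecture or channel reported in the paper.

```python
import numpy as np
import tensorflow as tf

# Toy data: received I/Q pairs -> transmitted I/Q pairs (one subcarrier per sample)
rng = np.random.default_rng(0)
tx = rng.choice([-3, -1, 1, 3], size=(50_000, 2)).astype("float32")   # 16-QAM grid
rx = (tx + 0.3 * rng.normal(size=tx.shape).astype("float32")
         + 0.05 * tx ** 3)                        # crude linear + nonlinear distortion

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2),                     # estimated transmitted I/Q
])
model.compile(optimizer="adam", loss="mse")
model.fit(rx, tx, epochs=5, batch_size=256, validation_split=0.1, verbose=0)

# At the receiver DSP stage, the trained model equalizes new received symbols
equalized = model.predict(rx[:4], verbose=0)
print(np.round(equalized, 2))
```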

Keywords: coherent optical OFDM, deep neural network, multi-impairment compensation, optical transmission

Procedia PDF Downloads 124
457 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis

Authors: Wenbo Du, Xiaomei Ma

Abstract:

With the empowerment of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnosis, with very few studies concerned with learners’ group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA always results in a “flat pattern”, that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome offers little benefit beyond the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners’ performance at different proficiency levels (beginner, intermediate, and advanced), and thus to enhance the interpretation of the CDA results extracted from a group of EFL learners’ reading performance on a diagnostic reading test designed by the PELDiaG research team at a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA does, and that quantile regression analysis paints a more insightful picture of learners with different reading proficiencies. The findings are helpful and practical for instructors in refining the EFL reading curriculum and in tailoring instructional plans based on the group classification results and the quantile regression analysis. Meanwhile, these innovative statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
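
An illustrative sketch of the two analyses named above: EM-based clustering of learners into proficiency groups and quantile regression at several quantiles. The data file, column names, and number of sub-skills are hypothetical placeholders, not the PELDiaG test design.

```python
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.mixture import GaussianMixture

df = pd.read_csv("diagnostic_reading_scores.csv")   # one row per learner (illustrative)
skill_cols = ["skill1", "skill2", "skill3", "skill4", "skill5"]

# (a) EM clustering (Gaussian mixture) into beginner / intermediate / advanced groups
gmm = GaussianMixture(n_components=3, random_state=0)
df["group"] = gmm.fit_predict(df[skill_cols])

# (b) Quantile regression: how a sub-skill relates to total score at each level
for q in (0.25, 0.50, 0.75):
    fit = smf.quantreg("total_score ~ skill1 + skill2", df).fit(q=q)
    print(f"q={q}: skill1 coefficient = {fit.params['skill1']:.3f}")
```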

Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression

Procedia PDF Downloads 133
456 Hyperelastic Constitutive Modelling of the Male Pelvic System to Understand Prostate Motion, Deformation and Neoplasm Location under the Influence of MRI-TRUS Fusion Biopsy

Authors: Muhammad Qasim, Dolors Puigjaner, Josep Maria López, Joan Herrero, Carme Olivé, Gerard Fortuny

Abstract:

Computational modeling of the human pelvis using the finite element (FE) method has become extremely important for understanding the mechanics of prostate motion and deformation when a transrectal ultrasound (TRUS) guided biopsy is performed. The number of reliable and validated hyperelastic constitutive FE models of the male pelvic region is limited, and the existing models do not precisely describe the anatomical behavior of the pelvic organs, mainly the prostate and the location of its neoplasms. The motion and deformation of the prostate during TRUS-guided biopsy make it difficult to know the location of potential lesions in advance. When using this procedure, practitioners can only provide rough estimates of the lesion locations. Consequently, multiple biopsy samples are required to target one single lesion. In this study, a whole-pelvis model (comprising the rectum, bladder, pelvic muscles, prostate transitional zone (TZ), and peripheral zone (PZ)) is used for the simulations. An isotropic hyperelastic approach (the Signorini model) was used for all the soft tissues except the vesical muscles. The vesical muscles are assumed to have a linear elastic behavior because of the lack of experimental data to determine the constants involved in hyperelastic models. The tissue and organ geometry for the 3D meshes is taken from the existing literature. The biomechanical parameters were then obtained from different testing techniques described in the literature. The acquired parametric values for uniaxial stress/strain data are used in the Signorini model to capture the anatomical behavior of the pelvis model. Five mesh nodes representing small prostate lesions are selected prior to biopsy, and each lesion’s final position is tracked when a TRUS probe force of 30 N is applied to the inner rectum wall. The open-source software Code_Aster is used for the numerical simulations. Moreover, the overall effects of pelvic organ deformation when a TRUS-guided biopsy is induced are demonstrated. The prostate deformation and neoplasm displacement showed that assigning appropriate material properties to the organs parametrically altered the resulting lesion migration. As a result, the distance traveled by these lesions ranged between 3.77 and 9.42 mm. The lesion displacement and organ deformation are compared with and analyzed against our previous study, in which we used linear elastic properties for all pelvic organs. Furthermore, axial and sagittal slices taken from Magnetic Resonance Imaging (MRI) and TRUS images are also visually compared with our preliminary study.
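
For reference, the Signorini hyperelastic potential is commonly quoted in terms of the first and second (isochoric) strain invariants of the right Cauchy-Green tensor, with C10, C01, and C20 as the material constants fitted from uniaxial stress/strain data; the exact parameterization used by the authors and by Code_Aster's hyperelastic material option should be taken as an assumption here.

```latex
% Signorini strain-energy density (incompressible part), commonly written as
W = C_{10}\,\bigl(\bar{I}_1 - 3\bigr) + C_{01}\,\bigl(\bar{I}_2 - 3\bigr)
    + C_{20}\,\bigl(\bar{I}_1 - 3\bigr)^2
```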

Keywords: code-aster, magnetic resonance imaging, neoplasms, transrectal ultrasound, TRUS-guided biopsy

Procedia PDF Downloads 70
455 Recurrent Neural Networks for Complex Survival Models

Authors: Pius Marthin, Nihal Ata Tutkun

Abstract:

Survival analysis has become one of the paramount procedures in the modeling of time-to-event data. When we encounter complex survival problems, the traditional approach remains limited in accounting for the complex correlational structure between the covariates and the outcome because of strong assumptions that limit the inference and prediction ability of the resulting models. Several studies exist on deep learning approaches to survival modeling; however, their application to complex survival problems still needs improvement. In addition, the existing models do not fully address the complexity of the data structure and are subject to noise and redundant information. In this study, we design a deep learning technique (CmpXRnnSurv_AE) that removes the limitations imposed by traditional approaches and addresses the above issues to jointly predict the risk-specific probabilities and the survival function for recurrent events with competing risks. We introduce a component termed Risks Information Weights (RIW) as an attention mechanism to compute the weighted cumulative incidence function (WCIF), and an external auto-encoder (ExternalAE) as a feature selector to extract complex characteristics among the set of covariates responsible for the cause-specific events. We train our model using synthetic and real data sets and employ the appropriate metrics for complex survival models for evaluation. As benchmarks, we selected both traditional and machine learning models, and our model demonstrates better performance across all datasets.
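
A highly simplified sketch of the general idea of a discrete-time, recurrent competing-risks model: an LSTM over longitudinal covariates outputs, at each time step, probabilities for "no event" and K competing causes, from which a cumulative incidence curve can be accumulated. The RIW attention and external auto-encoder components of CmpXRnnSurv_AE are not reproduced here, and all shapes and data are synthetic placeholders.

```python
import numpy as np
import tensorflow as tf

T, D, K = 20, 8, 2                      # time steps, covariates, competing causes
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(T, D)),
    tf.keras.layers.LSTM(32, return_sequences=True),
    # softmax over {event-free, cause 1, ..., cause K} at every time step
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(K + 1, activation="softmax")),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Toy longitudinal covariates and one-hot per-step labels (synthetic)
rng = np.random.default_rng(0)
X = rng.normal(size=(256, T, D)).astype("float32")
y = tf.keras.utils.to_categorical(rng.integers(0, K + 1, size=(256, T)), K + 1)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

# Cumulative incidence for cause 1: sum of per-step hazard times prior event-free survival
hazards = model.predict(X[:1], verbose=0)[0]        # shape (T, K+1)
event_free = np.cumprod(np.r_[1.0, hazards[:-1, 0]])
cif_cause1 = np.cumsum(hazards[:, 1] * event_free)
print(cif_cause1[-1])
```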

Keywords: cumulative incidence function (CIF), risk information weight (RIW), autoencoders (AE), survival analysis, recurrent events with competing risks, recurrent neural networks (RNN), long short-term memory (LSTM), self-attention, multilayer perceptrons (MLPs)

Procedia PDF Downloads 69
454 HyDUS Project: Seeking a Wonder Material for Hydrogen Storage

Authors: Monica Jong, Antonios Banos, Tom Scott, Chris Webster, David Fletcher

Abstract:

Hydrogen, as a clean alternative to methane, is relatively easy to make, either from water using electrolysis or from methane using steam reformation. However, hydrogen is much trickier to store than methane, and without effective storage, it simply won’t pass muster as a suitable methane substitute. Physical storage of hydrogen is quite inefficient. Storing hydrogen as a compressed gas at pressures up to 900 times atmospheric is volumetrically inefficient and carries safety implications, whilst storing it as a liquid requires costly and constant cryogenic cooling to minus 253°C. This is where depleted uranium (DU) steps in as a possible solution. Across the periodic table, there are many different metallic elements that will react with hydrogen to form a chemical compound known as a hydride (or metal hydride). From a chemical perspective, the ‘king’ of the hydride-forming metals is palladium because it offers the highest hydrogen storage volumetric capacity. However, this material is simply too expensive and scarce to be used in a scaled-up bulk hydrogen storage solution. Depleted uranium is the second most volumetrically efficient hydride-forming metal after palladium. The UK has accrued a significant amount of DU from manufacturing nuclear fuel over many decades, and this material is currently without real commercial use. Uranium trihydride (UH3) contains three hydrogen atoms for every uranium atom and can chemically store hydrogen at ambient pressure and temperature at more than twice the density of pure liquid hydrogen for the same volume. To release the hydrogen from the hydride, all you do is heat it up. At temperatures above 250°C, the hydride starts to thermally decompose, releasing hydrogen as a gas and leaving the uranium as a metal again. The reversible nature of this reaction allows the hydride to be formed and unformed again and again, enabling its use as a high-density hydrogen storage material which is already available in large quantities as a stockpiled ‘waste’ by-product. Whilst the tritium storage credentials of uranium have been rigorously proven at the laboratory scale and at the fusion demonstrator JET for over 30 years, there is a need to prove the concept of depleted uranium hydrogen storage (HyDUS) at scales approaching those needed to flexibly supply our national power grid with energy. This is exactly the purpose of the HyDUS project, a collaborative venture involving EDF as the interested energy vendor, Urenco as the owner of the waste DU, and the University of Bristol with the UKAEA as the architects of the technology. The team will embark on building and proving the world’s first pilot-scale demonstrator of bulk chemical hydrogen storage using depleted uranium. Within 24 months, the team will attempt to prove both the technical and commercial viability of this technology as a longer-duration energy storage solution for the UK. The HyDUS project seeks to enable a true by-product-to-wonder-material story for depleted uranium, demonstrating that we can think sustainably about unlocking the potential value trapped inside nuclear waste materials.
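
The underlying chemistry is the reversible hydriding reaction below; heating the hydride above roughly 250°C, as noted above, drives the reaction back to the left and releases the stored hydrogen.

```latex
% Reversible formation and thermal decomposition of uranium trihydride
2\,\mathrm{U} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{UH_3}
```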

Keywords: hydrogen, long duration storage, storage, depleted uranium, HyDUS

Procedia PDF Downloads 129
453 Lexical Features and Motivations of Product Reviews on Selected Philippine Online Shops

Authors: Jimmylen Tonio, Ali Anudin, Rochelle Irene G. Lucas

Abstract:

Alongside the progress of electronic-business websites, consumers have become more comfortable with online shopping. It has become customary for consumers, prior to purchasing a product or availing of a service, to consult online review information as a basis for evaluating and deciding whether or not to push through with the purchase. Subsequently, after purchasing, consumers tend to post their own comments on the product on the same e-business websites. Because of this, product reviews (PRs) have become an indispensable feature of online businesses, equally beneficial for business owners and consumers. This study explored the linguistic features and motivations of online product reviews on selected Philippine online shops, LAZADA and SHOPEE. Specifically, it looked into the lexical features of the PRs, the factors that motivated consumers to write the product reviews, and the differences in lexical preferences between male and female reviewers. The findings revealed the following: 1. The formality of words in online product reviews primarily involves non-standard spelling, followed by abbreviated word forms, colloquial contractions, and the use of coined/novel words; 2. Paralinguistic features in online product reviews are dominated by the use of emoticons, capital letters, and punctuation, followed by the use of pictures/photos and, lastly, by paralinguistic expressions; 3. The factors that motivate consumers to write product reviews varied: online product reviewers are predominantly driven by the motivation of venting negative feelings, followed by helping the company, helping other consumers, positive self-enhancement, advice seeking, and, lastly, social benefits; and 4. Gender affects the word frequencies of online product reviews, while the negation words, personal pronouns, formality of words, and paralinguistic features used by male and female online product reviewers do not differ.

Keywords: lexical choices, motivation, online shop, product reviews

Procedia PDF Downloads 132
452 The Feasibility Evaluation of the Compressed Air Energy Storage System in the Porous Media Reservoir

Authors: Ming-Hong Chen

Abstract:

In this study, the mechanical and financial feasibility of a compressed air energy storage (CAES) system in a porous media reservoir in Taiwan is evaluated. By 2035, Taiwan aims to install 16.7 GW of wind power and 40 GW of photovoltaic (PV) capacity. However, renewable energy sources often generate more electricity than needed, particularly during winter. Consequently, Taiwan requires long-term, large-scale energy storage systems to ensure the security and stability of its power grid. Currently, the primary large-scale energy storage options are Pumped Hydro Storage (PHS) and Compressed Air Energy Storage (CAES). Taiwan has not ventured into CAES-related technologies because of geological and cost constraints. However, with the imperative of achieving net-zero carbon emissions by 2050, there is a substantial need for the development of a considerable amount of renewable energy. PHS has matured, boasting an overall installed capacity of 4.68 GW. CAES, offering a similar scale and power generation duration to PHS, is now under consideration. Taiwan's geological setting, being a porous medium unlike salt caverns, introduces flow-field resistance that affects gas injection and extraction. This study employs a program analysis model to establish the system performance analysis capabilities of CAES. A finite volume model is then used to assess the impact of the porous media, and the findings are fed back into the system performance analysis for correction. Subsequently, the financial implications are calculated and compared with the existing literature. For Taiwan, the strategic development of CAES technology is crucial, not only for meeting energy needs but also for decentralizing energy allocation, a feature of great significance in regions lacking alternative natural resources.

Keywords: compressed-air energy storage, efficiency, porous media, financial feasibility

Procedia PDF Downloads 51
451 Study on Electromagnetic Plasma Acceleration Using Rotating Magnetic Field Scheme

Authors: Takeru Furuawa, Kohei Takizawa, Daisuke Kuwahara, Shunjiro Shinohara

Abstract:

In the field of space propulsion, electric propulsion systems have been developed because their fuel efficiency is much higher than that of conventional chemical systems. However, practical electric propulsion systems, e.g., ion engines, have the problem of a short lifetime due to damage to the plasma generation and acceleration electrodes. A helicon plasma thruster is proposed as a long-lifetime electric thruster whose electrodes are not in direct contact with the plasma. In this system, both the generation and the acceleration of a dense plasma are performed by antennas from outside the discharge tube. Development of the helicon plasma thruster has been conducted under the Helicon Electrodeless Advanced Thruster (HEAT) project. Our helicon plasma thruster involves two important processes. First, we generate a dense source plasma using a helicon wave with an excitation frequency between the ion and electron cyclotron frequencies, fci and fce, respectively, applied from outside the discharge using a radio frequency (RF) antenna. The helicon plasma source can provide a high density (~10¹⁹ m⁻³), a high ionization ratio (up to several tens of percent), and a high particle generation efficiency. Second, in order to achieve high thrust and specific impulse, we accelerate the dense plasma by the axial Lorentz force fz given by the product of the induced azimuthal current jθ and the static radial magnetic field Br, written as fz = jθ × Br. The HEAT project has proposed several kinds of electrodeless acceleration schemes, and in our particular case, a Rotating Magnetic Field (RMF) method has been extensively studied. The RMF scheme was originally developed as a concept for maintaining the Field Reversed Configuration (FRC) in magnetically confined fusion research. Here, the RMF coils are expected to generate jθ through a nonlinear effect, as follows. First, the rotating magnetic field Bω is generated by two pairs of RMF coils carrying AC currents with a phase difference of 90 degrees between the pairs. Due to Faraday’s law, an axial electric field is induced. Second, an axial current is generated by the effects of electron–ion and electron–neutral collisions through Ohm’s law. Third, the azimuthal electric field is generated by the nonlinear term, and the retarding torque is again generated by the collision effects. Then, the azimuthal current jθ is generated as jθ = −nₑ e r · 2π fRMF. Finally, the axial Lorentz force fz for plasma acceleration is generated. Here, jθ is proportional to nₑ and to the RMF coil current frequency fRMF when Bω fully penetrates the plasma. Our previous study achieved a 19% increase in ion velocity using 5 MHz and 50 A from the RMF coil power supply. In this presentation, we will show the improvement in ion velocity obtained using a lower frequency and a higher current supplied by the RMF power supply. In conclusion, helicon high-density plasma production and electromagnetic acceleration by the RMF scheme, based on the concept of an electrodeless configuration, have been successfully executed.
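
Restating the acceleration relations from the abstract in conventional notation (this is simply a typeset version of the expressions given above, not an additional derivation):

```latex
% Axial Lorentz force from the induced azimuthal current and static radial field
f_z = j_\theta \, B_r, \qquad
% Azimuthal current driven by electrons co-rotating with the RMF
j_\theta = -\, n_e\, e\, r \,\bigl(2\pi f_{\mathrm{RMF}}\bigr)
```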

Keywords: electric propulsion, electrodeless thruster, helicon plasma, rotating magnetic field

Procedia PDF Downloads 244
450 Histological and Ultrastructural Study on the Effect

Authors: Olfat Mohamed Hussien Yousef

Abstract:

Tamoxifen (TM) is a synthetic non-steroidal antiestrogen. It is one of the most effective drugs for the treatment of estrogen-dependent cancer, acting by binding to estrogen receptors, suppressing epithelial proliferation, and serving as a chemotherapeutic agent. Recently, more attention has been paid to the protective effects of natural antioxidants against toxicities induced by anti-cancer drugs that involve free radical-mediated oxidative stress and tissue injury. Vitamin C is a potent antioxidant that has the ability to scavenge factors causing free radical formation in animals receiving tamoxifen. The present study aims to pinpoint the TM-induced histopathological and ultrastructural changes in the kidneys and to assess the possible chemoprotective role of vitamin C against such TM-induced microscopic changes. Thirty adult male CD-1 mice, 25-30 g in weight and 3 months old, were divided into three groups. The first group served as the control. The second group received the therapeutic dose of TM at a daily oral dose of 40 mg/kg body weight for 28 days. The third group received the therapeutic dose of vitamin C at a daily dose of 500 mg/kg body weight simultaneously with the therapeutic dose of TM used in group two for 28 days. Animals were sacrificed, and kidney samples were obtained and processed for histological and ultrastructural examination. Histological changes induced by TM included damage to the renal corpuscles, including obliteration of the subcapsular space, congestion of the glomerular blood capillaries, segmental mesangial cell proliferation with matrix expansion, and capsular adhesions with the glomerular tuft, especially at the urinary pole of the corpuscles. Moreover, some proximal and distal tubules suffered various degrees of degeneration in some lining cells. Haemorrhage and inflammatory cell infiltration were also observed in the intertubular spaces. Ultrastructural observations revealed damage to the parietal epithelium of Bowman’s capsule, fusion and destruction of the foot processes of podocytes, and a marked increase in mesangial cells and mesangial matrix. The cells of the proximal convoluted tubules displayed marked destruction of the microvilli constituting the brush borders and degeneration of the mitochondria; in addition, abundant lysosomes, numerous vacuoles, and pyknotic nuclei were observed. The distal convoluted tubules displayed marked destruction of both the basal infoldings and the mitochondria in some areas. The histological and ultrastructural results revealed that treatment of male mice with TM simultaneously with vitamin C led to apparent repair of the injured renal tissue. This might suggest that vitamin C (an antioxidant agent) can minimize the toxic effects of TM (an antiestrogen).

Keywords: tamoxifen, vitamin c, mammalian kidney, histology, ultrastructure

Procedia PDF Downloads 362
449 The Grievances Theory versus Transnationalism and the Cameroon Anglophone Question, 1961-2017

Authors: Nkatow Mafany Christian

Abstract:

No other period in human history has offered such great opportunities for grievances not only to last long but also to be manifested across international boundaries; this state of affairs is largely a feature of the advent of social media. The Anglophone Question in Cameroon has been a problem of poor constitutional arrangements that can be traced to 1961, when the former French Cameroon reunified with the former British Southern Cameroons following a plebiscite in which the latter overwhelmingly voted to reunify with the former. Though Southern/Anglophone Cameroonians complained of perceived marginalization and an attempt by the majority French section to assimilate them, the manifestation was subtle and took place only through protests, petitions, strike movements, and demonstrations. However, with the advent of social media, a new crop of leaders emerged in the diaspora, including the US, Canada, Europe, Asia, and the Middle East, to champion the manifestations, leading to the violence and conflicts that have bedeviled the region since 2017. The feeling of political subjugation, economic exploitation, social suppression, and cultural assimilation among Anglophone Cameroonians united them under diaspora leaders against the government of Cameroon, calling for the creation of a separate state for Anglophones. This paper draws on this lead-up to analyze the current Anglophone Crisis in Cameroon in the light of Grievance Theory and transnationalism. The paper appeals to field experience, interviews, official sources, documentation, and the internet to support its central thesis. From the richness of its sources, the paper submits that social media is a potent source of conflict and that it makes a nonsense of the principles of sovereignty and territorial integrity through its capacity to promote the transnational manifestation of grievances.

Keywords: grievance, transnationalism, anglophone crisis, Cameroon, crisis and social media

Procedia PDF Downloads 49
448 Water Quality Calculation and Management System

Authors: H. M. B. N Jayasinghe

Abstract:

Water is found almost everywhere on Earth, and water resources contain a great deal of pollution. Some diseases can be spread through water to living beings, so water must undergo a number of treatments to make it drinkable. Purification technology for wastewater is therefore essential, and wastewater treatment plants play a major role in this. The procedures that follow the water treatment process have traditionally been based on manual calculations and recordings. Water purification plants involve many manual processes, which makes the whole procedure time-consuming, so the final evaluation and the chemical and biological treatment processes are delayed. To prevent these drawbacks, computerized, programmable calculation and analytical techniques are to be introduced to the laboratory staff. An automated system is a solution that guarantees rational selection. A decision support system is a way to model data and make quality decisions based upon it, and it is widely used around the world for various kinds of process automation. Decision support systems that just collect data and organize it effectively are usually called passive models: they do not suggest a specific decision but only reveal information. This web-based system includes a facility for adding global positioning data with map locations. Its most valuable feature is an SMS and e-mail alert service that informs the appropriate person of a critical issue. The technologies behind the system are HTML, MySQL, PHP, and some other web development technologies. Existing computerized water chemistry analysis tools are not far advanced; one example is the swimming pool water quality calculator. The validity of the system has been verified by test runs and comparison with existing plant data. The automated system will make the work easier, both productively and qualitatively.
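
A minimal sketch of the alerting idea described above: compare measured water-quality parameters against permissible limits and notify the responsible person when a limit is exceeded. It is written in Python rather than the PHP used by the system, and the limits, parameter names, and alert hooks are hypothetical placeholders, not the deployed implementation or any official standard.

```python
PERMISSIBLE_LIMITS = {          # illustrative limits only, not official standards
    "pH": (6.5, 8.5),
    "turbidity_NTU": (0.0, 5.0),
    "residual_chlorine_mg_L": (0.2, 1.0),
}

def check_sample(sample: dict) -> list[str]:
    """Return a list of human-readable violations for one lab sample."""
    violations = []
    for name, (low, high) in PERMISSIBLE_LIMITS.items():
        value = sample.get(name)
        if value is not None and not (low <= value <= high):
            violations.append(f"{name} = {value} outside [{low}, {high}]")
    return violations

def alert_if_critical(sample: dict, contact: str) -> None:
    violations = check_sample(sample)
    if violations:
        # In the real portal this would call the SMS / e-mail alert service;
        # here we only print the message.
        print(f"ALERT to {contact}: " + "; ".join(violations))

alert_if_critical({"pH": 9.1, "turbidity_NTU": 2.0}, "plant.chemist@example.com")
```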

Keywords: automated system, wastewater, purification technology, map location

Procedia PDF Downloads 231
447 Analysis of Solid Waste Management Practices and the Implications for Human Health and the Environment: A Case Study of Kayamandi Informal Settlement

Authors: Peter Iyobosa Asemota

Abstract:

This study on solid waste management practices addressed aspects of the environmental and health impacts resulting from poor management of solid waste. The study was occasioned by the observed rate and volume of illegal and indiscriminate dumping of solid waste materials, especially in informal settlements. The main focus of this study was to establish the impact of waste management practices on human health and the environment. The study, therefore, presents a critical analysis of the state of solid waste management in the study area and the implications for human health and the environment. The study was carried out in the Kayamandi informal settlement within Stellenbosch municipality. The sustainable management of solid waste is very important in order to minimize the environmental and public health risks associated with improper solid waste management. There is no denying that the problems of waste management will become critical as time goes on because of improper and inefficient waste management practices. Towns and cities exhibit the burdens of waste management, which is a characteristic feature of most African cities. The study critically assesses the implementation of waste management practices by the residents of the informal settlement; identifies the factors affecting the municipality's operation of the solid waste management system; and identifies factors militating against the implementation of waste management policies and legislation. Furthermore, a waste assessment study was carried out to assess the generation and composition of the waste stream and to determine the attitudes and behavior of the residents with regard to waste management practices. Findings from the study revealed that Kayamandi is not different from other informal settlements with regard to waste management. People are of the opinion that solid waste management is the sole responsibility of municipal authorities, and as such, the government should be responsible for bearing the cost of solid waste management.

Keywords: environment, waste, waste composition, waste stream, policy, waste categories, sanitary landfill, waste collection, integrated solid waste management

Procedia PDF Downloads 670
446 Spatio-Temporal Analysis of Drought in Cholistan Region, Pakistan: An Application of Standardized Precipitation Index

Authors: Qurratulain Safdar

Abstract:

Drought is a temporary aberration, in contrast to aridity, which is a permanent feature of climate. It occurs in virtually all types of climatic regions, from high- to low-rainfall areas. Due to the wide latitudinal extent of Pakistan, there is seasonal and annual variability in rainfall, and the south-central part of the country is arid and hyper-arid. This study focuses on the spatio-temporal analysis of droughts in the arid and hyper-arid region of Cholistan using the standardized precipitation index (SPI) approach. The study assessed the extent of drought recurrence and the temporal vulnerability to drought in the Cholistan region. Initially, the paper describes the geographic setting of the study area along with a brief description of the drought conditions that prevail in Pakistan. The study also provides a scientific foundation by preparing a literature review and theoretical framework in line with the selected parameters and indicators. Data were collected from both primary and secondary sources; rainfall and temperature data were obtained from the Pakistan Meteorological Department. By applying a geostatistical approach, the standardized precipitation index (SPI) was calculated for the study region, and the spatio-temporal variability of drought and its severity were explored. This yielded an in-depth spatial analysis of drought conditions in the Cholistan area. In parallel, drought-prone areas with seasonal variation were identified using Kriging spatial interpolation techniques in a GIS environment. The study revealed temporal variation in drought occurrence in both the time series and the SPI values. Finally, a strategic plan is suggested to minimize the impacts of drought.
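
A sketch of a standard SPI computation for a single station: fit a gamma distribution to the precipitation record, form the cumulative probabilities, and transform them to standard-normal quantiles. The handling of zero-rainfall months is simplified, and the monthly values below are invented for the demonstration.

```python
import numpy as np
from scipy import stats

def spi(precip: np.ndarray) -> np.ndarray:
    """Standardized precipitation index for a 1-D precipitation series."""
    precip = np.asarray(precip, dtype=float)
    nonzero = precip[precip > 0]
    p_zero = 1.0 - nonzero.size / precip.size          # probability of a dry month
    shape, _, scale = stats.gamma.fit(nonzero, floc=0) # gamma fit to wet months
    cdf = p_zero + (1.0 - p_zero) * stats.gamma.cdf(precip, shape, loc=0, scale=scale)
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)                 # avoid infinite quantiles
    return stats.norm.ppf(cdf)                         # SPI values

monthly_rain_mm = np.array([0, 2, 15, 40, 5, 0, 0, 1, 12, 60, 30, 8], dtype=float)
print(np.round(spi(monthly_rain_mm), 2))               # negative values = drier than normal
```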

Keywords: Cholistan desert, climate anomalies, metrological droughts, standardized precipitation index

Procedia PDF Downloads 182
445 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification

Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine

Abstract:

Agriculture is essential to the continuous existence of human life, as humans depend directly on it for the production of food. The exponential rise in population calls for a rapid increase in food production, with the application of technology to reduce laborious work and maximize production. Technology can aid and improve agriculture in several ways, from pre-planning to post-harvest, by using computer vision and image processing to determine soil nutrient composition, to apply farm input resources such as fertilizers, herbicides, and water in the right amount, at the right time, and in the right place, and to detect weeds, pests, and diseases early. This is precision agriculture, which is thought to be the solution required to achieve these goals. There has been significant improvement in the areas of image processing and data processing, which has been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of images from vegetation need to be extracted, classified, segmented, and finally fed into the model. Different techniques have been applied to these processes, from neural networks, support vector machines, and fuzzy logic approaches to, most recently, the deep learning approach of convolutional neural networks, which generates excellent results for image classification. A deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The experimental results on the developed model yielded an average accuracy of 99.58%.
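
A minimal sketch of a convolutional network for soil-nutrient image classification of the kind described above. The input size, number of nutrient classes, and layer sizes are assumptions for illustration; the study's actual architecture and dataset are not reproduced.

```python
import tensorflow as tf

NUM_CLASSES = 4          # e.g., nitrogen-, phosphorus-, potassium-deficient, balanced (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Training would use labelled soil/vegetation image patches, e.g. loaded with
# tf.keras.utils.image_dataset_from_directory("soil_images/", image_size=(128, 128),
#                                             label_mode="categorical")
model.summary()
```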

Keywords: convolution, feature extraction, image analysis, validation, precision agriculture

Procedia PDF Downloads 295
444 Numerical Analysis of Laminar Reflux Condensation from Gas-Vapour Mixtures in Vertical Parallel Plate Channels

Authors: Foad Hassaninejadafarahani, Scott Ormiston

Abstract:

Reflux condensation occurs in vertical channels and tubes when there is an upward core flow of vapor (or a gas-vapor mixture) and a downward flow of the liquid film. Understanding this condensation configuration is crucial in the design of reflux condensers and distillation columns, and in loss-of-coolant safety analyses in nuclear power plant steam generators. The unique feature of this flow is the upward flow of the vapor-gas mixture (or pure vapor), which retards the liquid flow via shear at the liquid-mixture interface. The present model solves the full, elliptic governing equations in both the film and the gas-vapor core flow. The computational mesh is non-orthogonal and adapts dynamically to the phase interface, thus producing a sharp and accurate interface. Shear forces and heat and mass transfer at the interface are accounted for fundamentally. This modeling is a big step ahead of current capabilities, as it removes the limitations of previous reflux condensation models, which inherently cannot account for the detailed local balances of shear, mass, and heat transfer at the interface. Discretisation is based on a finite volume method and a co-located variable storage scheme. An in-house computer code was developed to implement the numerical solution scheme. Detailed results are presented for laminar reflux condensation from steam-air mixtures flowing in vertical parallel plate channels. The results include velocity and pressure profiles, as well as axial variations of film thickness, Nusselt number, and interface gas mass fraction.

Keywords: reflux, condensation, CFD two-phase, Nusselt number

Procedia PDF Downloads 349
443 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for Old Castle in Germany

Authors: Bara' Al-Mistarehi

Abstract:

Documentation and assessment of the conservation state of an archaeological structure is a significant procedure in any management plan. However, it has always been a challenge to do this with a low-cost and safe methodology, and it is also a time-demanding procedure. Therefore, a low-cost, efficient methodology for documenting the state of a structure is needed. Within the scope of this research, this paper applies digital photogrammetry and laser scanning to one of the most significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features. However, the castle suffers from serious deterioration threats because of the environmental conditions and the absence of continuous monitoring, maintenance, and repair plans. Digital photogrammetry is a generally accepted technique for the collection of 3D representations of the environment. For this reason, this image-based technique has been extensively used to produce high-quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run-time of reflected light pulses. These systems feature high data acquisition rates, good accuracy, and high spatial data density. Despite the potential of each single approach, the maximum benefit in this research work is expected from a combination of data from both digital cameras and terrestrial laser scanners. Within the paper, the usage, application, and advantages of the technique are investigated in terms of building a highly realistic 3D textured model for some parts of the Old Castle. The model will be used as a diagnostic tool for the conservation state of the castle and as a means of monitoring future changes.

Keywords: digital photogrammetry, terrestrial laser scanners, 3D textured model, archaeological structure

Procedia PDF Downloads 159
442 Thick Data Techniques for Identifying Abnormality in Video Frames for Wireless Capsule Endoscopy

Authors: Jinan Fiaidhi, Sabah Mohammed, Petros Zezos

Abstract:

Capsule endoscopy (CE) is an established noninvasive diagnostic modality for investigating small bowel disease. CE has a pivotal role in assessing patients with suspected bleeding or in identifying evidence of active Crohn's disease in the small bowel. However, CE produces lengthy videos of at least eighty thousand frames, at a rate of 2 frames per second. Gastroenterologists cannot dedicate 8 to 15 hours to reading the CE video frames to arrive at a diagnosis. This is why analyzing CE videos with modern artificial intelligence techniques becomes a necessity. However, machine learning, including deep learning, has failed to report robust results because of the lack of large samples to train its neural nets. In this paper, we describe a thick data approach that learns from a few anchor images. We use sound datasets like KVASIR and CrohnIPI to filter candidate frames that include interesting anomalies in any CE video. We identify candidate frames based on feature extraction to provide representative measures of the anomaly, like the size of the anomaly and its color contrast compared to the image background, and later feed these features to a decision tree that can classify the candidate frames as showing a condition like Crohn's disease. Our thick data approach achieved an accuracy in detecting Crohn's disease, based on the presence of ulcer areas in the candidate frames, of 89.9% for KVASIR and 83.3% for CrohnIPI. We are continuing our research to fine-tune our approach by adding more thick data methods to enhance diagnosis accuracy.
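
A simplified sketch of the general idea: hand-crafted features per frame (anomaly size and colour contrast against the background) fed to a decision tree. The thresholds, features, and toy frames below are illustrative assumptions, not the study's exact pipeline or data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def frame_features(frame_rgb: np.ndarray) -> np.ndarray:
    """frame_rgb: (H, W, 3) uint8 capsule-endoscopy frame -> [anomaly area, contrast]."""
    red = frame_rgb[..., 0].astype(float)
    mask = red > red.mean() + 2 * red.std()            # crude "anomaly" candidate mask
    anomaly_area = mask.mean()                          # fraction of the frame flagged
    contrast = (red[mask].mean() - red[~mask].mean()) if mask.any() and (~mask).any() else 0.0
    return np.array([anomaly_area, contrast])

# X: features for a few anchor frames; y: 1 = ulcer/Crohn's-like finding, 0 = normal
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(40, 64, 64, 3), dtype=np.uint8)   # synthetic frames
X = np.array([frame_features(f) for f in frames])
y = (X[:, 0] > np.median(X[:, 0])).astype(int)          # toy labels for the demo only

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print("training accuracy on toy frames:", clf.score(X, y))
```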

Keywords: thick data analytics, capsule endoscopy, Crohn’s disease, siamese neural network, decision tree

Procedia PDF Downloads 126
441 Analysis of Extracellular Vesicle Interactomes of Two Isoforms of Tau Protein via SHSY-5Y Cell Lines

Authors: Mohammad Aladwan

Abstract:

Alzheimer’s disease (AD) is a widespread dementing illness with a complex and poorly understood etiology. An important contribution to improving our understanding of the AD process is the modeling of disease-associated changes in the phosphorylation of tau, a protein known to mediate events essential to the onset and progression of AD. A main feature of AD is the abnormal phosphorylation of tau protein and the presence of neurofibrillary tangles. In order to evaluate the respective roles of the microtubule-binding region (MTBR) and the alternatively spliced exons in the N-terminal projection domain in AD, we have constructed SHSY-5Y cell lines that stably overexpress four different species of tau protein (4R2N, 4R0N, N(E-2), N(E+2)). Since the toxicity and spreading of tau lesions in AD depend on the interactions of tau with other proteins, we performed a proteomic analysis of exosome-fraction interactomes for cell lysates and media samples isolated from the SHSY-5Y cell lines. Functional analysis of the tau interactomes based on gene ontology (GO) terms was performed using the String 10.5 database program. The highest numbers of exosome proteome proteins and tau-associated proteins were found with the 4R2N isoform in the cell lysate (2771 and 159, respectively), with a high strength of connectivity (78%) between proteins, while in the media proteomes the N(E-2) isoform had the highest numbers of proteins and tau-associated proteins (1829 and 205). Moreover, known AD markers were significantly enriched in the secreted interactomes relative to the lysate interactomes in the SHSY-5Y cells expressing tau isoforms lacking exons 2 and 3 in the N-terminus. The lack of exon 2 (E-2) in tau protein may thus be linked to tau secretion and spreading to different cells. Enriched functions in the secreted E-2 interactome include signaling and developmental pathways that have been linked to (a) tau misprocessing and lesion development and (b) tau secretion, and which, therefore, could play novel roles in AD pathogenesis.

Keywords: Alzheimer's disease, dementia, tau protein, neurodegenerative disease

Procedia PDF Downloads 77
440 A Novel Methodology for Browser Forensics to Retrieve Searched Keywords from Windows 10 Physical Memory Dump

Authors: Dija Sulekha

Abstract:

Nowadays, a good percentage of reported cybercrimes involve the use of the Internet, directly or indirectly, in committing the crime. Usually, web browsers leave traces of browsing activities on the host computer’s hard disk, which can be used by investigators to identify the Internet-based activities of the suspect. But criminals involved in organized crime disable browser file generation features to hide the evidence while carrying out illegal activities through the Internet. In such cases, even though browser files are not generated on the storage media of the system, traces of recent and ongoing activities are still generated in the physical memory of the system. As a result, the analysis of a physical memory dump collected from the suspect's machine retrieves a great deal of forensically crucial information related to the browsing history of the suspect. This information enables cyber forensic investigators to concentrate on a few highly relevant selected artefacts while doing the offline forensic analysis of the storage media. This paper addresses the reconstruction of web browsing activities by conducting live forensics to identify searched terms, downloaded files, visited sites, email headers, email ids, etc., from physical memory dumps collected from Windows 10 systems. Well-known entry points are available for retrieving all the above artefacts except searched terms. The paper describes a novel methodology to retrieve the searched terms from Windows 10 physical memory. The searched terms retrieved in this way can be used for advanced file and keyword searches in the storage media files reconstructed from file system recovery in offline forensics.
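
As a generic illustration of scanning memory for search traces (not the paper's novel entry-point methodology), the sketch below greps a raw dump for URL query-string patterns that typically carry searched terms. The regular expression, chunk size, and file name are assumptions for demonstration, and matches that straddle chunk boundaries are ignored in this simplified version.

```python
import re

# Common query-parameter names that often carry search keywords (illustrative)
QUERY_PATTERN = re.compile(rb"[?&](?:q|query|search_query|p)=([A-Za-z0-9+%\-_.]{3,80})")

def searched_terms(dump_path: str, limit: int = 20) -> list[str]:
    """Scan a raw memory dump for candidate searched terms."""
    terms = []
    with open(dump_path, "rb") as f:
        while chunk := f.read(64 * 1024 * 1024):        # read the dump in 64 MB chunks
            for match in QUERY_PATTERN.finditer(chunk):
                term = match.group(1).replace(b"+", b" ").decode("ascii", "ignore")
                terms.append(term)
                if len(terms) >= limit:
                    return terms
    return terms

print(searched_terms("windows10_memory.dmp"))           # hypothetical dump file name
```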

Keywords: browser forensics, digital forensics, live forensics, physical memory forensics

Procedia PDF Downloads 99
439 Analysis of Turkish Government Cultural Portal for Supporting Gastronomy Tourism

Authors: Hilmi Rafet Yüncü

Abstract:

Today, the Internet plays a very important role in promoting products and services all over the world. Companies and destinations in the tourism industry use the Internet to sell and promote their core products directly to potential tourists. Internet technologies have redefined the relationships between tourists, tourism companies, and travel agents. This new relationship allows tourism information and services to be accessed and tapped, and Internet technologies create new opportunities for the tourism industry, including travel, accommodation, and tourist destination organizations. Websites are important tools for marketing a destination, and most people research a destination on the Internet before arriving. Governments have a considerable role in the process of marketing tourism destinations. Governments make policies and regulations; furthermore, they help to market destinations to potential tourists. Governments have a comprehensive overview of the sector, which allows them to track changes in the tourism market and to design better policies, programs, and marketing plans. At the same time, governments support the development of alternative tourism in the country through regulations and marketing tools. The aim of this study is to analyse the website of the Turkish government's cultural portal to determine its effectiveness in supporting gastronomy tourism. The Turkish government has established a culture portal for foreign and local tourists. The portal provides local and general information about the tourism attractions of individual cities and of Turkey as a whole. There are 81 official cities in Turkey, and all of these cities are analysed to determine how effectively the Turkish government markets gastronomy tourism. A content analysis of the portal's city pages is conducted, covering food content, recipes, and the gastronomic features of each city.
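As an illustration of how such a keyword-based content analysis might be automated, the sketch below tallies gastronomy-related terms on a few city pages. The keyword list and the portal URL scheme are assumptions for demonstration and do not reproduce the study's actual coding scheme.

```python
import re
import requests

# Illustrative keyword list; the study's coding categories may differ.
GASTRONOMY_KEYWORDS = ["cuisine", "food", "recipe", "dish", "restaurant"]

def score_city_page(url):
    """Count gastronomy-related keyword occurrences on one city page."""
    html = requests.get(url, timeout=10).text.lower()
    text = re.sub(r"<[^>]+>", " ", html)  # crude HTML tag stripping
    return {kw: len(re.findall(rf"\b{kw}\w*\b", text)) for kw in GASTRONOMY_KEYWORDS}

cities = ["ankara", "izmir", "gaziantep"]  # 3 of the 81 cities
for city in cities:
    url = f"https://www.kulturportali.gov.tr/turkiye/{city}"  # assumed URL scheme
    print(city, score_city_page(url))
```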

Keywords: culture portal, gastronomy tourism, government, Turkey

Procedia PDF Downloads 328
438 Insight into the Visual Attentional Correlates Underpinning Autistic-Like Traits in Fragile X and Down Syndrome

Authors: Jennifer M. Glennon, Hana D'Souza, Luke Mason, Annette Karmiloff-Smith, Michael S. C. Thomas

Abstract:

Genetic syndrome groups that feature high rates of autism comorbidity, like Down syndrome (DS) and fragile X syndrome (FXS), have been presented as useful models for understanding risk and protective factors involved in the emergence of autistic traits. Yet despite reaching clinical thresholds, these ‘syndromic’ forms of autism appear to differ in important ways from the idiopathic or ‘non-syndromic’ autism phenotype. To uncover the true nature of these comorbidities, it is necessary to extend definitions of autism to include the cognitive characteristics of the disorder and to then apply this broadened conceptualisation to the study of syndromic autism profiles. The current study employs a variety of well-established eye-tracking paradigms to assess visual attentional performance in children with DS and FXS who reach thresholds for autism on the Social Communication Questionnaire. It investigates whether autism profiles in these children are accompanied by visual orienting difficulties (‘sticky attention’), decreased social attention, and enhanced visual search performance, all of which are characteristic of the idiopathic autism phenotype. Data is collected from children with DS and FXS aged between 6 and 10 years, in addition to two control groups matched on age and intellectual ability (i.e., children with idiopathic autism and neurotypical controls). Cross-sectional developmental trajectory analyses are conducted to enable visuo-attentional profile comparisons. Significant differences in the visuo-attentional processes underpinning autism presentations in children with FXS and DS are hypothesised, supporting notions of syndrome specificity. The study provides insight into the complex heterogeneity associated with syndromic autism presentations and autism per se, with clinical implications for the utility of autism intervention programmes in DS and FXS populations.
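A cross-sectional developmental trajectory analysis of this kind is typically implemented as a regression of the eye-tracking outcome on age, group, and their interaction. The sketch below is a minimal illustration on a tidy per-child data set; the variable names, the gap-overlap cost used as the outcome, and the values are invented for demonstration and are not the study's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-child data: age in years, syndrome group, and an
# illustrative visual orienting measure (gap-overlap cost, in ms).
data = pd.DataFrame({
    "age":   [6.2, 7.1, 8.4, 9.0, 6.5, 7.8, 8.9, 9.7],
    "group": ["DS", "DS", "DS", "DS", "FXS", "FXS", "FXS", "FXS"],
    "gap_overlap_cost": [310, 295, 280, 262, 340, 320, 318, 305],
})

# Group-specific linear trajectories; the group:age interaction term tests
# whether the developmental slopes differ between syndromes.
model = smf.ols("gap_overlap_cost ~ group * age", data=data).fit()
print(model.summary())
```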

Keywords: autism, Down syndrome, fragile X syndrome, eye tracking

Procedia PDF Downloads 214
437 Stress and Rhythm in the Educated Nigerian Accent of English

Authors: Nkereke M. Essien

Abstract:

The intention of this paper is to examine stress in the Educated Nigerian Accent of English (ENAE) with the aim of analyzing the stress and rhythmic patterns of Nigerian English. Our aim is also to isolate differences and similarities in the stress patterns studied, to identify what forms the accent of Educated Nigerian English (ENE) and marks it off from other groups or Englishes of the world, to ascertain and characterize it, and to provide documented evidence for its existence. Nigerian stress and rhythmic patterns differ significantly from those of British English; consequently, ENE features more stressed syllables than native-speaker varieties. This excessive stressing of syllables produces runs of contiguous stressed syllables (“Ss”) in the rhythmic flow of ENE, which brings about a ‘jerky’ rhythm that distorts communication. To test this claim, ten (10) Nigerian speakers educated in the English language were selected by a stratified random sampling technique from two federal universities in Nigeria; these speakers belong to the educated class, that is, the standard variety. Their performance was compared to that of a Briton (control). The metrical system of analysis was used. The respondents read a set of words and utterances, which were recorded and analyzed perceptually, acoustically, and statistically using a one-way analysis of variance (ANOVA). The Tukey-Kramer post hoc test, the Wilcoxon matched-pairs signed-ranks test, and the Praat analysis software were used in the analysis. Our findings revealed that ENE speakers feature more stressed syllables in their productions, spending more time pronouncing stressed syllables and sometimes less time pronouncing unstressed syllables. Their overall tempo was faster. The ENE speakers used tone to mark prominence, while the native speaker, typified by the control, used stress. We conclude that the stress pattern of the ENE speakers is significantly different from the native-speaker variety represented by the control's performance.
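A minimal sketch of the kind of statistical comparison described is given below, using a one-way ANOVA and a Wilcoxon matched-pairs signed-ranks test on syllable durations. The duration values are invented for demonstration and do not come from the study's Praat measurements.

```python
import numpy as np
from scipy.stats import f_oneway, wilcoxon

# Illustrative per-syllable durations (ms), e.g. as exported from Praat.
ene_stressed     = np.array([245, 260, 251, 238, 270, 255, 249, 262, 258, 247])
ene_unstressed   = np.array([180, 175, 192, 168, 185, 179, 190, 172, 183, 177])
control_stressed = np.array([230, 225, 240, 218, 235, 228, 232, 226, 238, 224])

# One-way ANOVA across the three duration samples.
f_stat, p_anova = f_oneway(ene_stressed, ene_unstressed, control_stressed)

# Wilcoxon matched-pairs test: ENE vs. control stressed syllables, paired by word item.
w_stat, p_wilcoxon = wilcoxon(ene_stressed, control_stressed)

print(f"ANOVA:    F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"Wilcoxon: W = {w_stat:.1f}, p = {p_wilcoxon:.4f}")
```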

Keywords: accent, Nigerian English, rhythm, stress

Procedia PDF Downloads 218
436 Microbial Electrochemical Remediation System: Integrating Wastewater Treatment with Simultaneous Power Generation

Authors: Monika Sogani, Zainab Syed, Adrian C. Fisher

Abstract:

Pollution by estrogenic compounds has caught the attention of researchers, as even a slight increase of estrogens in water bodies has a significant impact on the aquatic system. These compounds belong to the class of endocrine disrupting compounds (EDCs) and are able to mimic hormones or interfere with the action of endogenous hormones. A microbial electrochemical remediation system (MERS) is employed here, exploiting an electro-phototrophic bacterium to evaluate the capacity for biodegradation of the hormone ethinylestradiol (EE2) under anaerobic conditions with simultaneous power generation. A MERS based on an electro-phototrophic bacterium offers a tailored wastewater treatment solution for a developing country like India, which has huge solar potential. It is a clean energy-generating technology, requiring only sunlight, water, nutrients, and carbon dioxide to operate. The main feature that makes it superior to other technologies is that its primary fuel, sunlight, is indefinitely available. When grown in light with organic compounds, these photosynthetic bacteria generate ATP by cyclic photophosphorylation and use carbon compounds to build cell biomass (photoheterotrophic growth). These cells showed EE2 degradation and were able to generate hydrogen as part of the process of nitrogen fixation. Two MERS designs were studied, and a maximum decrease of 88.45% in EE2 was observed over a period of 14 days in the better design. This research provides better insight into microbial electricity generation and self-sustaining wastewater treatment facilities. Such new models of waste treatment aimed at waste-to-energy generation need to be pursued and implemented to build a resource-efficient and sustainable economy.
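For a sense of scale, the reported removal can be converted into an approximate rate constant if pseudo-first-order kinetics are assumed; this is an assumption of the sketch below, not a claim of the study.

```python
import math

# Assumed pseudo-first-order removal: 88.45% EE2 removed in 14 days.
removal_fraction = 0.8845
days = 14

# C/C0 = exp(-k * t)  =>  k = -ln(1 - removal) / t
k = -math.log(1 - removal_fraction) / days
half_life = math.log(2) / k

print(f"rate constant k ≈ {k:.3f} per day")     # ≈ 0.154 per day
print(f"half-life      ≈ {half_life:.1f} days")  # ≈ 4.5 days
```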

Keywords: endocrine disrupting compounds, ethinylestradiol, microbial electrochemical remediation systems, wastewater treatment

Procedia PDF Downloads 101
435 Factor Structure of the Korean Version of Multidimensional Experiential Avoidance Questionnaire (MEAQ)

Authors: Juyeon Lee, Sungeun You

Abstract:

Experiential avoidance is one’s tendency to avoid painful internal experiences, including unwanted adverse thoughts, emotions, and physical sensations. The Multidimensional Experiential Avoidance Questionnaire (MEAQ) is a measure of experiential avoidance; the original scale consisted of 62 items with six subfactors: behavioral avoidance, distress aversion, procrastination, distraction/suppression, repression/denial, and distress endurance. The purpose of this study was to examine the factor structure of the MEAQ in a Korean sample. Three hundred community adults and university students aged 18 to 35 participated in an online survey assessing experiential avoidance (MEAQ and Acceptance and Action Questionnaire-II; AAQ-II), depression (Patient Health Questionnaire-9; PHQ-9), anxiety (Generalized Anxiety Disorder-7; GAD-7), negative affect (Positive and Negative Affect Scale; PANAS), neuroticism (Big Five Inventory; BFI), and quality of life (Satisfaction with Life Scale; SWLS). Factor analysis with principal axis factoring and direct oblimin rotation was conducted to examine the subfactors of the MEAQ. Results indicated that the six-factor structure of the original scale was adequate. Eight of the 62 items were removed due to insufficient factor loadings. These included 3 items of behavioral avoidance (e.g., “When I am hurting, I would do anything to feel better”), 2 items of repression/denial (e.g., “I work hard to keep out upsetting feelings”), and 3 items of distress aversion (e.g., “I prefer to stick to what I am comfortable with, rather than try new activities”). The MEAQ was positively associated with the AAQ-II (r = .47, p < .001), PHQ-9 (r = .37, p < .001), GAD-7 (r = .34, p < .001), PANAS (r = .35, p < .001), and neuroticism (r = .24, p < .001), and negatively correlated with the SWLS (r = -.38, p < .001). Internal consistency was good for the MEAQ total (Cronbach’s α = .90) as well as for all six subfactors (Cronbach’s α = .83 to .87). The findings of the study support the multidimensional nature of experiential avoidance and the validity of the MEAQ in a sample of Korean adults.
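The internal consistency figures reported above are Cronbach's alpha values. As a reference, the sketch below computes alpha from a respondent-by-item score matrix; the Likert responses shown are invented, whereas the actual analysis used 300 respondents and 62 items.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative 1-6 Likert responses for 5 hypothetical items from 6 respondents.
scores = np.array([
    [4, 5, 4, 3, 4],
    [2, 2, 3, 2, 2],
    [5, 6, 5, 5, 6],
    [3, 3, 4, 3, 3],
    [4, 4, 5, 4, 4],
    [1, 2, 1, 2, 1],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```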

Keywords: avoidance, experiential avoidance, factor structure, MEAQ

Procedia PDF Downloads 346