Search results for: Darlington Mapiye
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10

10 A Voice Retrieved from the Holocaust in New Journalism in Kazuo Ishiguro's The Remains of the Day

Authors: Masami Usui

Abstract:

Kazuo Ishiguro’s The Remains of the Day (1989) underlines another holocaust, an imprisonment of human life, dignity, and self in the globalizing sphere of the twentieth century. The Remains of the Day delineates the invisible and cruel space of “lost and found” in the postcolonial and post-imperial discourse of this century, that is, the Holocaust. The context of the concentration camp or wartime imprisonment such as Auschwitz is transplanted into the public sphere of modern England, Darlington Hall. The voice is retrieved and expressed by the young journalist and heir of Darlington Hall, Mr. David Cardinal. The new medium of journalism is an intruder at Darlington Hall and plays a role in revealing its wrongly instilled ideology. “Lost and found” consists of the private and public retrieved voices. Stevens’ journey in 1956 is a return to the past, especially the period between 1935 and 1936. Lost time is retrieved on his journey; yet lost life cannot be revived entirely in the remains of his life. The supreme days of Darlington Hall are the terrifying days caused by the Nazis. Fascism, terrorism, and militarism destroyed the wholesomeness of the globe. To the blind Stevens, both Miss Kenton and Mr. Cardinal bring the common issue, that is, the political conflict caused by the Nazis. Miss Kenton expresses her own ideas against anti-Semitism regarding the Jewish maids at the crucial time when Sir Oswald Mosley’s Blackshirt organization attacked the Anglo-Jewish community between 1935 and 1936. Miss Kenton’s half-muted statement is reinforced and assured by Cardinal in his mention of the 1934 Olympia Rally threatened by Mosley’s Blackshirts. Cardinal’s invasion of Darlington Hall embodies the increasing tension of international politics leading to World War II. Darlington Hall accommodates the crucial political issue that definitely influences the fate of the house, its residents, and the nation itself, and that is retrieved in the newly progressive and established media.

Keywords: modern English literature, culture studies, communication, history

Procedia PDF Downloads 543
9 Genomic Sequence Representation Learning: An Analysis of K-Mer Vector Embedding Dimensionality

Authors: James Jr. Mashiyane, Risuna Nkolele, Stephanie J. Müller, Gciniwe S. Dlamini, Rebone L. Meraba, Darlington S. Mapiye

Abstract:

When performing language tasks in natural language processing (NLP), the dimensionality of word embeddings is chosen either ad hoc or by optimizing the Pairwise Inner Product (PIP) loss. The PIP loss is a metric that measures the dissimilarity between word embeddings, and it is obtained through matrix perturbation theory by utilizing the unitary invariance of word embeddings. Unlike in natural language, in genomics, and especially in genome sequence processing, there is no notion of a “word”; rather, there are sequence substrings of length k called k-mers. K-mer sizes matter, and they vary depending on the goal of the task at hand. The dimensionality of word embeddings in NLP has been studied using matrix perturbation theory and the PIP loss. In this paper, the sufficiency and reliability of applying word-embedding algorithms to various genomic sequence datasets are investigated to understand the relationship between the k-mer size and the embedding dimension. This is done by studying the scaling capability of three embedding algorithms, namely Latent Semantic Analysis (LSA), Word2Vec, and Global Vectors (GloVe), with respect to the k-mer size. Using the PIP loss as a metric to train embeddings on different datasets, we also show that Word2Vec outperforms LSA and GloVe in computing accurate embeddings as both the k-mer size and the vocabulary increase. Finally, the shortcomings of natural language processing embedding algorithms in performing genomic tasks are discussed.
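As an illustration of the two ingredients this abstract builds on, a minimal sketch is given below: sliding-window k-mer tokenization (the genomic analogue of NLP words) and the PIP loss between two embedding matrices. The toy sequence and function names are ours, not the paper's.

```python
import numpy as np

def kmers(sequence, k):
    """Slide a window of length k over a DNA sequence to produce the
    overlapping k-mer 'words' used in place of NLP tokens."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def pip_loss(E1, E2):
    """Pairwise Inner Product loss between two embedding matrices:
    the Frobenius norm of the difference of their Gram matrices.
    Because E @ E.T is unchanged by any rotation of E, the loss is
    unitary-invariant, which is what makes it usable for comparing
    embeddings trained separately."""
    return np.linalg.norm(E1 @ E1.T - E2 @ E2.T, ord="fro")

# Larger k gives more specific but rarer "words", which is why the
# embedding dimension has to be studied as a function of k.
print(kmers("ATGCGTAC", 3))  # ['ATG', 'TGC', 'GCG', 'CGT', 'GTA', 'TAC']
```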

Keywords: word embeddings, k-mer embedding, dimensionality reduction

Procedia PDF Downloads 85
8 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which are important especially in explaining complex biological mechanisms.
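The five evaluation metrics the abstract reports all derive from a binary confusion matrix; the sketch below shows the standard formulas. The example counts are illustrative only, not the study's actual confusion matrix.

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from confusion-matrix
    counts: true/false positives and true/false negatives."""
    accuracy    = (tp + tn) / (tp + fp + tn + fn)
    recall      = tp / (tp + fn)          # sensitivity
    precision   = tp / (tp + fp)
    specificity = tn / (tn + fp)
    # Matthews correlation coefficient: balanced even for skewed classes.
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return accuracy, recall, precision, specificity, mcc

# Hypothetical counts for demonstration.
print(binary_metrics(8, 3, 7, 2))
```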

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 122
7 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which are important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 110
6 Canada Deuterium Uranium Updated Fire Probabilistic Risk Assessment Model for Canadian Nuclear Plants

Authors: Hossam Shalabi, George Hadjisophocleous

Abstract:

The Canadian Nuclear Power Plants (NPPs) use some portions of NUREG/CR-6850 in carrying out Fire Probabilistic Risk Assessment (PRA). An assessment of the applicability of NUREG/CR-6850 to CANDU reactors was performed, and a CANDU Fire PRA was introduced. There are 19 operating CANDU reactors in Canada at five sites (Bruce A, Bruce B, Darlington, Pickering, and Point Lepreau). A fire load density survey was done for all Fire Safe Shutdown Analysis (FSSA) fire zones at all CANDU sites in Canada. National Fire Protection Association (NFPA) Standard 557 proposes that a fire load survey be conducted by either the weighing method, the inventory method, or a combination of both; the combination method results in the most accurate values for fire loads. An updated CANDU Fire PRA model that includes the fuel survey at all Canadian CANDU stations is demonstrated in this paper. A qualitative screening step for the CANDU Fire PRA is illustrated to include any fire events that can damage any part of the emergency power supply in addition to FSSA cables.
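The combination survey method described above can be sketched as a simple calculation: combustibles that can be weighed contribute their measured energy directly, while inventoried items contribute mass times a tabulated calorific value, all normalized by floor area. The function and the example values below are illustrative assumptions, not data from the survey.

```python
def fire_load_density(weighed_mj, inventory_items, floor_area_m2):
    """Combination-method fire load density (MJ/m^2): energy of
    directly weighed combustibles plus inventoried items, each given
    as (mass_kg, calorific_value_MJ_per_kg), over the zone's floor
    area. Values here are illustrative, not survey data."""
    inventoried_mj = sum(m * h for m, h in inventory_items)
    return (weighed_mj + inventoried_mj) / floor_area_m2

# Hypothetical fire zone: 1000 MJ weighed directly, plus 50 kg of
# wood (~20 MJ/kg) and 10 kg of oil (~44 MJ/kg) over 100 m^2.
print(fire_load_density(1000.0, [(50.0, 20.0), (10.0, 44.0)], 100.0))
```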

Keywords: fire safety, CANDU, nuclear, fuel densities, FDS, qualitative analysis, fire probabilistic risk assessment

Procedia PDF Downloads 95
5 An Ideational Grammatical Metaphor of Narrative History in Chinua Achebe's 'There Was a Country'

Authors: Muhammed-Badar Salihu Jibrin, Chibabi Makedono Darlington

Abstract:

This paper studied the Ideational Grammatical Metaphor (IGM) of narrative history in Chinua Achebe’s There Was a Country. It started with the narrative historical style as a recent genre that has grown out of conventional historical writing. In order to explore this linguistic phenomenon using the particular lexico-grammatical tool of IGM, the theoretical background was examined based on Hallidayan Systemic Functional Linguistics. Furthermore, the study considered the possibility of applying IGM to Part 4 of Achebe’s historical text with recourse to the concept of congruence in IGM and the research questions before formulating a working methodology. The analysis of Achebe’s memoir was thus presented in tabular form to account for the quantitative content analysis with a qualitative research technique, as well as the metaphorical and congruent wording through nominalization and process types, with samples. The frequencies and percentages were given with respect to each subheading of the text. The findings showed that the material and relational process types were dominant. This confirmed M. A. K. Halliday and C. M. I. M. Matthiessen’s earlier suggestion that IGM should show dominance of the material process type. The implication is that IGM can be an effective tool for the analysis of a narrative historical text. In conclusion, it was observed that IGM carries not only a grammatical function but also an ideological role in shaping the historical discourse within the narrative mode between writers and readers.
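The tabular frequency-and-percentage step of the content analysis amounts to tallying process-type codings per subheading. A minimal sketch follows; the clause codings here are hypothetical, not tallies from Achebe's text.

```python
from collections import Counter

# Hypothetical clause-level codings of Hallidayan process types for
# one subheading; the study's real tallies come from the memoir.
codings = ["material", "material", "relational", "mental",
           "material", "relational", "verbal"]
counts = Counter(codings)
total = sum(counts.values())
for process, n in counts.most_common():
    print(f"{process}: {n} ({100 * n / total:.1f}%)")
```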

Keywords: ideational grammatical metaphor, nominalization, narrative history, memoir, dominance

Procedia PDF Downloads 178
4 X-Ray Dosimetry by a Low-Cost Current Mode Ion Chamber

Authors: Ava Zarif Sanayei, Mustafa Farjad-Fard, Mohammad-Reza Mohammadian-Behbahani, Leyli Ebrahimi, Sedigheh Sina

Abstract:

The fabrication and testing of a low-cost air-filled ion chamber for X-ray dosimetry are studied. The chamber is made of a metal cylinder, a central wire, a BC517 Darlington transistor, a 9 V DC battery, and a voltmeter in order to provide a cost-effective means of measuring dose. The output current of the dosimeter is amplified by the transistor and then fed to the large internal resistance of the voltmeter, producing a readable voltage signal. The dose-response linearity of the ion chamber is evaluated for different exposure scenarios of the X-ray tube: kVp values of 70, 90, and 120, and mAs values up to 20 are considered. In all experiments, a solid-state dosimeter (Solidose 400, Elimpex Medizintechnik) is used as a reference device for chamber calibration. Each exposure is repeated three times, the voltmeter and Solidose readings are recorded, and the mean and standard deviation values are calculated. The calibration curve, derived by plotting voltmeter readings against Solidose readings, yields a linear fit for all tube kVps, with linear relationships of 99%, 98%, and 100% for kVp values of 70, 90, and 120, respectively. The study shows the feasibility of achieving acceptable dose measurements with a simplified setup. Further enhancements to the proposed setup include solutions for limiting the leakage current, optimizing chamber dimensions, utilizing electronic microcontrollers for dedicated data readout, and minimizing the impact of stray electromagnetic fields on the system.
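The calibration step described above is an ordinary least-squares line through paired readings, with the linearity quantified by a correlation coefficient. The sketch below uses invented reading pairs; the actual study fits the means of three repeated exposures per setting.

```python
import numpy as np

# Hypothetical paired readings: reference dose from the Solidose 400
# (mGy) vs. the ion chamber's voltmeter output (V).
solidose_mgy = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
voltmeter_v  = np.array([0.26, 0.51, 1.02, 2.01, 4.05])

# Least-squares calibration line: voltage = slope * dose + intercept.
slope, intercept = np.polyfit(solidose_mgy, voltmeter_v, 1)

# Pearson r quantifies the per-kVp linearity the abstract reports.
r = np.corrcoef(solidose_mgy, voltmeter_v)[0, 1]
print(f"slope = {slope:.3f} V/mGy, r = {r:.4f}")
```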

Keywords: dosimetry, ion chamber, radiation detection, X-ray

Procedia PDF Downloads 19
3 Synthesis of ZnO Nanoparticles with Varying Calcination Temperature for Photocatalytic Degradation of Ethylbenzene

Authors: Darlington Ashiegbu, Herman Johannes Potgieter

Abstract:

The increasing utilization of zinc oxide (ZnO) as a better alternative to TiO₂ has been attributed to its wide bandgap (3.37 eV), lower production cost, ability to absorb over a larger range of the UV spectrum, and higher efficiency in some cases. ZnO nanoparticles were synthesized via a sol-gel process and calcined at 400 °C, 500 °C, and 650 °C. The as-synthesized nanoparticles were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS), and Brunauer-Emmett-Teller (BET) surface area measurement. Scanning electron micrographs revealed pseudo-spherical and rod-like morphologies and a high rate of agglomeration for the sample calcined at 650 °C. The BET surface area was highest in the sample calcined at 500 °C. EDS results confirmed the purity of the samples, as only Zn and O were detected, and XRD results revealed the crystalline hexagonal wurtzite structure of the ZnO nanoparticles. All three samples were utilized in the degradation of ethylbenzene, and a UV-Vis spectrophotometer was used to monitor the degradation. The sample calcined at 500 °C had the highest surface area for reaction, the lowest agglomeration, and the highest photocatalytic activity in the degradation of ethylbenzene. This reveals calcination temperature as a very important factor in achieving higher photocatalytic activity.
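Photocatalytic degradation monitored by UV-Vis absorbance is commonly reduced to a pseudo-first-order rate constant via ln(C0/C) = kt, with absorbance proportional to concentration. The sketch below assumes that model and uses invented readings, not the study's data.

```python
import math

# Hypothetical UV-Vis absorbance of ethylbenzene over irradiation time
# (min); absorbance is taken as proportional to concentration.
times_min  = [0, 30, 60, 90, 120]
absorbance = [1.00, 0.74, 0.55, 0.41, 0.30]

# Pseudo-first-order rate constant from the endpoints:
# k = ln(A0 / A_t) / t  (a regression over all points would be the
# more careful choice; this two-point estimate keeps the sketch short).
k = math.log(absorbance[0] / absorbance[-1]) / times_min[-1]
print(f"k ≈ {k:.4f} per minute")
```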

Keywords: ethylbenzene, pseudo-spherical, sol-gel, zinc oxide

Procedia PDF Downloads 122
2 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for the Extraction and Recovery of Metal from Ores

Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter

Abstract:

Over the past few decades, the demand for metals has increased significantly. This has led to a decline in high-grade ore over time and an increase in mineral complexity and matrix heterogeneity, in addition to rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that can economically process and recover metallic values from low-grade ores, materials with metal content locked up in industrially processed residues (tailings and slag), and complex-matrix mineral deposits. Several methods to address these issues have been developed, among which are ionic liquids (ILs), heap leaching, and bioleaching. Recently, the gas phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced by an appropriate method to obtain the metal and to regenerate and recover the organic extractant. Laboratory work on gas phase extraction has been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chromium (Cr), nickel (Ni), lead (Pb), and vanadium (V). In all cases, the extraction was found to depend on temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation and organic reagent regeneration, and has the potential to deliver on all promises of a “greener” process.

Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment

Procedia PDF Downloads 85
1 Multibody Constrained Dynamics of Y-Method Installation System for a Large Scale Subsea Equipment

Authors: Naeem Ullah, Menglan Duan, Mac Darlington Uche Onuoha

Abstract:

The lowering of subsea equipment into deep waters is a challenging job due to the harsh offshore environment. Many researchers have introduced various installation systems to deploy payloads safely into the deep oceans. In general practice, dual floating vessels are not employed owing to the prevalent safety risks and hazards caused by the ever-increasing dynamical effects sourced by mutual interaction between the bodies. However, on optimal grounds such as economy, the Y-method, in which two conventional tugboats support the equipment by two independent strands connected to a tri-plate above the equipment, has been employed to study the multibody dynamics of dual barge lifting operations. In this study, the two tugboats and the suspended payload (the Y-method) are deployed for the lowering of subsea equipment into deep waters as a multibody dynamic system. Two wire ropes are used for the lifting and installation operation by this Y-method installation system. Six degrees of freedom (dof) for each body are considered to establish a coupled 18-dof multibody model by the embedding technique, or velocity transformation technique. The fundamental advantage of this technique is that the constraint forces are eliminated directly, and no extra computational effort is required for their elimination. The inertial frame of reference is taken at the surface of the water as the time-independent frame of reference, and floating frames of reference are introduced in each body as time-dependent frames of reference in order to formulate the velocity transformation matrix. The local transformation of the generalized coordinates to the inertial frame of reference is executed by applying the Euler angle approach. Spherical joints are articulated amongst the bodies as the kinematic joints.
The hydrodynamic force, the two strand forces, the hydrostatic force, and the mooring forces are taken into consideration as the external forces. The radiation part of the hydrodynamic force is obtained by employing the Cummins equation. The wave-exciting part of the hydrodynamic force is obtained by using force response amplitude operators (RAOs) computed with the open-source solver OpenFOAM. The strand force is obtained by modeling the wire rope as an elastic spring. The nonlinear hydrostatic force is obtained by the pressure integration technique at each time step of the wave movement. The mooring forces are evaluated using Faltinsen's analytical approach. The fourth-order Runge-Kutta method is employed to integrate the coupled equations of motion obtained for the 18-dof multibody model. The results are correlated with a simulated OrcaFlex model. Moreover, the results from the OrcaFlex model are compared with the MOSES model from previous studies. The multibody dynamic simulation (MBDS) of the single barge lifting operation from former studies is compared with the MBDS of the established dual barge lifting operation. The dynamics of the dual barge lifting operation are found to be larger in magnitude than those of the single barge lifting operation. It is noticed that the traction at the top connection point of the cable decreases with increasing cable length, and it becomes almost constant after passing through the splash zone.
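The time-integration scheme named above can be sketched generically: one fourth-order Runge-Kutta step for a first-order system y' = f(t, y). The toy 1-dof oscillator below only demonstrates the integrator; the actual model stacks 18 generalized coordinates and velocities with the forces described in the abstract.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One fourth-order Runge-Kutta step for y' = f(t, y): four slope
    evaluations combined with weights 1, 2, 2, 1."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy check on a single-dof oscillator y'' = -y, written first-order
# as (position, velocity); the real system is the coupled 18-dof model.
f = lambda t, y: np.array([y[1], -y[0]])
y = np.array([1.0, 0.0])
for _ in range(100):           # integrate to t = 1 with h = 0.01
    y = rk4_step(f, 0.0, y, 0.01)
print(y)  # ≈ [cos(1), -sin(1)]
```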

Keywords: dual barge lifting operation, Y-method, multibody dynamics, shipbuilding, installation of subsea equipment

Procedia PDF Downloads 163