Search results for: Korean linguistic feature
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2611

541 Analysis of Practical Guidelines for Mobile Device Security in Indonesia Based on NIST SP 1800-4

Authors: Mardiyansyah Mardiyansyah, Hendrik Maulana, Eka Kurnia Sari, Imam Baehaki, Mohammad Agus Prihandono

Abstract:

Mobile devices have become a key feature of Indonesian society and the economy, including the government and the private sector. Enterprises and government agencies are already concerned about mobile device security. However, small and medium enterprises (SMEs) do not yet share that concern, especially new startup companies. Indonesia has several laws, regulations, and standards for managing mobile device security, but Indonesian information security policies have not been harmonized: each government organization and large enterprise has its own rules and policies. This leads to conflicts of interest among government agencies and to ineffectiveness in the implementation of policies. Therefore, an analysis of the various government policies, regulations, and standards related to information security, especially on mobile devices, is carried out. The analysis maps the existing regulatory policies and standards onto NIST's practical information security guidelines to show the effectiveness of NIST SP 1800-4 with respect to existing policies. The work focuses on mapping the NIST SP 1800-4 framework onto existing regulations, standards, and guidelines in Indonesia. The research approach is a literature study: existing regulations, standards, and guidelines are identified, mapped into the NIST SP 1800-4 framework, and analyzed to determine whether the framework can be applied to organizations in Indonesia. Finally, findings and recommendations are drawn by documenting the security characteristics. The findings show that some of the regulations, standards, and guidelines in Indonesia are relevant to elements of the NIST SP 1800-4 framework. From the mapping analysis, the strengths and weaknesses of mobile device security in Indonesia can be reported. It can also be concluded that applying NIST SP 1800-4 can improve the effectiveness of mobile device security policies in Indonesia.

Keywords: mobile security, mobile security framework, NIST SP 1800-4, regulations

Procedia PDF Downloads 128
540 Lower Limb Oedema in Beckwith-Wiedemann Syndrome

Authors: Mihai-Ionut Firescu, Mark A. P. Carson

Abstract:

We present a case of inferior vena cava agenesis (IVCA) associated with bilateral deep venous thrombosis (DVT) in a patient with Beckwith-Wiedemann syndrome (BWS). In adult patients with BWS presenting with bilateral lower limb oedema, specific aetiological factors should be considered. These include cardiomyopathy and intra-abdominal tumours. Congenital malformations of the IVC, by causing relative venous stasis, can lead to lower limb oedema either directly or indirectly by favouring lower limb venous thromboembolism; however, they have yet to be reported as an associated feature of BWS. Given its life-threatening potential, prompt initiation of treatment for bilateral DVT is paramount. In BWS patients, however, this can prove more complicated. Due to overgrowth, the above-average birth weight can persist throughout childhood. In this case, the patient's weight reached 170 kg, affecting the choice of anticoagulation, as direct oral anticoagulants have a limited evidence base in patients with a body mass above 120 kg. Furthermore, the presence of IVCA leads to a long-term increased risk of venous thrombosis. Therefore, patients with IVCA and bilateral DVT warrant specialist consideration and may benefit from multidisciplinary team management, with haematology and vascular surgery input. Conclusion: we have presented a rare cause of bilateral lower limb oedema, namely bilateral deep venous thrombosis complicating IVCA in a patient with Beckwith-Wiedemann syndrome. The importance of this case lies in its novelty, as the association between IVC agenesis and BWS has not yet been described. Furthermore, the treatment of DVT in such situations requires special consideration, taking into account the patient's weight and the presence of a significant predisposing vascular abnormality.

Keywords: Beckwith-Wiedemann syndrome, bilateral deep venous thrombosis, inferior vena cava agenesis, venous thromboembolism

Procedia PDF Downloads 207
539 Development of Materials Based on Phosphates of NaZr2(PO4)3 with Low Thermal Expansion

Authors: V. Yu. Volgutov, A. I. Orlova, S. A. Khainakov

Abstract:

NaZr2(PO4)3 (NZP) and its structural analogues are characterized by peculiar behavior on heating: they expand and contract differently along different crystallographic directions due to the specific arrangement of the crystal structure in these compounds. An important feature of such structures is their ability to incorporate a wide variety of metal cations of different sizes and oxidation states, in different combinations and concentrations. These cations occupy crystallographically non-equivalent positions of the octahedral-tetrahedral framework as well as inter-framework cavities. Thus, owing to iso- and hetero-valent isomorphism of the cations (and anions) in NZP, it becomes possible to tune compositions and obtain compounds with planned properties. For the design of compounds with low and ultra-low thermal expansion, including those with tailored thermal expansion properties, the following crystal-chemical principles appear promising: 1) inserting cations of different sizes into the M1 crystal position, and 2) varying the composition of the compounds so as to change the occupancy of the M1 position. Following these principles, we designed and synthesized the following NZP-type phosphate series: a) series in which the radii of the cations in the M1 position were varied: Zr1/4Zr2(PO4)3 - Th1/4Zr2(PO4)3 (series I) and R1/3Zr2(PO4)3, where R = Nd, Eu, Er (series II); b) series in which the occupancy of the M1 position was varied: Zr1/4Zr2(PO4)3 - Er1/3Zr2(PO4)3 (series III) and Zr1/4Zr2(PO4)3 - Sr1/2Zr2(PO4)3 (series IV). The thermal expansion parameters were determined over the range 25-800 °C. For each series, the minimum axial coefficients of thermal expansion, αa = αb and αc, and their anisotropy Δα = |αa − αc| (all in 10⁻⁶ K⁻¹) were found to be: −1.51, 1.07, 2.58 for Th1/4Zr2(PO4)3 (series I); −0.72, 0.10, 0.81 for Nd1/3Zr2(PO4)3 (series II); −2.78, 1.35, 4.12 for Er1/6Zr1/8Zr2(PO4)3 (series III); and 2.23, 1.32, 0.91 for Sr1/2Zr2(PO4)3 (series IV). The measured tendencies of the thermal expansion of the crystals were in good agreement with the predicted ones. For one member of the studied phosphates, namely Th1/16Zr3/16Zr2(PO4)3, structural refinement was carried out at 25, 200, 600, and 800 °C, and the dependence of the structural parameters on temperature was determined.
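
The anisotropy figure quoted for each series follows directly from the two axial coefficients; a worked check against the series I values (a restatement of the abstract's own numbers, not new data):

```latex
\[
\Delta\alpha = \lvert \alpha_a - \alpha_c \rvert, \qquad \alpha_a = \alpha_b
\]
\[
\text{Series I: } \Delta\alpha = \lvert -1.51 - 1.07 \rvert \times 10^{-6}\,\mathrm{K}^{-1} = 2.58 \times 10^{-6}\,\mathrm{K}^{-1}
\]
```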

Keywords: high-temperature crystallography, NaZr2(PO4)3 (NZP) analogs, structural-chemical principles, tuning thermal expansion

Procedia PDF Downloads 215
538 Cross-Tier Collaboration between Preservice and Inservice Language Teachers in Designing Online Video-Based Pragmatic Assessment

Authors: Mei-Hui Liu

Abstract:

This paper reports the progression of language teachers' learning to assess students' speech act performance via online videos in a cross-tier professional growth community. This yearlong research project collected multiple data sources from several stakeholders, including 12 preservice and 4 inservice English as a foreign language (EFL) teachers, 4 English professionals, and 82 high school students. Data sources included surveys, (focus group) interviews, online reflection journals, online video-based assessment items/scores, and artifacts related to teacher professional learning. The major findings depicted the effectiveness of the proposed learning module for language teacher development in pragmatic assessment, as well as its impact on students' learning experience. All the teachers appreciated this professional learning experience, which enhanced their knowledge of assessing students' pragmalinguistic and sociopragmatic performance in an English speech act (i.e., making refusals). They learned how to design online video-based assessment items by attending to specific linguistic structures, semantic formulas, and sociocultural issues. They further became aware of how to sharpen their pragmatic instructional skills in the near future after putting theory into online assessment and related classroom practices. Additionally, data analysis revealed students' achievement in, and satisfaction with, the designed online assessment. Yet, during the professional learning process, most participating teachers encountered challenges in reaching a consensus on selecting appropriate video clips from available sources to present the sociocultural values of English-speaking refusal contexts. Another challenge was constructing test items that could test the influence of interlanguage transfer on students' pragmatic performance in various conversational scenarios. With pedagogical implications and research suggestions, this study adds to the growing body of research into integrating preservice and inservice EFL teacher education in pragmatic assessment and related instruction. Acknowledgment: This research project is sponsored by the Ministry of Science and Technology in the Republic of China under grant number MOST 106-2410-H-029-038.

Keywords: cross-tier professional development, inservice EFL teachers, pragmatic assessment, preservice EFL teachers, student learning experience

Procedia PDF Downloads 234
537 An Assessment of Inland Transport Operator's Competitiveness in Phnom Penh, Cambodia

Authors: Savin Phoeun

Abstract:

During the long civil war, Cambodia's economy, infrastructure, and social and political structures were destroyed, and everything had to be rebuilt from zero. Transport and communication are key drivers of national economic growth; inland transport and other modes play a complementary role and are supported, both directly and indirectly, by the government and international organizations, including support to the private sector and small and medium-sized enterprises. The objectives of this study are to examine the general characteristics, capacity, and competitiveness KPIs of Cambodian inland transport operators. A questionnaire and interviews were developed from capacity and competitiveness key performance indicators and administered in a survey of inland transport companies in Phnom Penh, the capital city of Cambodia, and descriptive statistics were applied to analyze the data. The results of this study fall into three distinct areas: 1) management ability of transport operators: capital management, finance, and qualifications are at a similar, moderate level, so local competitors can compete with one another; 2) operational ability: customer service provision is better, but operations appear to be high-cost because most operators are family-sized businesses; 3) local Cambodian inland transport service providers are able to compete with each other because they operate at a similar level, whereas Thai competitors mostly operate at a higher level. Based on these results, it is suggested and recommended that inland transport companies adopt new technology, improve strategic management, and build partnerships (joint ventures or corporations) to increase their capital and company size in order to earn customers' trust and customize their services to satisfy them. Inland service providers should also shift from competing on cost alone to cost saving combined with service enhancement.

Keywords: assessment, competitiveness, inland transport, operator

Procedia PDF Downloads 243
536 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the most important components of many application domains that use video data. Recently, it has attracted extensive interest from practitioners and academics in different areas. While video event detection has been the subject of extensive study, considerably less existing work has considered multi-modal data and efficiency-related issues. During soccer matches, various doubtful situations arise that cannot easily be judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high speed of events. Bayesian networks provide a structure for handling this uncertainty using a simple graphical structure together with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos using object-based features. The proposed work uses the t-cherry junction tree, a very recent advance in probabilistic graphical models, to create a compact representation and a good approximation of an intractable model for users' relationships in a social network. This approach has several advantages: first, the t-cherry junction tree gives the best approximation within the class of junction trees; second, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and robustness of the proposed work on a comprehensive data set comprising several soccer videos captured at different places.

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 299
535 The Automatisation of Dictionary-Based Annotation in a Parallel Corpus of Old English

Authors: Ana Elvira Ojanguren Lopez, Javier Martin Arista

Abstract:

The aims of this paper are to present the automatisation procedure adopted in the implementation of a parallel corpus of Old English, as well as to assess the progress of automatisation with respect to tagging, annotation, and lemmatisation. The corpus consists of an aligned, word-for-word Old English-English parallel text that provides the Old English segment with inflectional form tagging (gloss, lemma, category, and inflection) and lemma annotation (spelling, meaning, inflectional class, paradigm, word-formation, and secondary sources). This parallel corpus is intended to fill a gap in the field of Old English, in which no parallel and/or lemmatised corpora are available and the average amount of corpus annotation is low. Against this background, the presentation has two main parts. The first part, which focuses on tagging and annotation, selects the layouts and fields of lexical databases that are relevant to these tasks. Most of the information used for the annotation of the corpus can be retrieved from the lexical and morphological database Nerthus and the database of secondary sources Freya. These are the sources of linguistic and metalinguistic information that will be used for the annotation of the lemmas of the corpus, including morphological and semantic aspects as well as references to the secondary sources that deal with the lemmas in question. Although substantially adapted and re-interpreted, the lemmatised part of these databases draws on the standard dictionaries of Old English, including The Student's Dictionary of Anglo-Saxon, An Anglo-Saxon Dictionary, and A Concise Anglo-Saxon Dictionary. The second part of the paper deals with lemmatisation. It presents the lemmatiser Norna, which has been implemented in FileMaker software. It is based on a concordance and an index to the Dictionary of Old English Corpus, which comprises around three thousand texts and three million words. In its present state, the lemmatiser Norna can assign a lemma to around 80% of textual forms automatically, by searching the index and the concordance for prefixes, stems, and inflectional endings. The conclusions of the presentation insist on the limits of the automatisation of dictionary-based annotation in a parallel corpus. While tagging and annotation are largely automatic even at the present stage, the automatisation of alignment is pending for future research. Lemmatisation and morphological tagging are expected to be fully automatic in the near future, once the database of secondary sources Freya and the lemmatiser Norna have been completed.
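
A minimal sketch of the kind of dictionary-based lemma assignment described above, i.e. stripping candidate inflectional endings and looking the remaining stem up in a lemma index (the forms, endings, and entries here are illustrative assumptions, not data from Nerthus or Norna):

```python
# Illustrative dictionary-based lemmatisation: strip candidate inflectional
# endings and look the remaining stem up in a lemma index.
LEMMA_INDEX = {          # hypothetical stem -> lemma entries
    "cyning": "cyning",  # 'king'
    "luf": "lufian",     # 'to love'
}
ENDINGS = ["as", "es", "e", "a", "um", "ode", "odon", ""]  # illustrative endings

def assign_lemma(form: str):
    """Return a lemma for a textual form, or None if no stem matches."""
    for ending in sorted(ENDINGS, key=len, reverse=True):  # longest match first
        stem = form[: len(form) - len(ending)] if ending else form
        if stem in LEMMA_INDEX:
            return LEMMA_INDEX[stem]
    return None

print(assign_lemma("cyningas"))  # -> 'cyning'
print(assign_lemma("lufode"))    # -> 'lufian'
```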

Keywords: corpus linguistics, historical linguistics, old English, parallel corpus

Procedia PDF Downloads 180
534 MRI Quality Control Using Texture Analysis and Spatial Metrics

Authors: Kumar Kanudkuri, A. Sandhya

Abstract:

Typically, in a clinical MRI setting, several protocols are run, each indicated for a specific anatomy and disease condition. However, these protocols, or parameters within them, can change over time due to changes in the recommendations of physician groups, software updates, or the availability of new technologies. Most of the time, the changes are made by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) considerations. It is therefore important to give MRI technologists proper guidelines so that they do not change parameters in ways that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for quality control (QC) in order to guarantee that the primary objectives of MRI are met. Visual evaluation of quality depends on the operator/reviewer and may vary between operators, as well as for the same operator at different times. Overcoming these constraints is essential for a more impartial evaluation of quality, which makes quantitative estimation of image quality (IQ) metrics very important for MRI quality control. To address this problem, we propose a robust, open-source, and automated MRI image quality control tool. An automatic analysis tool was designed and developed for measuring MRI image quality (IQ) metrics - signal-to-noise ratio (SNR), signal-to-noise ratio uniformity (SNRU), visual information fidelity (VIF), feature similarity (FSIM), gray-level co-occurrence matrix (GLCM) texture features, slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution - and it provided good accuracy in the assessment. A standardized quality report is generated that incorporates the metrics that impact diagnostic quality.
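
A minimal sketch of two of the listed metrics, SNR and a GLCM texture feature, computed on a 2D phantom slice with scikit-image (the ROI coordinates and the random stand-in image are illustrative assumptions, not the tool's implementation):

```python
# Minimal sketch: SNR from signal/background ROIs and GLCM contrast on a slice.
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

def snr(image: np.ndarray, signal_roi, background_roi) -> float:
    """SNR = mean(signal ROI) / std(background ROI)."""
    return float(image[signal_roi].mean() / image[background_roi].std())

def glcm_contrast(image: np.ndarray) -> float:
    """Contrast of the gray-level co-occurrence matrix (distance 1, angle 0)."""
    img8 = np.uint8(255 * (image - image.min()) / (np.ptp(image) + 1e-9))
    glcm = graycomatrix(img8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return float(graycoprops(glcm, "contrast")[0, 0])

slice_2d = np.random.rand(256, 256)            # stand-in for a phantom slice
sig_roi = (slice(96, 160), slice(96, 160))     # central signal ROI (assumed)
bg_roi = (slice(0, 32), slice(0, 32))          # corner background ROI (assumed)
print(snr(slice_2d, sig_roi, bg_roi), glcm_contrast(slice_2d))
```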

Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy

Procedia PDF Downloads 138
533 Analyzing the Influence of Hydrometeorological Extremes, Geological Setting, and Social Demographics on Public Health

Authors: Irfan Ahmad Afip

Abstract:

The main research objective is to accurately identify the possibility of a leptospirosis outbreak, and its severity, in a given area based on input features to a multivariate regression model. The research question is whether the possibility of an outbreak in a specific area is influenced by features such as social demographics and hydrometeorological extremes. If the occurrence of an outbreak is subject to these features, then epidemic severity will differ from area to area depending on the environmental setting, because the features influence both the possibility and the severity of an outbreak. Specifically, the research objectives were three-fold: (a) to identify the relevant multivariate features and visualize patterns in the data, (b) to develop a multivariate regression model based on the selected features and determine the possibility of a leptospirosis outbreak in an area, and (c) to compare the predictive ability of the multivariate regression model and machine learning algorithms. Several secondary data features were collected from locations in the state of Negeri Sembilan, Malaysia, based on their likely relevance to outbreak severity in the area. The relevant features then become inputs to a multivariate regression model; a linear regression model is a simple and quick solution for creating prognostic capabilities, and a multivariate regression model has proven to have more precise prognostic capabilities than univariate models. The expected outcome of this research is to establish a correlation between social demographic and hydrometeorological features and the bacteria that cause leptospirosis; it will also contribute to understanding the underlying relationship between the pathogen and the ecosystem. The relationships established can be beneficial for the health department or urban planners in inspecting and preparing for future outcomes in event detection and system health monitoring.
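
A minimal sketch of the multivariate regression step (the feature names and the synthetic data are illustrative assumptions, not the study's dataset):

```python
# Fit a multivariate linear regression linking hydrometeorological and
# demographic features to an outbreak-severity indicator (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # e.g. [rainfall, temperature, population_density]
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=200)  # severity proxy

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("coefficients:", model.coef_)
print("R^2 on held-out data:", model.score(X_test, y_test))
```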

Keywords: geographical information system, hydrometeorological, leptospirosis, multivariate regression

Procedia PDF Downloads 90
532 The Grammar of the Content Plane as a Style Marker in Forensic Authorship Attribution

Authors: Dayane de Almeida

Abstract:

This work presents a study demonstrating the usability of categories of analysis from Discourse Semiotics - also known as Greimassian Semiotics - in authorship cases in forensic contexts. It is necessary to know whether the categories examined in semiotic analysis (the 'grammar' of the content plane) can distinguish authors. Thus, a study with 4 sets of texts from a corpus of 'not on demand' written samples (texts that differ in formality degree, purpose, addressees, themes, etc.) was performed. Each author contributed 20 texts, separated into 2 groups of 10 (Author1A, Author1B, and so on). The hypothesis was that texts from a single author are semiotically more similar to each other than texts from different authors. The assumptions and issues that led to this idea are as follows: -The features analyzed in authorship studies mostly relate to the expression plane: they are manifested on the 'surface' of texts. If language is both expression and content, content would also have to be considered for more accurate results. Style is present in both planes. -Semiotics postulates that the content plane is structured in a 'grammar' that underlies expression and presents different levels of abstraction. This 'grammar' would be a style marker. -Sociolinguistics demonstrates intra-speaker variation: an individual employs different linguistic uses in different situations. How, then, can we determine whether someone is the author of several texts, distinct in nature (as is the case in most forensic sets), when it is known that intra-speaker variation depends on so many factors? -The idea is that the more abstract the level in the content plane, the lower the intra-speaker variation, because there will be a greater chance of the author choosing the same thing. If two authors recurrently choose the same options, differently from one another, it means each one's options have discriminatory power. -Size is another issue for various attribution methods. Since most texts in real forensic settings are short, methods relying only on the expression plane tend to fail. The analysis of the content plane as proposed by Greimassian semiotics would be less size-dependent. -The semiotic analysis was performed using the software Corpus Tool, generating tags to allow the counting of data. Then, similarities and differences were measured quantitatively through the application of the Jaccard coefficient (a statistical measure that compares the similarities and differences between samples). The results showed that the hypothesis was confirmed and, hence, that the grammatical categories of the content plane may successfully be used in questioned authorship scenarios.
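
A minimal sketch of the Jaccard comparison mentioned above, applied to the sets of content-plane tags found in two texts (the tag labels are illustrative assumptions):

```python
# Jaccard coefficient between the sets of semiotic tags assigned to two texts.
def jaccard(tags_a: set, tags_b: set) -> float:
    """|A ∩ B| / |A ∪ B|: 1.0 means identical tag sets, 0.0 means disjoint."""
    union = tags_a | tags_b
    return len(tags_a & tags_b) / len(union) if union else 1.0

text1_tags = {"euphoria", "conjunction", "pragmatic_subject"}   # hypothetical tags
text2_tags = {"euphoria", "disjunction", "pragmatic_subject"}
print(jaccard(text1_tags, text2_tags))  # 0.5
```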

Keywords: authorship attribution, content plane, forensic linguistics, greimassian semiotics, intraspeaker variation, style

Procedia PDF Downloads 222
531 Virtue, Truth, Freedom, and the History of Philosophy

Authors: Ashley DelCorno

Abstract:

G. E. M. Anscombe's 1958 essay 'Modern Moral Philosophy' and the tradition of virtue ethics that followed have given rise to the restoration (or, more plainly, the resurrection) of Aristotle as something of an authority figure. Alasdair MacIntyre and Martha Nussbaum, for example, are proponents not just of Aristotle's relevancy but also of his apparent implicit authority. That said, it is not clear that the schema imagined by virtue ethicists accurately describes moral life or that it does not inadvertently work to impoverish genuine decision-making. If the label 'virtue' is categorically denied to some groups (while arbitrarily afforded to others), it can only turn on itself, thus rendering its own premise ridiculous. Likewise, as an inescapable feature of virtue ethics, Aristotelian binaries like 'virtue/vice' and 'voluntary/involuntary' offer up false dichotomies that may seriously compromise an agent's ability to conceptualize choices that are truly free and rooted in meaningful criteria. Here, this topic is analyzed through a feminist lens predicated on the known paradoxes of patriarchy. The work of feminist theorists Jacqui Alexander, Katharine Angel, Simone de Beauvoir, bell hooks, Audre Lorde, Imani Perry, and Amia Srinivasan serves as important guideposts, and the argument here is built from a key tenet of black feminist thought regarding scarcity and possibility. Above all, it is clear that though the philosophical tradition of virtue ethics presents itself as recovering the place of agency in ethics, its premises possess crippling limitations with respect to achieving this goal. These include, most notably, virtue ethics' binding analysis of history, as well as its axiomatic attachment to obligatory clauses, its problematic reading-in of Aristotle, and its arbitrary commitment to predetermined and competitively patriarchal ideas of what counts as a virtue.

Keywords: feminist history, the limits of utopic imagination, curatorial creation, truth, virtue, freedom

Procedia PDF Downloads 68
530 The Suitability of Agile Practices in Healthcare Industry with Regard to Healthcare Regulations

Authors: Mahmood Alsaadi, Alexei Lisitsa

Abstract:

Nowadays, medical devices rely heavily on software, whether as standalone software or as embedded software; therefore, organizations that develop medical device software can benefit from adopting agile practices. Using agile practices in healthcare software development would bring benefits such as producing a high-quality product at low cost and in a short period. However, medical device software development companies have faced challenges in adopting agile practices, due to the gaps that exist between agile practices and the requirements of healthcare regulations, such as documentation, traceability, and formality. This research paper conducts a study to investigate the adoption rate of agile practices in medical device software development, and it extracts and outlines the requirements of healthcare regulations such as the Food and Drug Administration (FDA), the Health Insurance Portability and Accountability Act (HIPAA), and the Medical Device Directive (MDD) that directly or indirectly affect the software development life cycle. Moreover, the paper evaluates the suitability of agile practices in healthcare industries by analyzing the most popular agile practices - eXtreme Programming (XP), Scrum, and Feature-Driven Development (FDD) - from a healthcare industry point of view and in comparison with the requirements of healthcare regulations. Finally, the authors propose an agile mixture model that consists of different practices from different agile methods. The results show that the adoption rate of agile practices in healthcare industries is still low and that agile practices should be enhanced with regard to the requirements of healthcare regulations in order to be used in healthcare software development organizations. Therefore, the proposed agile mixture model may assist in minimizing the gaps between healthcare regulations and agile practices and in increasing the adoption rate in the healthcare industry. As this research paper is part of an ongoing project, an evaluation of the agile mixture model will be conducted in the near future.

Keywords: adoption of agile, agile gaps, agile mixture model, agile practices, healthcare regulations

Procedia PDF Downloads 217
529 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection

Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine

Abstract:

Atrial fibrillation (AF) has been considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation. However, the published results do not show satisfactory classification accuracy. This work aimed to resolve this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Databases) were used for assessment. All time series were segmented into 1-min RR-interval windows, and four specific features were then calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and Learning Vector Quantization (LVQ) neural networks, were used to develop classification models. PCA, as a feature reduction method, was employed to find important features discriminating between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms do, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection method holds several interesting properties and can be implemented with just a few arithmetic operations, which makes it a suitable choice for telecare applications.
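
A minimal sketch of the PCA-plus-LVQ pipeline on per-window RR-interval features (the synthetic feature values and the hand-rolled LVQ1 update are illustrative assumptions, not the authors' implementation or data from the cited databases):

```python
# PCA for feature reduction followed by a minimal LVQ1 classifier,
# trained on synthetic 1-min RR-interval feature windows.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# 4 assumed features per window (e.g. mean RR, RMSSD, pNN50, sample entropy)
X_nsr = rng.normal([0.85, 0.03, 0.05, 0.9], 0.05, size=(300, 4))  # normal sinus rhythm
X_af = rng.normal([0.70, 0.12, 0.40, 1.6], 0.10, size=(300, 4))   # atrial fibrillation
X = np.vstack([X_nsr, X_af])
y = np.array([0] * 300 + [1] * 300)

Z = PCA(n_components=2).fit_transform(X)                 # feature reduction

# LVQ1: one prototype per class, pulled toward same-class samples, pushed away otherwise
prototypes = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
lr = 0.05
for _ in range(20):
    for z, label in zip(Z, y):
        w = int(np.argmin(np.linalg.norm(prototypes - z, axis=1)))  # winning prototype
        step = lr * (z - prototypes[w])
        prototypes[w] += step if w == label else -step

pred = np.argmin(np.linalg.norm(Z[:, None, :] - prototypes[None], axis=2), axis=1)
print("training accuracy:", (pred == y).mean())
```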

Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine

Procedia PDF Downloads 244
528 A Corpus-Based Contrastive Analysis of Directive Speech Act Verbs in English and Chinese Legal Texts

Authors: Wujian Han

Abstract:

In the process of human interaction and communication, speech act verbs are considered to be the most active component and the main means of information transmission, and they are also taken as an indication of the structure of linguistic behavior. The theoretical value and practical significance of such everyday built-in metalanguage have long been recognized. This paper, which is part of a larger study, aims to provide useful insights for a more precise and systematic application of speech act verb translation between English and Chinese, especially with regard to the degree to which generic integrity is maintained in the practice of translating legal documents. In this study, the corpus - Chinese legal texts and their English translations, English legal texts, ordinary Chinese texts, and ordinary English texts - serves as a testing ground for examining contrastively the usage of English and Chinese directive speech act verbs in the legal genre. The scope of this paper is relatively wide and essentially covers all directive speech act verbs used in ordinary English and Chinese, such as order, command, request, prohibit, threaten, advise, warn, and permit. The researcher, by combining the corpus methodology with a contrastive perspective, explored a range of characteristics of English and Chinese directive speech act verbs, including their semantic, syntactic, and pragmatic features, and then contrasted them in a structured way. It has been found that there are similarities between English and Chinese directive speech act verbs in the legal genre, such as similar semantic components between English speech act verbs and their translation equivalents in Chinese, and formal and accurate usage of English and Chinese directive speech act verbs in legal contexts. But notable differences have been identified in their usage in the original Chinese and English legal texts, such as valency patterns and frequency of occurrence. For example, the subjects of some directive speech act verbs are very frequently omitted in Chinese legal texts, but this is not the case in English legal texts. One practicable method to achieve adequacy and conciseness in speech act verb translation from Chinese into English in the legal genre is to repeat the subjects, or the message where there is a discrepancy, and vice versa. In addition, translation effects such as overuse and underuse of certain directive speech act verbs are also found in the translated English texts compared with the original English texts. Legal texts constitute particularly valuable material for the study of speech act verbs. Building up such a contrastive picture of Chinese and English speech act verbs in legal language would yield results of value and interest to legal translators and students of language for legal purposes, and it would have practical application to legal translation between English and Chinese.

Keywords: contrastive analysis, corpus-based, directive speech act verbs, legal texts, translation between English and Chinese

Procedia PDF Downloads 468
527 Multi-Impairment Compensation Based Deep Neural Networks for 16-QAM Coherent Optical Orthogonal Frequency Division Multiplexing System

Authors: Ying Han, Yuanxiang Chen, Yongtao Huang, Jia Fu, Kaile Li, Shangjing Lin, Jianguo Yu

Abstract:

In long-haul and high-speed optical transmission systems, the orthogonal frequency division multiplexing (OFDM) signal suffers from various linear and non-linear impairments. In recent years, researchers have proposed compensation schemes for specific impairments, and the effects are remarkable. However, using different impairment compensation algorithms increases transmission delay. With the widespread application of deep neural networks (DNN) in communication, multi-impairment compensation based on DNNs will be a promising scheme. In this paper, we propose and apply a DNN to compensate for multiple impairments of the 16-QAM coherent optical OFDM signal, thereby improving the performance of the transmission system. The trained DNN models are applied in the offline digital signal processing (DSP) module of the transmission system. The models can optimize the constellation mapping signals at the transmitter and compensate for multiple impairments of the OFDM decoded signal at the receiver. Furthermore, the models reduce the peak-to-average power ratio (PAPR) of the transmitted OFDM signal and the bit error rate (BER) of the received signal. We verify the effectiveness of the proposed scheme for the 16-QAM coherent optical OFDM signal and demonstrate and analyze the transmission performance in different transmission scenarios. The experimental results show that the PAPR and BER of the transmission system are significantly reduced after using the trained DNN. This shows that a DNN with a specific loss function and network structure can optimize the transmitted signal, learn the channel features, and compensate for multiple impairments in fiber transmission effectively.
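
A minimal sketch of a DNN equalizer of the kind described, here a small Keras MLP mapping distorted I/Q symbols back toward the transmitted 16-QAM constellation (the toy channel model, network size, and training settings are illustrative assumptions, not the authors' architecture):

```python
# Train a small fully-connected DNN to undo a synthetic nonlinear distortion
# of 16-QAM symbols (stand-in for multi-impairment compensation).
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
levels = np.array([-3, -1, 1, 3], dtype=np.float32)
tx = rng.choice(levels, size=(20000, 2))                       # transmitted (I, Q)
rx = tx + 0.05 * tx**3 + rng.normal(scale=0.1, size=tx.shape)  # toy distortion + noise

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2),                                  # recovered (I, Q)
])
model.compile(optimizer="adam", loss="mse")
model.fit(rx, tx, epochs=5, batch_size=256, verbose=0)

recovered = model.predict(rx[:5], verbose=0)
print(np.round(recovered, 2), tx[:5], sep="\n")
```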

Keywords: coherent optical OFDM, deep neural network, multi-impairment compensation, optical transmission

Procedia PDF Downloads 122
526 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis

Authors: Wenbo Du, Xiaomei Ma

Abstract:

With the empowerment of cognitive diagnostic assessment (CDA), various domains of language testing and assessment have been investigated to dig out more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very little concerned with learners' group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it might be more practical in classroom settings. Additionally, the group-level diagnostic information obtained via current CDA always results in a 'flat pattern', that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome does not bring much more benefit than the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners' performance at different proficiency levels (beginner, intermediate, and advanced), thus enhancing the interpretation of the CDA results extracted from a group of EFL learners' reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA, and that quantile regression analysis paints a more insightful picture of learners with different reading proficiencies. The findings are helpful and practical for instructors in refining EFL reading curricula and instructional plans tailored to the group classification results and the quantile regression analysis. Meanwhile, these statistical methods could also compensate for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
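
A minimal sketch combining EM-based clustering with quantile regression (the synthetic scores, the chosen quantiles, and the variable names are illustrative assumptions, not the PELDiaG data):

```python
# EM clustering (Gaussian mixture) to group learners, then quantile regression
# of total reading score on a sub-skill score at three quantiles.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
subskill = rng.uniform(0, 10, 300)
total = 5 + 2.5 * subskill + rng.normal(scale=2 + 0.3 * subskill)  # heteroscedastic
df = pd.DataFrame({"subskill": subskill, "total": total})

df["group"] = GaussianMixture(n_components=3, random_state=0).fit_predict(df[["total"]])

for q in (0.25, 0.5, 0.75):   # rough beginner / intermediate / advanced bands
    fit = smf.quantreg("total ~ subskill", df).fit(q=q)
    print(f"q={q}: slope={fit.params['subskill']:.2f}")
```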

Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression

Procedia PDF Downloads 132
525 The Meaning System of Tense: A Systemic Functional Approach

Authors: Cunyu Zhang

Abstract:

Through a literature review of studies related to tense, it is found that disagreements exist on the definition and existence of tense in Chinese. Influenced by research on the English language, which regards tense as a grammatical category based on the verbal inflections of English, some Chinese researchers claim that there is no tense in Chinese, as no verbal inflections are involved. Meanwhile, other Chinese researchers hold that Chinese still has tense, although its verbs are non-inflectional, based on the fact that Chinese lexical expressions can convey temporal meaning. We assume that the reasons for the above disagreements about Chinese tense lie in the fact that all the previous studies prefer to view language 'from below', which means that expressions of tense are the core of these studies. However, there are about 6,000 languages with distinct expressions all over the world. Hence, if language studies concentrate only on expressions, it becomes more difficult to understand the nature of language. By contrast, the functions of languages are similar; otherwise, human beings could not communicate with each other. Therefore, we believe it is necessary to carry out a theoretical study of Chinese tense within the framework of SFL, which holds that language is a system in which meaning is the core while form is just the realization of meaning. In addition, SFL is a general linguistics that provides a universal framework for languages all over the world. Therefore, based on Systemic Functional Linguistics, the paper first redefines tense as a deictic semantic category for describing the speaker's temporal location of processes and the relevant temporal relations. With reference to this definition, the study explores the meaning system of tense. It is proposed that tense expresses four kinds of meaning, namely interpersonal, experiential, logical, and textual meanings. From the interpersonal angle, tense helps to exchange temporal information between the speaker and the listener, the temporal information being the anchoring of a given process in the past, present, or future by the speaker. From the experiential angle, tense plays a role in the temporal locating of material, mental, relational, existential, behavioral, and verbal processes by the speaker. From the logical angle, tense denotes the temporal relations at the two levels of clause and clause complex, and such relations fall into simultaneity, anteriority, and posteriority. From the textual angle, tense refers to the temporal relations at the level of text, and the temporal relations in question concern linear serial relations and synchronous serial relations.

Keywords: Chinese, meaning system, Systemic Functional Linguistics, tense

Procedia PDF Downloads 398
524 Effects of Bedside Rehabilitation of Stroke Patients in Activities and Daily Living Function

Authors: Chiung-Hua Chan, Fang-Yuan Chang, Li-Chi Huang

Abstract:

Stroke patients who receive regular rehabilitation therapy show measurable improvement in muscle strength, balance, control of upper- and lower-limb activity, walking speed, and endurance. This study aimed to investigate the relationship between an increase in bedside rehabilitation time and activities of daily living (ADL) function in stroke patients. The study used a quasi-experimental design with randomized sampling. Twelve stroke patients transferred to the rehabilitation ward of a medical center between 1 January and 31 March 2017 were recruited. The researcher first explained the purpose and method of the study to the patients or their family members, and all participants completed an informed consent form before participation. Patients were then randomly assigned to a 'bedside rehabilitation program' (BRP) group or a control (C) group. ADL function was assessed by the researchers through direct observation on Day 1. The BRP group received regular rehabilitation plus additional bedside rehabilitation exercise programs scheduled and delivered by ward nurses, covering physical, functional, and linguistic exercises of defined frequency and duration, while the C group received only the routine rehabilitation schedule. The Functional Independence Measure was used to measure outcomes on the 1st, 14th, and 28th day after admission to the rehabilitation ward. Data were analyzed using SPSS 22.0. After implementation of the standardized bedside rehabilitation program, the results were: (1) increased bedside rehabilitation was associated with a significant improvement (p<.05) in the ADL function of stroke patients, and (2) extending the time of bedside rehabilitation produced a significant improvement (p<.05) in ADL function compared with the control group. This study demonstrated that the bedside rehabilitation program enhanced ADL function in stroke patients. Nurses and rehabilitation ward managers need to understand that extending the time and frequency of rehabilitation provides a chance to enhance the ADL function of stroke patients.

Keywords: stroke, bedside rehabilitation, functional activity, ADL

Procedia PDF Downloads 111
523 Recurrent Neural Networks for Complex Survival Models

Authors: Pius Marthin, Nihal Ata Tutkun

Abstract:

Survival analysis has become one of the paramount procedures in the modeling of time-to-event data. When we encounter complex survival problems, the traditional approach remains limited in accounting for the complex correlational structure between the covariates and the outcome due to the strong assumptions that limit the inference and prediction ability of the resulting models. Several studies exist on the deep learning approach to survival modeling; however, their application to complex survival problems still needs to be improved. In addition, the existing models do not fully address the complexity of the data structure and are subject to noise and redundant information. In this study, we design a deep learning technique (CmpXRnnSurv_AE) that removes the limitations imposed by traditional approaches and addresses the above issues to jointly predict the risk-specific probabilities and the survival function for recurrent events with competing risks. We introduce a component termed Risks Information Weights (RIW) as an attention mechanism to compute the weighted cumulative incidence function (WCIF), and an external auto-encoder (ExternalAE) as a feature selector to extract complex characteristics among the set of covariates responsible for the cause-specific events. We train our model using synthetic and real data sets and employ the appropriate metrics for complex survival models for evaluation. As benchmarks, we selected both traditional and machine learning models, and our model demonstrates better performance across all datasets.

Keywords: cumulative incidence function (CIF), risk information weight (RIW), autoencoders (AE), survival analysis, recurrent events with competing risks, recurrent neural networks (RNN), long short-term memory (LSTM), self-attention, multilayers perceptrons (MLPs)

Procedia PDF Downloads 68
522 Using Genre Analysis to Teach Contract Negotiation Discourse Practices

Authors: Anthony Townley

Abstract:

Contract negotiation is fundamental to commercial law practice. For this study, genre and discourse analytical methodology was used to examine the legal negotiation of a Merger & Acquisition (M&A) deal undertaken by legal and business professionals in English across different jurisdictions in Europe. While some of the most delicate negotiations involved in this process were carried on face-to-face or over the telephone, these were generally progressed more systematically – and on the record – in the form of emails, email attachments, and as comments and amendments recorded in successive ‘marked-up’ versions of the contracts under negotiation. This large corpus of textual data was originally obtained by the author, in 2012, for the purpose of doctoral research. For this study, the analysis is particularly concerned with the use of emails and covering letters to exchange legal advice about the negotiations. These two genres help to stabilize and progress the negotiation process and account for negotiation activities. Swalesian analysis of functional Moves and Steps was able to identify structural similarities and differences between these text types and to identify certain salient discursive features within them. The analytical findings also indicate how particular linguistic strategies are more appropriately and more effectively associated with one legal genre rather than another. The concept of intertextuality is an important dimension of contract negotiation discourse and this study also examined how the discursive relationships between the different texts influence the way that texts are constructed. In terms of materials development, the research findings can contribute to more authentic English for Legal & Business Purposes pedagogies for students and novice lawyers and business professionals. The findings can first be used to design discursive maps that provide learners with a coherent account of the intertextual nature of the contract negotiation process. These discursive maps can then function as a framework in which to present detailed findings about the textual and structural features of the text types by applying the Swalesian genre analysis. Based on this acquired knowledge of the textual nature of contract negotiation, the authentic discourse materials can then be used to provide learners with practical opportunities to role-play negotiation activities and experience professional ways of thinking and using language in preparation for the written discourse challenges they will face in this important area of legal and business practice.

Keywords: English for legal and business purposes, discourse analysis, genre analysis, intertextuality, pedagogical materials

Procedia PDF Downloads 124
521 The Feasibility Evaluation of the Compressed Air Energy Storage System in the Porous Media Reservoir

Authors: Ming-Hong Chen

Abstract:

In this study, the mechanical and financial feasibility of a compressed air energy storage (CAES) system in a porous media reservoir in Taiwan is evaluated. By 2035, Taiwan aims to install 16.7 GW of wind power and 40 GW of photovoltaic (PV) capacity. However, renewable energy sources often generate more electricity than needed, particularly during winter. Consequently, Taiwan requires long-term, large-scale energy storage systems to ensure the security and stability of its power grid. Currently, the primary large-scale energy storage options are pumped hydro storage (PHS) and compressed air energy storage (CAES). Taiwan has not ventured into CAES-related technologies due to geological and cost constraints. However, with the imperative of achieving net-zero carbon emissions by 2050, there is a substantial need for the development of a considerable amount of renewable energy. PHS has matured, boasting an overall installed capacity of 4.68 GW. CAES, which offers a similar scale and power generation duration to PHS, is now under consideration. Taiwan's geological formations are porous media rather than salt caverns, which introduces flow field resistance affecting gas injection and extraction. This study employs a program analysis model to establish system performance analysis capabilities for CAES. A finite volume model is then used to assess the impact of the porous media, and the findings are fed back into the system performance analysis for correction. Subsequently, the financial implications are calculated and compared with the existing literature. For Taiwan, the strategic development of CAES technology is crucial, not only for meeting energy needs but also for decentralizing energy allocation, a feature of great significance in regions lacking alternative natural resources.

Keywords: compressed-air energy storage, efficiency, porous media, financial feasibility

Procedia PDF Downloads 51
520 The Grievances Theory versus Transnationalism and the Cameroon Anglophone Question, 1961-2017

Authors: Nkatow Mafany Christian

Abstract:

No other period in human history has offered such great opportunities for grievances not only to last long but also to be manifested across international boundaries; this state of affairs is largely a feature of the advent of social media. The Anglophone Question in Cameroon has been a problem of poor constitutional arrangements that can be traced to 1961, when former French Cameroon reunified with former British Southern Cameroons following a plebiscite in which the latter voted overwhelmingly to reunify with the former. Though Southern/Anglophone Cameroonians complained of perceived marginalization and an attempt by the majority French-speaking section to assimilate them, the manifestation was subtle and took place only through protests, petitions, strike movements, and demonstrations. However, with the advent of social media, a new crop of leaders emerged in the diaspora, including the US, Canada, Europe, Asia, and the Middle East, to champion the manifestations, leading to the violence and conflicts that have bedeviled the region since 2017. The feeling of political subjugation, economic exploitation, social suppression, and cultural assimilation among Anglophone Cameroonians united them under diaspora leaders against the government of Cameroon, calling for the creation of a separate state for Anglophones. This paper draws on this background to analyze the current Anglophone Crisis in Cameroon in the light of the Grievance Theory and Transnationalism. The paper appeals to field experience, interviews, official sources, documentation, and the internet to support its central thesis. From the richness of its sources, the paper submits that social media is a potent source of conflict and undermines the principle of sovereignty and territorial integrity through its capacity to promote the transnational manifestation of grievances.

Keywords: grievance, transnationalism, anglophone crisis, Cameroon, crisis and social media

Procedia PDF Downloads 47
519 Water Quality Calculation and Management System

Authors: H. M. B. N Jayasinghe

Abstract:

Water is found almost everywhere on Earth, and water resources contain a lot of pollution; some diseases can be spread to living beings through water. To be drinkable, water must therefore undergo a number of treatments, so purification technology for wastewater is essential, and wastewater treatment plants play a major role in these issues. The procedures carried out after the water treatment process have always been based on manual calculations and recordings, and water purification plants involve many manual processes. This makes the process time-consuming, so the final evaluation and the chemical and biological treatment steps get delayed. To prevent these drawbacks, computerized, programmable calculation and analytical techniques are to be introduced to the laboratory staff. An automated system is a solution that guarantees rational selection. A decision support system is a way to model data and make quality decisions based upon it; it is widely used around the world for various kinds of process automation. Decision support systems that just collect data and organize it effectively are usually called passive models: they do not suggest a specific decision but only reveal information. This web-based system includes a global positioning data facility with map locations. Its most valuable feature is an SMS and e-mail alert service to inform the appropriate person of a critical issue. The technologies behind the system are HTML, MySQL, PHP, and other web development technologies. Current work on computerized water chemistry analysis is not very advanced; one example is the swimming pool water quality calculator. The validity of the system has been verified by test runs and comparison with data from an existing plant. The automated system will make work easier both in productivity and in quality.
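
A minimal sketch of the threshold-check-and-alert idea (in Python for illustration; the system described above is built with PHP/MySQL, and the limits, addresses, and SMTP host here are assumptions):

```python
# Check a measured water-quality parameter against a limit and e-mail an alert.
import smtplib
from email.message import EmailMessage

LIMITS = {"pH": (6.5, 8.5), "turbidity_NTU": (0.0, 5.0)}   # illustrative limits

def check_and_alert(parameter: str, value: float, recipient: str) -> None:
    low, high = LIMITS[parameter]
    if low <= value <= high:
        return                                              # within limits, no alert
    msg = EmailMessage()
    msg["Subject"] = f"ALERT: {parameter} = {value} outside [{low}, {high}]"
    msg["From"] = "plant-monitor@example.org"
    msg["To"] = recipient
    msg.set_content("Please inspect the treatment plant readings.")
    with smtplib.SMTP("smtp.example.org") as server:        # hypothetical SMTP host
        server.send_message(msg)

check_and_alert("pH", 9.1, "lab-staff@example.org")
```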

Keywords: automated system, wastewater, purification technology, map location

Procedia PDF Downloads 230
518 Analysis of Solid Waste Management Practices and the Implications for Human Health and the Environment: A Case Study of Kayamandi Informal Settlement

Authors: Peter Iyobosa Asemota

Abstract:

This study on solid waste management practices addresses aspects of the environmental and health impacts resulting from poor management of solid waste. The study was occasioned by the observed rate and volume of illegal and indiscriminate dumping of solid waste materials, especially in informal settlements. The main focus of this study was to establish the impact of waste management practices on human health and the environment. The study therefore presents a critical analysis of the state of solid waste management in the study area and the implications for human health and the environment. The study was carried out in the Kayamandi informal settlement within Stellenbosch municipality. The sustainable management of solid waste is very important in order to minimize the environmental and public health risks associated with improper solid waste management. There is no denying the fact that the problems of waste management will become critical as time goes on because of improper and inefficient waste management practices; towns and cities exhibit the burdens of waste management, which is a characteristic feature of most African cities. The study critically assesses the implementation of waste management practices by the residents of the informal settlement, identifies the factors affecting the operation of the solid waste management system by the municipality, and identifies factors militating against the implementation of waste management policies and legislation. Furthermore, a waste assessment study was carried out to assess the generation and composition of the waste stream and to determine the attitudes and behavior of the residents with regard to waste management practices. Findings from the study revealed that Kayamandi is not different from other informal settlements with regard to waste management. People are of the opinion that solid waste management is the sole responsibility of municipal authorities, and as such, the government should bear the cost of solid waste management.

Keywords: environment, waste, waste composition, waste stream, policy, waste categories, sanitary landfill, waste collection, integrated solid waste management

Procedia PDF Downloads 668
517 Spatio-Temporal Analysis of Drought in Cholistan Region, Pakistan: An Application of Standardized Precipitation Index

Authors: Qurratulain Safdar

Abstract:

Drought is a temporary aberration, in contrast to aridity, which is a permanent feature of climate. It occurs in virtually all climatic regions, from high- to low-rainfall areas. Because of Pakistan's wide latitudinal extent, rainfall varies both seasonally and annually, and the south-central part of the country is arid and hyper-arid. This study focuses on the spatio-temporal analysis of droughts in the arid and hyper-arid Cholistan region using the standardized precipitation index (SPI) approach, assessing the extent of drought recurrence and the region's temporal vulnerability to drought. The paper first describes the geographic setup of the study area along with a brief description of the drought conditions that prevail in Pakistan, and it provides a scientific foundation by reviewing the literature and a theoretical framework in line with the selected parameters and indicators. Data were collected from both primary and secondary sources; rainfall and temperature data were obtained from the Pakistan Meteorological Department. By applying a geostatistical approach, the standardized precipitation index was calculated for the study region, and the spatio-temporal variability of drought and its severity were explored, yielding an in-depth spatial analysis of drought conditions in the Cholistan area. In parallel, drought-prone areas with seasonal variation were identified using Kriging spatial interpolation techniques in a GIS environment. The study revealed temporal variation in drought occurrences both in the time series and in the SPI values. Finally, a strategic plan is suggested to minimize the impacts of drought.
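For illustration, a minimal Python sketch of the standardized precipitation index computation described above (the synthetic rainfall series, the 3-month timescale, and the zero-rainfall handling are assumptions, not the authors' station data):

```python
# Minimal SPI sketch: accumulate precipitation over a timescale, fit a gamma
# distribution, and transform cumulative probabilities to standard-normal scores.
import numpy as np
from scipy import stats

def spi(precip: np.ndarray, scale: int = 3) -> np.ndarray:
    """SPI for a monthly precipitation series aggregated over `scale` months."""
    kernel = np.ones(scale)
    accum = np.convolve(precip, kernel, mode="valid")      # rolling accumulation

    nonzero = accum[accum > 0]
    q = 1.0 - len(nonzero) / len(accum)                     # probability of a zero total
    shape, loc, scale_param = stats.gamma.fit(nonzero, floc=0)

    cdf = q + (1.0 - q) * stats.gamma.cdf(accum, shape, loc=loc, scale=scale_param)
    cdf = np.clip(cdf, 1e-6, 1 - 1e-6)                      # keep the normal quantile finite
    return stats.norm.ppf(cdf)                              # equiprobability transform to N(0, 1)

rain = np.random.default_rng(0).gamma(shape=2.0, scale=10.0, size=240)  # synthetic monthly data
print(spi(rain, scale=3)[:12].round(2))
```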

Keywords: Cholistan desert, climate anomalies, meteorological droughts, standardized precipitation index

Procedia PDF Downloads 180
516 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification

Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine

Abstract:

Agriculture is essential to the continued existence of human life, as people depend on it directly for the production of food. The exponential rise in population calls for a rapid increase in food production, with technology applied to reduce laborious work and maximize output. Technology can aid and improve agriculture in several ways, in pre-planning and post-harvest, by using computer vision and image processing to determine soil nutrient composition and to apply farm inputs such as fertilizers, herbicides, and water in the right amount, at the right time, and in the right place, as well as for weed detection and early detection of pests and diseases. This is precision agriculture, which is thought to be the solution required to achieve these goals. There has been significant improvement in image processing and data processing, which had been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of vegetation images need to be extracted, classified, segmented, and finally fed into the model. Different techniques have been applied to these processes, from neural networks, support vector machines, and fuzzy logic approaches to, most recently, the most effective approach for image classification: the deep learning approach of convolutional neural networks. A deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. Experimental results on the developed model yielded an average accuracy of 99.58%.
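For illustration, a minimal Keras (Python) sketch of a convolutional classifier of the kind described (the input size, layer widths, and number of nutrient classes are assumptions; the authors' architecture is not specified in the abstract):

```python
# Minimal sketch of a small CNN classifier for nutrient categories; all
# hyperparameters below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4            # e.g. nutrient-level categories; illustrative only
INPUT_SHAPE = (128, 128, 3)

model = models.Sequential([
    layers.Input(shape=INPUT_SHAPE),
    layers.Conv2D(32, 3, activation="relu"),    # learn local color/texture features
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would use a labelled image set, e.g.:
# model.fit(train_images, train_labels, validation_split=0.2, epochs=20)
```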

Keywords: convolution, feature extraction, image analysis, validation, precision agriculture

Procedia PDF Downloads 294
515 Numerical Analysis of Laminar Reflux Condensation from Gas-Vapour Mixtures in Vertical Parallel Plate Channels

Authors: Foad Hassaninejadafarahani, Scott Ormiston

Abstract:

Reflux condensation occurs in vertical channels and tubes when there is an upward core flow of vapor (or a gas-vapor mixture) and a downward flow of the liquid film. Understanding this condensation configuration is crucial in the design of reflux condensers and distillation columns, and in loss-of-coolant safety analyses of nuclear power plant steam generators. The unique feature of this flow is the upward flow of the vapor-gas mixture (or pure vapor), which retards the liquid flow via shear at the liquid-mixture interface. The present model solves the full, elliptic governing equations in both the film and the gas-vapor core flow. The computational mesh is non-orthogonal and adapts dynamically to the phase interface, thus producing a sharp and accurate interface. Shear forces and heat and mass transfer at the interface are accounted for fundamentally. This model is a significant step ahead of current capabilities because it removes the limitations of previous reflux condensation models, which inherently cannot account for the detailed local balances of shear, mass, and heat transfer at the interface. Discretisation is based on a finite volume method with a co-located variable storage scheme, and an in-house computer code was developed to implement the numerical solution scheme. Detailed results are presented for laminar reflux condensation from steam-air mixtures flowing in vertical parallel-plate channels, including velocity and pressure profiles, as well as axial variations of film thickness, Nusselt number, and interface gas mass fraction.
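The authors' solver is far more elaborate (elliptic, two-phase, dynamically adapting mesh); purely as an illustration of the finite-volume control-volume balance idea, here is a minimal Python sketch for steady 1-D diffusion with Dirichlet boundaries:

```python
# Illustrative only: a 1-D finite-volume discretisation of steady diffusion,
# not the authors' two-phase condensation model. Values are arbitrary.
import numpy as np

n = 50                       # number of control volumes
L = 1.0                      # domain length
dx = L / n
k = 1.0                      # diffusivity
T_left, T_right = 1.0, 0.0   # Dirichlet boundary values

A = np.zeros((n, n))
b = np.zeros(n)
for i in range(n):
    aW = aE = k / dx                          # face conductances between cell centres
    if i == 0:                                # west boundary: half-cell distance to the face
        aW = 2 * k / dx
        b[i] += aW * T_left
        A[i, i + 1] = -aE
    elif i == n - 1:                          # east boundary
        aE = 2 * k / dx
        b[i] += aE * T_right
        A[i, i - 1] = -aW
    else:
        A[i, i - 1] = -aW
        A[i, i + 1] = -aE
    A[i, i] = aW + aE                         # central coefficient = sum of neighbour coefficients

T = np.linalg.solve(A, b)
print(T[:5].round(3))                         # decreases linearly from T_left toward T_right
```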

Keywords: reflux condensation, two-phase CFD, Nusselt number

Procedia PDF Downloads 345
514 Changes in the Median Sacral Crest Associated with Sacrocaudal Fusion in the Greyhound

Authors: S. M. Ismail, H-H Yen, C. M. Murray, H. M. S. Davies

Abstract:

A recent study reported a 33% incidence of complete sacrocaudal fusion in greyhounds, compared to a 3% incidence in other dogs. In the dog, the median sacral crest is formed by the fusion of the sacral spinous processes, and separation of the 1st spinous process from the median crest of the sacrum has been reported as a diagnostic indicator of type one lumbosacral transitional vertebra (LTV). LTV is a congenital spinal anomaly that involves either sacralization of the caudal lumbar part or lumbarization of the most cranial sacral segment of the spine. In this study, the absence or reduction of fusion (presence of separation) between the 1st and 2nd spinous processes of the median sacral crest was identified in association with sacrocaudal fusion in the greyhound, without any feature of LTV. In order to provide quantitative data on this absence or reduction of fusion in association with sacrocaudal fusion, 204 dog sacrums free of any pathological changes (192 greyhounds, 9 beagles, and 3 Labradors) were grouped based on the occurrence and type of fusion and on the presence, absence, or reduction of the median sacral crest between the 1st and 2nd sacral spinous processes. Sacrums were classified as follows: F, complete fusion (crest present); N, absence (fusion absent); and R, short crest (fusion reduced but not absent). Regarding the incidence of sacrocaudal fusion in the 204 sacrums, 57% were standard (3 vertebrae) and 43% were fused (4 vertebrae). Type of sacrum had a significant (p < .05) association with the absence and reduction of fusion between the 1st and 2nd sacral spinous processes of the median sacral crest. In the 108 greyhounds with standard sacrums (3 vertebrae), the percentages of F, N, and R were 45%, 23%, and 23%, respectively, while in the 84 fused (4 vertebrae) sacrums the percentages were 3%, 87%, and 10%, respectively; these percentages differed significantly between standard and fused sacrums (p < .05). This indicates that absence of spinous process fusion in the median sacral crest was found in a large percentage of the greyhounds in this study and was particularly prevalent in those with sacrocaudal fusion; therefore, in this breed at least, absence of sacral spinous process fusion may be unlikely to be associated with LTV.
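For illustration, a Python sketch of the kind of association test reported above, with cell counts reconstructed approximately from the quoted percentages (illustrative only, not the authors' raw data):

```python
# Illustrative chi-square test of association between sacrum type and crest
# fusion category; counts are approximate reconstructions from the percentages.
import numpy as np
from scipy.stats import chi2_contingency

#                 F (crest fused)  N (absent)  R (reduced)
table = np.array([
    [49, 25, 25],    # ~45%, 23%, 23% of 108 standard (3-vertebra) sacrums
    [ 3, 73,  8],    # ~3%, 87%, 10% of 84 fused (4-vertebra) sacrums
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")   # p << .05 -> significant association
```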

Keywords: greyhound, median sacral crest, sacrocaudal fusion, sacral spinous process

Procedia PDF Downloads 423
513 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for Old Castle in Germany

Authors: Bara' Al-Mistarehi

Abstract:

Documentation and assessment of the conservation state of an archaeological structure is a significant procedure in any management plan. However, it has always been a challenge to do this with a low-cost and safe methodology, and it is also a time-demanding procedure. A low-cost, efficient methodology for documenting the state of a structure is therefore needed. Within the scope of this research, this paper employs digital photogrammetry and laser scanning on one of the most significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features; however, the castle suffers from serious deterioration threats because of environmental conditions and the absence of continuous monitoring, maintenance, and repair plans. Digital photogrammetry is a generally accepted technique for the collection of 3D representations of the environment, and this image-based technique has been used extensively to produce high-quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run-time of reflected light pulses; these systems feature high data acquisition rates, good accuracy, and high spatial data density. Despite the potential of each single approach, in this research the maximum benefit is expected from a combination of data from both digital cameras and terrestrial laser scanners. The paper investigates the usage, application, and advantages of the technique in terms of building a highly realistic 3D textured model of parts of the old castle. The model will be used as a tool for diagnosing the conservation state of the castle and as a means of monitoring future changes.
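For illustration, a minimal Python sketch of combining the two data sources by registering a photogrammetric point cloud to a terrestrial laser scan with ICP in Open3D (the file names and correspondence threshold are hypothetical; the paper's own pipeline is not detailed in the abstract):

```python
# Illustrative fusion step: align a photogrammetric point cloud to a laser scan.
# File names and the distance threshold are hypothetical placeholders.
import numpy as np
import open3d as o3d

laser_scan = o3d.io.read_point_cloud("altes_schloss_laser.ply")            # reference geometry
photo_cloud = o3d.io.read_point_cloud("altes_schloss_photogrammetry.ply")  # textured, less accurate

# Refine an initial guess (identity here) by minimising point-to-point distances.
result = o3d.pipelines.registration.registration_icp(
    photo_cloud, laser_scan,
    max_correspondence_distance=0.05,   # metres; assumption
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

photo_cloud.transform(result.transformation)   # bring the photo model into the scan frame
print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
o3d.io.write_point_cloud("altes_schloss_registered.ply", photo_cloud)
```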

Keywords: digital photogrammetry, terrestrial laser scanners, 3D textured model, archaeological structure

Procedia PDF Downloads 158
512 Thick Data Techniques for Identifying Abnormality in Video Frames for Wireless Capsule Endoscopy

Authors: Jinan Fiaidhi, Sabah Mohammed, Petros Zezos

Abstract:

Capsule endoscopy (CE) is an established noninvasive diagnostic modality for investigating small bowel disease. CE has a pivotal role in assessing patients with suspected bleeding or in identifying evidence of active Crohn's disease in the small bowel. However, CE produces lengthy videos of at least eighty thousand frames, captured at a rate of 2 frames per second, and gastroenterologists cannot dedicate 8 to 15 hours to reading the frames of a CE video to arrive at a diagnosis. This is why analyzing CE videos with modern artificial intelligence techniques becomes a necessity. However, machine learning, including deep learning, has failed to report robust results because of the lack of large samples for training its neural nets. In this paper, we describe a thick data approach that learns from a few anchor images. We use sound datasets like KVASIR and CrohnIPI to filter candidate frames that include interesting anomalies in any CE video. Candidate frames are identified through feature extraction that provides representative measures of the anomaly, such as its size and its color contrast against the image background, and these features are then fed to a decision tree that can classify the candidate frames as showing a condition such as Crohn's disease. Our thick data approach achieved an accuracy for detecting Crohn's disease, based on the presence of ulcer areas in the candidate frames, of 89.9% on KVASIR and 83.3% on CrohnIPI. We are continuing our research to fine-tune the approach by adding more thick data methods to enhance diagnostic accuracy.
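For illustration, a simplified Python sketch of the described pipeline, with hand-crafted features per frame (anomaly size and color contrast against the background) fed to a decision tree; the feature definitions and the synthetic frames and labels are illustrative assumptions:

```python
# Simplified stand-in for the thick data pipeline: crude per-frame features
# plus a decision tree. Thresholds, frames, and labels are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def frame_features(frame: np.ndarray) -> list[float]:
    """frame: H x W x 3 RGB image with values in [0, 255]."""
    redness = frame[..., 0].astype(float) - frame[..., 1].astype(float)
    mask = redness > 40                       # crude candidate-anomaly mask
    anomaly_size = mask.mean()                # fraction of the frame covered
    if mask.any():
        contrast = frame[mask].mean() - frame[~mask].mean()
    else:
        contrast = 0.0
    return [anomaly_size, contrast]

rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(40, 64, 64, 3), dtype=np.uint8)   # stand-in CE frames
labels = rng.integers(0, 2, size=40)                                   # 1 = suspected ulcer (synthetic)

X = np.array([frame_features(f) for f in frames])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)
print(clf.predict(X[:5]))
```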

Keywords: thick data analytics, capsule endoscopy, Crohn’s disease, siamese neural network, decision tree

Procedia PDF Downloads 125