Search results for: multi disciplinary analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30380

29060 Performance of Coded Multi-Line Copper Wire for G.fast Communications in the Presence of Impulsive Noise

Authors: Israa Al-Neami, Ali J. Al-Askery, Martin Johnston, Charalampos Tsimenidis

Abstract:

In this paper, we focus on the design of a multi-line copper wire (MLCW) communication system. First, we construct the proposed MLCW channel and verify its characteristics using the Kolmogorov-Smirnov test; in addition, we apply Middleton Class A impulsive noise (IN) to the copper channel for further investigation. Second, a MIMO G.fast system is adopted utilizing the proposed MLCW channel model and is compared to a single-line G.fast system. Third, the performance of the coded system is obtained utilizing a concatenated interleaved Reed-Solomon (RS) code with four-dimensional trellis-coded modulation (4D TCM), and compared to the single-line G.fast system. Simulations are run for the high quadrature amplitude modulation (QAM) constellations commonly used in G.fast communications. The results demonstrate that the bit error rate (BER) performance of the coded MLCW system improves on that of single-line G.fast systems.
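The Middleton Class A noise applied to the channel admits a simple Monte Carlo sketch: each sample is drawn from a Poisson-weighted Gaussian mixture. The parameter values below (impulsive index A, Gaussian-to-impulsive power ratio Γ) are illustrative, not those used in the paper.

```python
import numpy as np

def middleton_class_a(n, A=0.1, gamma=0.01, sigma2=1.0, seed=None):
    """Draw n samples of Middleton Class A impulsive noise.

    A      : impulsive index (mean number of active impulsive sources)
    gamma  : ratio of Gaussian background power to impulsive power
    sigma2 : total noise power
    """
    rng = np.random.default_rng(seed)
    # Number of active impulsive sources per sample follows Poisson(A).
    m = rng.poisson(A, size=n)
    # Conditional variance of the Gaussian component given m active sources.
    var_m = sigma2 * (m / A + gamma) / (1.0 + gamma)
    return rng.normal(0.0, np.sqrt(var_m))

noise = middleton_class_a(10_000, A=0.1, gamma=0.01, sigma2=1.0, seed=0)
```

With a small A, most samples come from the low-variance background component, and occasional high-variance draws produce the heavy-tailed impulses characteristic of this noise class.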

Keywords: G.fast, Middleton Class A impulsive noise, mitigation techniques, Copper channel model

Procedia PDF Downloads 130
29059 Optimization of SWL Algorithms Using Alternative Adder Module in FPGA

Authors: Tayab D. Memon, Shahji Farooque, Marvi Deshi, Imtiaz Hussain Kalwar, B. S. Chowdhry

Abstract:

Recently, a single-bit ternary FIR-like filter (SBTFF) synthesized in FPGA hardware was reported and compared with a multi-bit FIR filter of similar spectral characteristics; the results show that the SBTFF outperforms the multi-bit filter overall. In this paper, an optimized adder module for ternary quantized sigma-delta modulated signals is presented. The adder was simulated using ModelSim for functional verification, and the area-performance figures of the proposed adder were obtained through synthesis in Xilinx and compared to conventional adder trees. The synthesis results show that the proposed adder tree achieves higher clock rates and lower chip area at a higher number of inputs to the adder block, whereas the conventional adder tree achieves better performance and lower chip area at a lower number of inputs to the same block. These results enhance the usefulness of existing short word length DSP algorithms for fast and efficient mobile communication.
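The adder-tree structure at the heart of such a filter accumulator can be sketched in software: a balanced tree sums n operands in about log2(n) adder stages rather than the n-1 stages of a ripple chain. This is a behavioral sketch only; the hardware trade-offs reported above concern the FPGA mapping, not this arithmetic.

```python
def adder_tree(values):
    """Sum a list with a balanced binary adder tree.

    Depth is ~log2(n) stages, versus ~n stages for a linear adder chain,
    which is what lets the tree sustain higher clock rates for many inputs.
    """
    level = list(values)
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append(level[i] + level[i + 1])  # one adder per pair
        if len(level) % 2:                       # odd element passes through
            nxt.append(level[-1])
        level = nxt
    return level[0]

# Ternary sigma-delta samples take values in {-1, 0, +1}.
samples = [1, 0, -1, 1, 1, -1, 0, 1]
total = adder_tree(samples)
```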

Keywords: short word length (SWL), DSP algorithms, FPGA, SBTFF, VHDL

Procedia PDF Downloads 342
29058 Thermal Property of Multi-Walled-Carbon-Nanotube Reinforced Epoxy Composites

Authors: Min Ye Koo, Gyo Woo Lee

Abstract:

In this study, epoxy composite specimens reinforced with multi-walled carbon nanotube filler were fabricated using a shear mixer and an ultra-sonication processor, and their mechanical and thermal properties were measured and evaluated. From the electron microscope images and the tensile strength measurements, the specimens with 0.6 wt% nanotube content show better dispersion and higher strength than the other specimens. The Young's moduli of the specimens increased as the nanotube filler content in the matrix was increased. The specimen with 0.6 wt% nanotube filler also showed higher thermal conductivity than the other specimens, while in the thermal expansion measurements the specimens with 0.4 and 0.6 wt% filler contents showed lower thermal expansion than the others. On the basis of the measured and evaluated properties of the composites, we believe that the simple and time-saving fabrication process used in this study was sufficient to obtain improved specimen properties.

Keywords: carbon nanotube filler, epoxy composite, ultra-sonication, shear mixer, mechanical property, thermal property

Procedia PDF Downloads 366
29057 Research on the Evaluation and Delineation of Value Units of New Industrial Parks Based on Implementation-Orientation

Authors: Chengfang Wang, Zichao Wu, Jianying Zhou

Abstract:

At present, much attention is paid to the development of new industrial parks in the era of inventory planning. Generally speaking, there are two development models: the incremental development model, which relies on key projects to build a value innovation park, and the stock development model, which relies on iterative renewal of an existing park. Taking the Baiyun Western Digital Park as an example, and considering the growth model of value units, the evaluation targets are determined. On a GIS platform, factors including current land use, regulatory detailed planning, land use planning, the blue-green ecological base, the rail transit system, the road network system, the distribution of industrial parks, and public service facilities are superimposed in a comprehensive multi-factor evaluation of land within the planning area, constructing a value unit evaluation system and delineating value units based on implementation orientation in combination with the two development models. The research hopes to provide a reference for the planning and construction of new domestic industrial parks.
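A multi-factor superimposed evaluation of this kind reduces to a weighted overlay of normalized factor rasters on a common grid. The factor names, weights, and the 20% quantile cut-off below are invented for illustration; the paper's actual factors and weighting are not specified at this level of detail.

```python
import numpy as np

# Hypothetical factor rasters, each normalized to [0, 1] on a common GIS grid.
rng = np.random.default_rng(0)
factors = {"rail_transit":    rng.random((50, 50)),
           "road_network":    rng.random((50, 50)),
           "public_services": rng.random((50, 50))}
# Illustrative weights; in practice these come from expert scoring (e.g. AHP).
weights = {"rail_transit": 0.4, "road_network": 0.3, "public_services": 0.3}

# Weighted overlay: cell-by-cell weighted sum of the factor layers.
score = sum(w * factors[k] for k, w in weights.items())

# Delineate candidate value units as the top 20% of cells by score.
threshold = np.quantile(score, 0.8)
value_unit = score >= threshold
```

In a real workflow each raster would be read from the GIS platform and the boolean mask vectorized back into value-unit polygons.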

Keywords: value units, GIS, multi-factor evaluation, implementation orientation

Procedia PDF Downloads 184
29056 Integrating Generic Skills into Disciplinary Curricula

Authors: Sitalakshmi Venkatraman, Fiona Wahr, Anthony de Souza-Daw, Samuel Kaspi

Abstract:

There is a growing emphasis on generic skills in higher education to match the changing skill-set requirements of the labour market. However, researchers and policy makers have not reached a consensus on which generic skills actually contribute to workplace employability and performance by complementing and/or underpinning discipline-specific graduate attributes. To strengthen the qualifications framework, a range of 'generic' learning outcomes have been considered for students undergoing higher education programs; among them, fundamental generic skills such as literacy and numeracy are necessary at a level appropriate to the qualification type. This calls for curriculum design approaches that contextualise the form and scope of these fundamental generic skills to support both students' learning engagement in the course and the graduate attributes required for employability and progression within their chosen profession. Little research has been reported on integrating such generic skills into discipline-specific learning outcomes. This paper explores the literature on the generic skills required of graduates in the discipline of Information Technology (IT) in relation to an Australian higher education institution, and presents the rationale of a proposed Bachelor of IT curriculum designed to contextualise the learning of these generic skills within the students' discipline studies.

Keywords: curriculum, employability, generic skills, graduate attributes, higher education, information technology

Procedia PDF Downloads 253
29055 Geophysical Methods and Machine Learning Algorithms for Stuck Pipe Prediction and Avoidance

Authors: Ammar Alali, Mahmoud Abughaban

Abstract:

Cost reduction and drilling optimization are the goals of many drilling operators. Historically, stuck pipe incidents have been a major segment of the costs associated with non-productive time (NPT). Traditionally, stuck pipe problems are treated as part of operations and solved post-sticking. However, the real key to savings and success is predicting stuck pipe incidents and avoiding the conditions leading to their occurrence. Previous attempts at stuck-pipe prediction have neglected the local geology of the problem. The proposed predictive tool utilizes geophysical data processing techniques and Machine Learning (ML) algorithms to predict drilling events in real time from surface drilling data with minimum computational power. The method combines two types of analysis: (1) real-time prediction and (2) cause analysis. Real-time prediction aggregates the input data, including historical surface drilling data, geological formation tops, and petrophysical data, from wells within the same field. The input data are then flattened per geological formation and stacked per stuck-pipe incident. The algorithm uses these two physical operations (stacking and flattening) to filter noise in the signature and create a robust pre-determined pilot that adheres to the local geology. Once the drilling operation starts, live surface data in the Wellsite Information Transfer Standard Markup Language (WITSML) are fed into a matrix and aggregated at the same frequency as the pre-determined signature. The matrix is then correlated in real time with the pre-determined stuck-pipe signature for the field. The correlation uses a machine learning Correlation-based Feature Selection (CFS) algorithm, which selects features relevant to the class and identifies redundant features. The correlation output is interpreted as a probability curve for stuck pipe incidents in real time. Once this probability passes a user-defined threshold, the other component, cause analysis, alerts the user of the expected incident based on the set of pre-determined signatures, and a set of recommendations is provided to reduce the associated risk. The validation process involved feeding historical drilling data from an onshore oil field as a live stream, mimicking actual drilling conditions. Pre-determined signatures were created beforehand for three problematic geological formations in this field. Three wells were processed as case studies, and the stuck-pipe incidents were predicted successfully, with an accuracy of 76%. This accuracy of detection could have yielded around a 50% reduction in NPT, equivalent to a 9% cost saving in comparison with offset wells. Predicting stuck pipe problems requires a method that captures geological, geophysical, and drilling data and recognizes the indicators of the issue at the field and geological-formation level. This paper illustrates the efficiency and robustness of the proposed cross-disciplinary approach and its ability to produce such signatures and predict this NPT event.
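The CFS criterion used for the correlation step can be sketched directly from its standard definition: a feature subset scores well when its features correlate strongly with the class and weakly with each other. The merit formula and greedy forward search below follow the textbook CFS formulation; the toy data are invented for illustration.

```python
import numpy as np

def cfs_merit(X, y, subset):
    """CFS merit: k * r_cf / sqrt(k + k(k-1) * r_ff), where r_cf is the mean
    feature-class correlation and r_ff the mean feature-feature correlation."""
    k = len(subset)
    r_cf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    if k == 1:
        return r_cf
    r_ff = np.mean([abs(np.corrcoef(X[:, a], X[:, b])[0, 1])
                    for i, a in enumerate(subset) for b in subset[i + 1:]])
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

def cfs_forward(X, y):
    """Greedy forward search: add the feature that most improves the merit."""
    remaining, selected, best = list(range(X.shape[1])), [], -np.inf
    while remaining:
        merit, j = max((cfs_merit(X, y, selected + [j]), j) for j in remaining)
        if merit <= best:
            break
        best = merit
        selected.append(j)
        remaining.remove(j)
    return selected

# Toy demo: feature 0 drives the target, feature 1 duplicates it, 2 is noise.
rng = np.random.default_rng(0)
x0 = rng.normal(size=300)
X = np.column_stack([x0, x0 + 0.05 * rng.normal(size=300), rng.normal(size=300)])
y = x0 + 0.1 * rng.normal(size=300)
selected = cfs_forward(X, y)
```

The redundant duplicate and the noise feature contribute little or nothing to the merit, so the search keeps the subset small, which is exactly the behavior the abstract relies on to suppress redundant drilling channels.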

Keywords: drilling optimization, hazard prediction, machine learning, stuck pipe

Procedia PDF Downloads 223
29054 Multi Data Management Systems in a Cluster Randomized Trial in Poor Resource Setting: The Pneumococcal Vaccine Schedules Trial

Authors: Abdoullah Nyassi, Golam Sarwar, Sarra Baldeh, Mamadou S. K. Jallow, Bai Lamin Dondeh, Isaac Osei, Grant A. Mackenzie

Abstract:

A randomized controlled trial is the "gold standard" for evaluating the efficacy of an intervention, but large-scale cluster-randomized trials are expensive and difficult to conduct. To guarantee the validity and generalizability of findings, high-quality, dependable, and accurate data management systems are necessary: robust data management is crucial for optimizing and validating the quality, accuracy, and dependability of trial data. There is a scarcity of literature on the difficulties of data collection in clinical trials in low-resource settings, which may raise concerns. Effective data management systems and implementation goals should be part of trial procedures, and publicizing the creative clinical data management techniques used in clinical trials should boost confidence in study conclusions and encourage replication. This report details the development and deployment of multiple data management systems and methodologies in the ongoing pneumococcal vaccine schedules trial in rural Gambia. We implemented six different data management, synchronization, and reporting systems using Microsoft Access, REDCap, SQL, Visual Basic, Ruby, and ASP.NET, and developed data synchronization tools to integrate data from these systems into a central server for reporting. The report focuses on clinician, laboratory, and field data validation systems and methodologies. Our process development across all domains was driven by the complexity of the project's real-time data collection, online reporting, data synchronization, and procedures for cleaning and verifying data. Consequently, we effectively used multiple data management systems, demonstrating the value of creative approaches to enhancing the consistency, accuracy, and reporting of trial data in a poor-resource setting.

Keywords: data management, data collection, data cleaning, cluster-randomized trial

Procedia PDF Downloads 17
29053 A Weighted Group EI Incorporating Role Information for More Representative Group EI Measurement

Authors: Siyu Wang, Anthony Ward

Abstract:

Emotional intelligence (EI) is a well-established personal characteristic. It has been viewed as a critical factor that can influence an individual's academic achievement, ability to work, and potential to succeed. When working in a group, EI is fundamentally connected to the members' interaction and ability to work as a team. The ability of a group member to intelligently perceive and understand their own emotions (intrapersonal EI), other members' emotions (interpersonal EI), and emotions between different groups (cross-boundary EI) can together be considered group emotional intelligence (Group EI). In this research, a more representative Group EI measurement approach is proposed, which incorporates information about the composition of a group and each individual's role in it. To demonstrate that the approach is more representative, the study adopts a multi-method research design, combining qualitative and quantitative techniques to establish a metric of Group EI. The results indicate that, by introducing a weight coefficient for each member's contribution to group work into the measurement, Group EI becomes more representative and better able to capture what happens during teamwork than previous approaches.
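The core of the weighting idea can be sketched as a role-weighted aggregate of individual EI scores. The scores and weights below are invented for illustration; the paper derives its actual weight coefficients from its multi-method study.

```python
def group_ei(members):
    """Weighted Group EI: each member's EI score weighted by their role-based
    contribution coefficient, rather than a plain unweighted mean."""
    total_w = sum(m["weight"] for m in members)
    return sum(m["ei"] * m["weight"] for m in members) / total_w

# Hypothetical three-person team; weights reflect each role's share of the work.
team = [{"ei": 110, "weight": 0.5},   # e.g. team leader
        {"ei": 95,  "weight": 0.3},
        {"ei": 100, "weight": 0.2}]

weighted = group_ei(team)                       # role-weighted Group EI
unweighted = sum(m["ei"] for m in team) / len(team)
```

Here the weighted score (103.5) sits above the unweighted mean (~101.7) because the highest-EI member carries the largest share of the group work, which is the distinction the proposed measure is designed to capture.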

Keywords: case study, emotional intelligence, group EI, multi-method research

Procedia PDF Downloads 121
29052 Determination of Nanomolar Mercury (II) by Using Multi-Walled Carbon Nanotubes Modified Carbon Zinc/Aluminum Layered Double Hydroxide – 3 (4-Methoxyphenyl) Propionate Nanocomposite Paste Electrode

Authors: Illyas Md Isa, Sharifah Norain Mohd Sharif, Norhayati Hashima

Abstract:

A mercury(II) sensor was developed using a multi-walled carbon nanotube (MWCNT) paste electrode modified with Zn/Al layered double hydroxide-3(4-methoxyphenyl)propionate nanocomposite (Zn/Al-HMPP). The optimum conditions by cyclic voltammetry were observed at an electrode composition of 2.5% (w/w) Zn/Al-HMPP/MWCNTs, 0.4 M potassium chloride, pH 4.0, and a scan rate of 100 mV s⁻¹. The sensor exhibited wide linear ranges from 1×10⁻³ M to 1×10⁻⁷ M Hg²⁺ and from 1×10⁻⁷ M to 1×10⁻⁹ M Hg²⁺, with a detection limit of 1×10⁻¹⁰ M Hg²⁺. The high sensitivity of the proposed electrode towards Hg(II) was confirmed by double potential-step chronocoulometry, which yielded a diffusion coefficient of 1.5445×10⁻⁹ cm² s⁻¹, a surface charge of 524.5 µC s⁻¹/², and a surface coverage of 4.41×10⁻² mol cm⁻². The presence of a 25-fold concentration of most metal ions had no influence on the anodic peak current. With characteristics such as high sensitivity, selectivity, and repeatability, the electrode is proposed as an appropriate alternative for the determination of mercury(II).

Keywords: cyclic voltammetry, mercury(II), modified carbon paste electrode, nanocomposite

Procedia PDF Downloads 306
29051 Level Set Based Extraction and Update of Lake Contours Using Multi-Temporal Satellite Images

Authors: Yindi Zhao, Yun Zhang, Silu Xia, Lixin Wu

Abstract:

The contours and areas of water surfaces, especially lakes, often change due to natural disasters and construction activities. Extracting and updating water contours from satellite images using image processing algorithms is an effective approach, but producing water surface contours close to the true boundaries remains a challenging task. This paper compares the performance of three level set models for extracting lake contours: the Chan-Vese (CV) model, the signed pressure force (SPF) model, and the region-scalable fitting (RSF) energy model. Experiments indicate that the RSF model, in which a region-scalable fitting energy functional is defined and incorporated into a variational level set formulation, is superior to the CV and SPF models and obtains desirable contour lines when there are "holes" in water regions, such as islands in a lake. The RSF model is therefore applied to extracting lake contours from Landsat satellite images. Four Landsat images, from the years 2000, 2005, 2010, and 2014, are used in our study; all were acquired in May with the same path/row (121/036), covering Xuzhou City, Jiangsu Province, China. First, the near infrared (NIR) band is selected for water extraction. Image registration is conducted on the NIR bands of the different temporal images for information update, and linear stretching is applied to distinguish water from other land cover types. For the first temporal image, acquired in 2000, lake contours are extracted via the RSF model initialized with user-defined rectangles. Afterwards, using the lake contours extracted from the previous temporal image as initial values, lake contours are updated for the current image by means of the RSF model, and the changed and unchanged lakes are detected. The results show that great changes have taken place in two lakes, i.e. Dalong Lake and Panan Lake, and that the RSF model can effectively extract and update lake contours using multi-temporal satellite images.
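The full RSF model is beyond a short sketch, but the piecewise-constant data term that all three level set models share can be illustrated in a few lines: iteratively estimate the mean intensity inside and outside the contour and reassign each pixel to the closer mean. The curvature/regularization term and the region-scalable kernel are omitted, and the synthetic "NIR band" below (dark lake with a bright island) is invented for illustration.

```python
import numpy as np

def two_phase_segment(img, n_iter=20):
    """Minimal two-phase piecewise-constant segmentation (the Chan-Vese data
    term without the curvature term): pixels join the region whose mean
    intensity they are closer to, iterated to convergence."""
    mask = img > img.mean()                # initial partition: brighter phase
    for _ in range(n_iter):
        c1 = img[mask].mean()              # mean of phase 1 (bright, land)
        c2 = img[~mask].mean()             # mean of phase 2 (dark, water)
        new = (img - c1) ** 2 < (img - c2) ** 2
        if np.array_equal(new, mask):
            break
        mask = new
    return mask

# Synthetic NIR band: water is dark in NIR, land is brighter.
img = 0.8 + 0.02 * np.random.default_rng(1).standard_normal((60, 60))
img[20:40, 20:40] = 0.1                    # the "lake"
img[28:32, 28:32] = 0.8                    # an island inside the lake
mask = two_phase_segment(img)              # True = land phase
```

Even this stripped-down data term recovers the island as a "hole" in the water region, which is the property the abstract highlights for the RSF model.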

Keywords: level set model, multi-temporal image, lake contour extraction, contour update

Procedia PDF Downloads 362
29050 Effects of Recognition of Customer Feedback on Relationships between Emotional Labor and Job Satisfaction: Focusing On Call Centers That Offer Professional Services

Authors: Kiyoko Yoshimura, Yasunobu Kino

Abstract:

Focusing on professional call centers where workers with expertise perform services, this study aims to clarify the relationships between emotional labor and job satisfaction and the effects of recognition of customer feedback. Since professional call center operators consist of professional license holders (qualification holders) and those without licenses (non-holders), the following three points are analyzed in the two groups using covariance structure analysis and simultaneous multi-population analysis: 1) the relationship between emotional labor and job satisfaction, 2) the relationship between customer feedback and job satisfaction, and 3) the mediating effect of customer feedback recognition between emotional labor and job satisfaction. The following results are obtained: i) no direct effect is found between job satisfaction and emotional labor for either qualification holders or non-holders; ii) for both groups, recognition of positive feedback and recognition of negative feedback have positive and negative effects on job satisfaction, respectively; iii) for both groups, "consideration for colleagues" influences job satisfaction through recognition of positive feedback; and iv) only for qualification holders, the factors "customer-oriented emotional expression" and "emotional disharmony" have positive and negative effects on job satisfaction, respectively, through recognition of positive feedback and recognition of negative feedback.

Keywords: call center, emotional labor, professional service, job satisfaction, customer feedback

Procedia PDF Downloads 105
29049 Vaccine Development for Newcastle Disease Virus in Poultry

Authors: Muhammad Asif Rasheed

Abstract:

Newcastle disease virus (NDV), an avian orthoavulavirus, is the causative agent of Newcastle disease and can cause epidemics when the disease is not controlled. Several vaccines based on attenuated and inactivated viruses have previously been reported, but these are rendered less effective over time by changes in the viral genome. Therefore, we aimed to develop an effective multi-epitope vaccine against the haemagglutinin-neuraminidase (HN) protein of 26 NDV strains from Pakistan through modern immunoinformatic approaches. A vaccine chimera was constructed by combining T-cell and B-cell epitopes with appropriate linkers and an adjuvant. The designed vaccine was highly antigenic, immunogenic, and non-allergenic; the potential 3D structure of the multi-epitope vaccine was therefore constructed, refined, and validated. A molecular docking study of the multi-epitope vaccine candidate with chicken Toll-like receptor 4 indicated successful binding. An in silico immune simulation was used to evaluate the candidate vaccine's ability to elicit an effective immune response. According to these computational studies, the proposed multi-epitope vaccine is physically stable and may induce immune responses, suggesting it is a strong candidate against the 26 Newcastle disease virus strains from Pakistan. A wet-lab study is under way to confirm the results.
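Assembling a multi-epitope chimera of this kind is, at its simplest, string concatenation of epitopes with conventional linkers (EAAAK after the adjuvant, GPGPG between helper T-cell epitopes, AAY between cytotoxic T-cell epitopes, KK between B-cell epitopes). All sequences below are placeholders, not the paper's actual HN-protein epitopes or adjuvant.

```python
# Placeholder sequences only; real designs use predicted epitopes and a
# characterized adjuvant protein.
adjuvant     = "ADJUVANTSEQ"
htl_epitopes = ["HTLEPITOPEA", "HTLEPITOPEB"]   # helper T-cell epitopes
ctl_epitopes = ["CTLEPITOPEA", "CTLEPITOPEB"]   # cytotoxic T-cell epitopes
b_epitopes   = ["BCELLEPITOPEA"]                # linear B-cell epitopes

# Conventional linker scheme for multi-epitope constructs.
construct = (adjuvant + "EAAAK"
             + "GPGPG".join(htl_epitopes) + "GPGPG"
             + "AAY".join(ctl_epitopes) + "AAY"
             + "KK".join(b_epitopes))
```

The resulting single sequence is what is then folded, refined, docked against the receptor, and simulated immunologically in the pipeline the abstract describes.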

Keywords: epitopes, newcastle disease virus, paramyxovirus virus, vaccine

Procedia PDF Downloads 117
29048 Measurement of Ionospheric Plasma Distribution over Myanmar Using Single Frequency Global Positioning System Receiver

Authors: Win Zaw Hein, Khin Sandar Linn, Su Su Yi Mon, Yoshitaka Goto

Abstract:

The Earth's ionosphere is located at altitudes from about 70 km to several hundred km above the ground and is composed of ions and electrons, i.e. plasma. This plasma delays GPS (Global Positioning System) signals and reflects radio waves. The delay along the signal path from satellite to receiver is directly proportional to the total electron content (TEC) of the plasma, and this delay is the largest error factor in satellite positioning and navigation. Sounding observations from the top and bottom of the ionosphere have long been used to investigate ionospheric plasma. Recently, continuous monitoring of TEC using networks of GNSS (Global Navigation Satellite System) observation stations, originally built for land survey, has been conducted in several countries. However, these stations install multi-frequency receivers to estimate the plasma delay from its frequency dependence, and such receivers cost much more than single-frequency GPS receivers. In this research, a single-frequency GPS receiver was used instead of expensive multi-frequency GNSS receivers to measure ionospheric plasma variation, such as the vertical TEC distribution. A single-frequency u-blox GPS receiver was used to probe the ionospheric TEC, with the observation site located at Mandalay Technological University in Myanmar. In the method, the ionospheric TEC distribution is represented by polynomial functions of latitude and longitude, and the parameters of the functions are determined by least-squares fitting of pseudorange data obtained at a known location, under a thin-layer ionosphere assumption. The validity of the method was evaluated against measurements from the Japanese GNSS observation network GEONET, and the results obtained with the single-frequency GPS receiver were compared with dual-frequency measurements.
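The polynomial least-squares step can be sketched as follows: vertical TEC is modeled as a second-order polynomial in latitude and longitude, a thin-layer mapping function converts it to the slant TEC each satellite actually observes, and the polynomial coefficients are recovered by linear least squares. The coordinates, shell height, polynomial order, and noise level below are illustrative assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
lat = rng.uniform(18.0, 24.0, n)      # ionospheric pierce point latitude (deg)
lon = rng.uniform(93.0, 99.0, n)      # pierce point longitude (deg)
elev = rng.uniform(np.radians(30), np.radians(85), n)  # satellite elevation

def design(lat, lon):
    """2nd-order polynomial basis in lat/lon, centered near the station."""
    dlat, dlon = lat - 21.0, lon - 96.0
    return np.column_stack([np.ones_like(dlat), dlat, dlon,
                            dlat**2, dlon**2, dlat * dlon])

# Thin-layer mapping function: slant TEC = vertical TEC * mf(elevation).
Re, h = 6371.0, 350.0                 # Earth radius and shell height (km)
mf = 1.0 / np.sqrt(1.0 - (Re * np.cos(elev) / (Re + h)) ** 2)

# Synthesize slant-TEC "observations" from a known coefficient vector.
true_c = np.array([40.0, 1.5, -0.8, 0.05, -0.02, 0.03])
stec = design(lat, lon) @ true_c * mf + 0.3 * rng.standard_normal(n)

# Least-squares recovery of the vertical-TEC polynomial from slant data.
A = design(lat, lon) * mf[:, None]
coef, *_ = np.linalg.lstsq(A, stec, rcond=None)
```

In the real method, the slant observable comes from single-frequency pseudorange data rather than a synthesized polynomial, but the estimation step is the same linear fit.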

Keywords: ionosphere, global positioning system, GPS, ionospheric delay, total electron content, TEC

Procedia PDF Downloads 131
29047 Quantifying the Second-Level Digital Divide on Sub-National Level with a Composite Index

Authors: Vladimir Korovkin, Albert Park, Evgeny Kaganer

Abstract:

The paper studies the second-level digital divide (the one defined by how digital technology is used in everyday life) between regions of the Russian Federation. It offers a systematic review of the literature on measuring the digital divide and, based upon this, proposes a composite Digital Life Index that captures the complex, multi-dimensional character of the phenomenon. The index model assesses digital supply and demand separately across seven independent dimensions, providing 14 subindices. The index is built from Internet-borne data, in contrast to traditional research approaches that rely on official statistics or surveys. Regression analysis is used to determine the relative importance of factors such as income, human capital, and policy in determining the digital divide. The results suggest that the digital divide is driven more by differences in demand (defined by consumer competencies) than in supply; the role of income is insignificant, and the quality of human capital is the key determinant of the divide. The paper advances the existing methodological literature on the issue and can also inform practical decision-making regarding national and regional digital development strategies.

Keywords: digital transformation, second-level digital divide, composite index, digital policy, regional development, Russia

Procedia PDF Downloads 180
29046 Detection of Keypoint in Press-Fit Curve Based on Convolutional Neural Network

Authors: Shoujia Fang, Guoqing Ding, Xin Chen

Abstract:

The quality of press-fit assembly is closely related to the reliability and safety of a product. This paper proposes a keypoint detection method based on a convolutional neural network to improve the accuracy of keypoint detection in press-fit curves, providing an auxiliary basis for judging press-fit assembly quality. A press-fit curve plots press-fit force against displacement; both the force data and the displacement data are time series, so a one-dimensional convolutional neural network is used to process the curve. After the acquired press-fit data are filtered, a multi-layer one-dimensional convolutional neural network automatically learns press-fit curve features, which are then passed to a multi-layer perceptron that outputs the keypoint of the curve. We trained the CNN model on data from press-fit assembly equipment in an actual production process and evaluated detection performance on different data from the same equipment. Compared with existing research results, detection performance was significantly improved. This method can provide a reliable basis for judging press-fit quality.
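The shape of the pipeline (1-D convolution over the curve, pooling, then a dense head regressing the keypoint) can be sketched in NumPy. The weights below are random and untrained, and the curve is synthetic, so the "keypoint" output is meaningful only as a demonstration of the data flow, not of the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(3)

def conv1d(x, kernels):
    """Valid 1-D convolution + ReLU: x (L,), kernels (F, K) -> (F, L-K+1)."""
    F, K = kernels.shape
    out = np.stack([[x[i:i + K] @ kernels[f]
                     for i in range(len(x) - K + 1)] for f in range(F)])
    return np.maximum(out, 0.0)

# Synthetic press-fit curve: flat approach phase, then a sharp force rise
# whose onset is the kind of keypoint the network is meant to localize.
curve = np.concatenate([0.05 * rng.standard_normal(80),
                        np.linspace(0, 5, 48) + 0.05 * rng.standard_normal(48)])

feat = conv1d(curve, rng.standard_normal((8, 5)))    # 8 filters, width 5
pooled = feat.reshape(8, 31, 4).max(axis=2).ravel()  # max-pool by 4, flatten
W = rng.standard_normal((len(pooled), 1)) * 0.01     # untrained dense head
keypoint = float(pooled @ W)                         # regressed keypoint value
```

In the actual method these stages would be trained end-to-end (e.g. with a mean-squared-error loss against labeled keypoints) rather than run with random weights.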

Keywords: keypoint detection, curve feature, convolutional neural network, press-fit assembly

Procedia PDF Downloads 224
29045 3D Human Body Reconstruction Based on Multiple Viewpoints

Authors: Jiahe Liu, HongyangYu, Feng Qian, Miao Luo

Abstract:

The aim of this study was to improve 3D human body reconstruction. The MvP algorithm was adopted to obtain key point information from multiple viewpoints, capturing human posture and joint positions from multiple angles and providing more comprehensive and accurate data. The study also incorporated the SMPL-X model, which is widely used for human body modeling, to achieve more accurate 3D reconstruction results. Using the MvP algorithm made it possible to observe the reconstructed object from multiple angles, reducing blind spots and missing information; the algorithm effectively captures key point information, including the position and rotation angle of limbs, providing key data for subsequent 3D reconstruction. Compared with traditional single-view methods, multi-view fusion significantly improved the accuracy and stability of reconstruction. By combining the MvP algorithm with the SMPL-X model, we achieved better 3D human body reconstruction; the SMPL-X model is highly scalable and can generate highly realistic 3D human body models, providing more detail and shape information.

Keywords: 3D human reconstruction, multi-view, joint point, SMPL-X

Procedia PDF Downloads 66
29044 Against the Philosophical-Scientific Racial Project of Biologizing Race

Authors: Anthony F. Peressini

Abstract:

The concept of race has recently come prominently back into discussion in the context of medicine and medical science, along with renewed effort to biologize racial concepts. This paper argues that this renewed effort to biologize race by way of medicine and population genetics fails on its own terms, and more importantly, that the philosophical project of biologizing race ought to be recognized for what it is, a retrograde racial project, and abandoned. There is clear agreement that standard racial categories and concepts cannot be grounded in the old way of racial naturalism, which understands race as a real, interest-independent biological/metaphysical category whose members share "physical, moral, intellectual, and cultural characteristics." But equally clear is the very real and pervasive presence of racial concepts in individual and collective consciousness and behavior, and so it remains a pressing area in which to seek deeper understanding. Recent philosophical work has endeavored to reconcile these two observations by developing a "thin" conception of race, grounded in scientific concepts but without the moral and metaphysical content. Such "thin," science-based analyses take the "commonsense" or "folk" sense of race as it functions in contemporary society as the starting point for their philosophic-scientific projects to biologize racial concepts. A "philosophic-scientific analysis" is a special case of the cornerstone of analytic philosophy: a conceptual analysis, that is, a rendering of a concept into the more perspicuous concepts that constitute it. Thus a philosophic-scientific account of a concept is an attempt to work out an analysis that uses empirical science's insights to ground, legitimate, and explicate the target concept in terms of clearer concepts informed by empirical results.
The focus in this paper is on three recent philosophic-scientific cases for retaining “race” that all share this general analytic schema, but that make use of “medical necessity,” population genetics, and human genetic clustering, respectively. After arguing that each of these three approaches suffers from internal difficulties, the paper considers the general analytic schema employed by such biologizations of race. While such endeavors are inevitably prefaced with the disclaimer that the theory to follow is non-essentialist and non-racialist, the case will be made that such efforts are not neutral scientific or philosophical projects but rather are what sociologists call a racial project, that is, one of many competing efforts that conjoin a representation of what race means to specific efforts to determine social and institutional arrangements of power, resources, authority, etc. Accordingly, philosophic-scientific biologizations of race, since they begin from and condition their analyses on “folk” conceptions, cannot pretend to be “prior to” other disciplinary insights, nor to transcend the social-political dynamics involved in formulating theories of race. As a result, such traditional philosophical efforts can be seen to be disciplinarily parochial and to address only a caricature of a large and important human problem—and thereby further contributing to the unfortunate isolation of philosophical thinking about race from other disciplines.

Keywords: population genetics, ontology of race, race-based medicine, racial formation theory, racial projects, racism, social construction

Procedia PDF Downloads 271
29043 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe

Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis

Abstract:

The increasing availability of information about earth surface elevation (Digital Elevation Models, DEMs) generated from different sources (remote sensing, aerial images, Lidar) raises the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Hence the need to merge the large-scale height maps that are typically available for free at the worldwide level with very specific, highly resolved datasets. On the other hand, the third dimension improves the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are a trending topic in geovisual analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. However, such globes typically do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”: a tool able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web in a traditional browser without any additional plug-in.
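The core of such a merge can be sketched as a per-tile resolution priority: for each tile of the worldwide grid, sample the finest DEM that fully covers it, falling back to the coarse global dataset elsewhere. A minimal sketch follows; the dataset names, bounds, and resolutions are illustrative assumptions, not the Terrain Builder's actual implementation:

```python
# Hypothetical DEM catalogue: a coarse worldwide layer plus a local Lidar survey.
# Bounds are (west, south, east, north) in degrees; "res" is metres per pixel.
datasets = [
    {"name": "global_srtm", "bounds": (-180.0, -90.0, 180.0, 90.0), "res": 90.0},
    {"name": "lidar_city",  "bounds": (11.0, 46.0, 11.5, 46.5),     "res": 1.0},
]

def covers(bounds, tile):
    """True if the dataset bounds fully contain the tile."""
    w, s, e, n = bounds
    tw, ts, te, tn = tile
    return w <= tw and s <= ts and e >= te and n >= tn

def best_dataset(tile):
    """Return the finest-resolution dataset that fully covers the tile."""
    candidates = [d for d in datasets if covers(d["bounds"], tile)]
    return min(candidates, key=lambda d: d["res"])

# A tile inside the Lidar survey uses the 1 m data...
print(best_dataset((11.1, 46.1, 11.2, 46.2))["name"])  # lidar_city
# ...while one outside falls back to the global DEM.
print(best_dataset((0.0, 0.0, 0.1, 0.1))["name"])      # global_srtm
```

In a real builder each selected tile would then be resampled to the tile scheme's grid and encoded as a height-map, but the selection logic above is the heart of the heterogeneous-resolution merge.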

Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM

Procedia PDF Downloads 422
29042 The Perspective of British Politicians on English Identity: Qualitative Study of Parliamentary Debates, Blogs, and Interviews

Authors: Victoria Crynes

Abstract:

The question of England’s role in Britain is increasingly relevant due to the ongoing rise in citizens identifying as English. Furthermore, the Brexit referendum was predominantly supported by constituents identifying as English. Few politicians appear to comprehend how Englishness is politically manifested. Politics and the media have depicted English identity as a negative and extremist problem, an inaccurate representation that ignores the breadth of English-identifying citizens. This environment prompts the question, 'How are British politicians addressing the modern English identity question?' Parliamentary debates, political blogs, and interviews are synthesized to establish a more coherent understanding of current political attitudes towards English identity, the perceived nature of English identity, and the political manifestation of English representation and governance. The parliamentary debates analyzed addressed the democratic structure of English governance through topics such as English votes for English laws, devolution, and the union. The blogs examined include party-based, multi-author blogs and independently authored blogs by politicians, which provide a dynamic and up-to-date representation of party and politician viewpoints. Lastly, fourteen semi-structured interviews of British politicians provide a nuanced perspective on how politicians conceptualize Englishness. Interviewee selection was based on three criteria: (i) Members of Parliament (MPs) known for discussing English identity politics, (ii) MPs of strongly English-identifying constituencies, and (iii) MPs with minimal English identity affiliation. Analysis of the parliamentary debates reveals that the discussion of English representation has gained little momentum. Many politicians fail to comprehend who the English are or why they desire greater representation, and believe that increased recognition of the English would disrupt the unity of the UK.
These debates highlight parliament's disconnect from the disenfranchised English towns. A failure to recognize the legitimacy of English identity politics prevents solution-focused debates from occurring. The political blogs demonstrate cross-party recognition of growing English disenfranchisement. This dissatisfaction with British politics derives from multiple factors, including economic decline, shifting community structures, and the delay of Brexit. The left-behind communities have seen little response from Westminster, which is often contrasted with the devolved and louder voices of the other UK nations. Many blogs recognize the need for a political response to the English and lament the lack of party-level initiatives. In comparison, the interviews depict an array of local-level initiatives reconnecting MPs with community members. Local efforts include town trips to Westminster, multicultural cooking classes, and English language courses. These efforts begin to rebuild positive local narratives, promote engagement across community sectors, and acknowledge English voices. These interviewees called for large-scale political action. Meanwhile, several interviewees denied the saliency of English identity; for them, the term held only extremist narratives. The multi-level analysis reveals continued uncertainty about Englishness within British politics, contrasted with increased recognition of its saliency by politicians. It is paramount that politicians increase discussions of English identity politics to avoid further alienation of English citizens and to rebuild trust in the abilities of Westminster.

Keywords: British politics, contemporary identity politics and its impacts, English identity, English nationalism, identity politics

Procedia PDF Downloads 109
29041 ROSgeoregistration: Aerial Multi-Spectral Image Simulator for the Robot Operating System

Authors: Andrew R. Willis, Kevin Brink, Kathleen Dipple

Abstract:

This article describes a software package called ROSgeoregistration intended for use with the Robot Operating System (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, test, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, incorporates these images into a single, possibly very large, reference image. Additionally, a Gazebo plugin is provided which uses the real-time sensor pose and image formation model to generate simulated imagery from the specified reference image, along with related plugins for UAV-relevant data. The novelty of this work is threefold: (1) this is the first system to link the massive multi-spectral imaging database of Google’s Earth Engine to the Gazebo simulator, (2) this is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region, and (3) integration with other UAS tools creates a new holistic UAS simulation environment to support UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground-truth registration information are published to client applications, which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and windspeed sensors. To highlight this functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors, and an example use case for developing and evaluating image-based UAS position feedback, i.e., pose estimation for image-based Guidance, Navigation and Control (GNC) applications.
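The ground-truth registration published alongside the imagery amounts to mapping the simulated sensor pose into pixel coordinates of the reference image. A minimal sketch using a north-up affine geotransform is shown below; the origin, pixel sizes, and pose values are illustrative assumptions, not ROSgeoregistration's actual parameters:

```python
# Hypothetical north-up geotransform of the reference image:
# top-left corner (lon, lat) plus per-pixel steps in each axis.
ORIGIN_LON, LON_RES = -80.0, 0.0001    # degrees, degrees per pixel (east)
ORIGIN_LAT, LAT_RES = 35.0, -0.0001    # latitude decreases down the rows

def geo_to_pixel(lon, lat):
    """Map a geographic coordinate to (col, row) in the reference image."""
    col = (lon - ORIGIN_LON) / LON_RES
    row = (lat - ORIGIN_LAT) / LAT_RES
    return col, row

# Nadir camera over an assumed UAV position: 0.05 deg east, 0.02 deg south
# of the image origin lands 500 columns right and 200 rows down.
col, row = geo_to_pixel(-79.95, 34.98)
print(round(col), round(row))  # 500 200
```

A full image formation model would additionally account for altitude, attitude, and the camera intrinsics, but this geotransform step is what ties the Gazebo pose to Earth Engine pixel coordinates.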

Keywords: EO-to-EO, EO-to-SAR, flight simulation, georegistration, image generation, robot operating system, vision-based navigation

Procedia PDF Downloads 99
29040 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability is very important. The process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
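The idea of reading the depth and width of cut directly off a dexel model can be sketched in one dimension: each dexel stores a material height, a tool pass clips the heights beneath it, and the engaged dexels yield both quantities. The grid resolution and tool geometry below are illustrative assumptions, not the paper's multi-dexel implementation:

```python
# Hypothetical 1D dexel field: one material height (mm) per 0.1 mm column.
DEXEL_RES = 0.1
heights = [10.0] * 100           # flat stock: 10 mm high, 10 mm long

def mill_pass(heights, x0, x1, z):
    """Clip dexels under the tool (x0..x1 mm) to height z (mm).
    Returns (depth of cut, width of cut) read off the engaged dexels."""
    engaged, max_removal = 0, 0.0
    for i in range(round(x0 / DEXEL_RES), round(x1 / DEXEL_RES)):
        removal = heights[i] - z
        if removal > 0:
            heights[i] = z       # material removed down to the tool flank
            engaged += 1
            max_removal = max(max_removal, removal)
    return max_removal, round(engaged * DEXEL_RES, 6)

depth, width = mill_pass(heights, x0=2.0, x1=5.0, z=8.5)
print(depth, width)  # 1.5 3.0
```

In the paper's 3D setting the same clipping happens against an analytically described tool envelope over three orthogonal dexel directions, but the principle of recovering depth and width from the engaged dexels is the same.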

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 523
29039 Improved Multi-Channel Separation Algorithm for Satellite-Based Automatic Identification System Signals Based on Artificial Bee Colony and Adaptive Moment Estimation

Authors: Peng Li, Luan Wang, Haifeng Fei, Renhong Xie, Yibin Rui, Shanhong Guo

Abstract:

The applications of the satellite-based automatic identification system (S-AIS) pave the way for wide-range maritime traffic monitoring and management. However, the satellite's field of view covers multiple AIS self-organizing networks, which leads to collisions between AIS signals from different cells. The contribution of this work is an improved multi-channel blind source separation algorithm based on the Artificial Bee Colony (ABC) algorithm and advanced stochastic optimization to separate the mixed AIS signals. The proposed approach adopts a modified ABC algorithm to obtain an optimized initial separating matrix, which can expedite the initialization bias correction, and utilizes Adaptive Moment Estimation (Adam) to update the separating matrix by adjusting the learning rate for each parameter dynamically. Simulation results show that the algorithm speeds up convergence and achieves better separation accuracy.
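The Adam update used to refine the separating matrix can be sketched on a scalar stand-in objective; the cost function and hyperparameters below are illustrative, not the paper's separation criterion:

```python
import math

# Adam keeps per-parameter first/second moment estimates of the gradient and
# scales each step by their bias-corrected values, giving each parameter its
# own effective learning rate.
def adam_minimize(grad, w0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    w, m, v = w0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g        # first moment (mean of gradients)
        v = beta2 * v + (1 - beta2) * g * g    # second moment (uncentered variance)
        m_hat = m / (1 - beta1 ** t)           # bias correction for warm-up
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Stand-in cost f(w) = (w - 3)^2 with gradient 2(w - 3); minimum at w = 3.
w = adam_minimize(lambda w: 2 * (w - 3), w0=0.0)
print(w)
```

In the separation setting, w would be each entry of the separating matrix and grad the gradient of the chosen independence criterion; Adam's per-parameter scaling is what the abstract refers to as adjusting the learning rate for each parameter dynamically.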

Keywords: satellite-based automatic identification system, blind source separation, artificial bee colony, adaptive moment estimation

Procedia PDF Downloads 184
29038 Multi-Scale Modeling of Ti-6Al-4V Mechanical Behavior: Size, Dispersion and Crystallographic Texture of Grains Effects

Authors: Fatna Benmessaoud, Mohammed Cheikh, Vencent Velay, Vanessa Vidal, Farhad Rezai-Aria, Christine Boher

Abstract:

Ti-6Al-4V titanium alloy is one of the most widely used materials in the aeronautical and aerospace industries. Because of its high specific strength and good fatigue and corrosion resistance, this alloy is very suitable for moderate-temperature applications. At room temperature, the mechanical behavior of Ti-6Al-4V is generally controlled by the behavior of the alpha phase (the beta phase fraction is less than 8%). The plastic strain of this phase, based notably on crystallographic slip, can be hindered by various obstacles and mechanisms (crystal lattice friction, sessile dislocations, strengthening by solute atoms, grain boundaries, etc.). The grain characteristics of the alpha phase (its morphology and texture) and the nature of its crystallographic lattice (hexagonal close-packed) give the plastic strain heterogeneous, discontinuous, and anisotropic characteristics at the local scale. The aim of this work is to develop a multi-scale model of Ti-6Al-4V mechanical behavior using a crystal plasticity approach; this multi-scale model is then used to investigate the effects of grain size, dispersion of grain size, crystallographic texture, and slip system activation on Ti-6Al-4V mechanical behavior under monotonic quasi-static loading. Nine representative elementary volumes (REVs) are built to take into account the physical elements mentioned above (grain size, dispersion, and crystallographic texture), and the boundary conditions of a tension test are applied. Finally, the simulation of the mechanical behavior of Ti-6Al-4V and a study of slip system activation in the alpha phase are reported. The results show that the macroscopic mechanical behavior of Ti-6Al-4V is strongly linked to the active slip system family (prismatic, basal, or pyramidal). The crystallographic texture determines which family of slip systems can be activated; it therefore gives the plastic strain a heterogeneous character and thus the modeled Ti-6Al-4V alloy an anisotropic macroscopic mechanical behavior.
The grain size also influences the mechanical properties of Ti-6Al-4V, especially the yield stress: as the grain size decreases, the yield strength increases. Finally, the grain distribution, which characterizes the morphology (homogeneous or heterogeneous), makes the deformation fields markedly heterogeneous, because crystallographic slip is easier in large grains than in small ones, generating a localization of plastic deformation in certain areas and a concentration of stresses in others.
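The reported grain-size effect on yield stress follows the classical Hall-Petch trend, which can be sketched as follows; the friction stress and Hall-Petch coefficient below are illustrative values, not parameters fitted in the paper:

```python
import math

def hall_petch(d_um, sigma0=800.0, k=300.0):
    """Yield stress (MPa) versus grain size d (micrometres):
    sigma_y = sigma0 + k / sqrt(d).
    sigma0 (lattice friction stress, MPa) and k (Hall-Petch coefficient,
    MPa*um^0.5) are assumed, illustrative constants."""
    return sigma0 + k / math.sqrt(d_um)

# Smaller grains -> more grain-boundary strengthening -> higher yield stress.
for d in (100.0, 25.0, 4.0):
    print(d, round(hall_petch(d), 1))  # 830.0, 860.0, 950.0 MPa
```

In the crystal plasticity setting this macroscopic trend emerges from the grain-boundary resistance to slip transmission rather than being imposed directly, but the relation is a useful sanity check on the REV results.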

Keywords: multi-scale modeling, Ti-6Al-4V alloy, crystal plasticity, grains size, crystallographic texture

Procedia PDF Downloads 156
29037 Towards Sustainable Evolution of Bioeconomy: The Role of Technology and Innovation Management

Authors: Ronald Orth, Johanna Haunschild, Sara Tsog

Abstract:

The bioeconomy is an inter- and cross-disciplinary field covering a large number and wide scope of existing and emerging technologies. It has great potential to contribute to the transformation of the industrial landscape and ultimately drive the economy towards sustainability. However, the bioeconomy per se is not necessarily sustainable, and technology should be seen as an enabler rather than a panacea for all our ecological, social, and economic issues. Therefore, to draw and maximize benefits from the bioeconomy in terms of sustainability, we propose that innovative activities should encompass not only novel technologies and new bio-based materials but also multifocal innovations. For multifocal innovation endeavors, innovation management plays a substantial role, as any innovation emerges in a complex iterative process in which communication and knowledge exchange among relevant stakeholders play a pivotal role. Although knowledge generation and innovation are at the core of the transition towards a more sustainable bio-based economy, to date there is a significant lack of concepts and models that approach the bioeconomy from an innovation management perspective. The aim of this paper is therefore two-fold. First, it inspects the role of a transformative approach in the adoption of a bioeconomy that contributes to environmental, ecological, social, and economic sustainability. Second, it elaborates the importance of technology and innovation management as a tool for a smooth, prompt, and effective transition of firms to the bioeconomy. We conduct a qualitative literature study of the sustainability challenges that the bioeconomy entails thus far, using the Science Citation Index and grey literature, as major economies (e.g., the EU, USA, China, and Brazil) have pledged to adopt the bioeconomy and have released extensive publications on the topic.
We draw an example from the forest-based business sector, which is transforming towards the new green economy more rapidly than expected, although it has a long-established conventional business culture with a consolidated and fully fledged industry. Based on our analysis, we found that a successful transition to a sustainable bioeconomy is conditioned on heterogeneous and contested factors in terms of stakeholders, activities, and modes of innovation. In addition, multifocal innovations occur when actors from interdisciplinary fields engage in intensive and continuous interaction in which the focus of innovation is allocated to a field of mutually evolving socio-technical practices, corresponding to the aims of the novel paradigm of transformative innovation policy. By adopting an integrated, systems approach, and by tapping into various innovation networks and joining global innovation clusters, firms have a better chance of creating an entirely new chain of value-added products and services. This requires professionals with capabilities and skills such as foresight for future markets, the ability to deal with complex issues, the ability to guide responsible R&D, strategic decision-making, and the ability to conduct in-depth innovation system analyses, including value chain analysis. Policy makers, on the other hand, need to acknowledge the essential role of firms in the transformative innovation policy paradigm.

Keywords: bioeconomy, innovation and technology management, multifocal innovation, sustainability, transformative innovation policy

Procedia PDF Downloads 124
29036 Determination of Nanomolar Mercury (II) by Using Multi-Walled Carbon Nanotubes Modified Carbon Zinc/Aluminum Layered Double Hydroxide-3(4-Methoxyphenyl) Propionate Nanocomposite Paste Electrode

Authors: Illyas Md Isa, Sharifah Norain Mohd Sharif, Norhayati Hashim

Abstract:

A mercury(II) sensor was developed using a multi-walled carbon nanotube (MWCNT) paste electrode modified with a Zn/Al layered double hydroxide-3(4-methoxyphenyl) propionate nanocomposite (Zn/Al-HMPP). The optimum conditions observed by cyclic voltammetry were an electrode composition of 2.5% (w/w) Zn/Al-HMPP/MWCNTs, 0.4 M potassium chloride, pH 4.0, and a scan rate of 100 mVs-1. The sensor exhibited a wide linear range, from 1x10-3 M to 1x10-7 M Hg2+ and from 1x10-7 M to 1x10-9 M Hg2+, with a detection limit of 1x10-10 M Hg2+. The high sensitivity of the proposed electrode towards Hg(II) was confirmed by double potential-step chronocoulometry, which yielded a diffusion coefficient of 1.5445x10-9 cm2 s-1, a surface charge of 524.5 µC s-½, and a surface coverage of 4.41x10-2 mol cm-2. The presence of a 25-fold concentration of most metal ions had no influence on the anodic peak current. With characteristics such as high sensitivity, selectivity, and repeatability, the electrode is proposed as an appropriate alternative for the determination of mercury.
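The diffusion coefficient quoted above is the kind of quantity extracted from chronocoulometry via the Anson relation Q(t) = 2nFAc(Dt/pi)^1/2 + Q_dl, where the slope of the charge versus t^1/2 plot yields D. A minimal sketch of that extraction follows; the electron count, electrode area, concentration, and slope are illustrative assumptions, not this paper's experimental values:

```python
import math

F = 96485.0   # Faraday constant, C/mol

def d_from_anson_slope(slope, n, area_cm2, conc_mol_cm3):
    """Diffusion coefficient (cm^2/s) from the Anson-plot slope (C s^-0.5):
    slope = 2 n F A c sqrt(D / pi)  =>  D = pi * (slope / (2 n F A c))^2."""
    return math.pi * (slope / (2 * n * F * area_cm2 * conc_mol_cm3)) ** 2

# Illustrative values: 2-electron process, 0.07 cm^2 electrode, 1 mM analyte
# (1e-6 mol/cm^3), and an assumed Anson slope.
D = d_from_anson_slope(slope=1.9e-5, n=2, area_cm2=0.07, conc_mol_cm3=1e-6)
print(f"{D:.2e} cm^2/s")
```

The double potential-step variant additionally separates the adsorbed contribution (surface coverage) from the intercept difference of the forward and reverse steps, which is how the surface charge and coverage figures above are obtained.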

Keywords: Cyclic voltammetry, Mercury(II), Modified carbon paste electrode, Nanocomposite

Procedia PDF Downloads 431
29035 A Meta-Analysis of School-Based Suicide Prevention for Adolescents and Meta-Regressions of Contextual and Intervention Factors

Authors: E. H. Walsh, J. McMahon, M. P. Herring

Abstract:

Post-primary school-based suicide prevention (PSSP) is a valuable avenue for reducing suicidal behaviours in adolescents. The aims of this meta-analysis and meta-regression were 1) to quantify the effect of PSSP interventions on adolescent suicide ideation (SI) and suicide attempts (SA), and 2) to explore how intervention effects may vary based on important contextual and intervention factors. This study provides further support for the benefits of PSSP by demonstrating lower suicide outcomes in over 30,000 adolescents following PSSP and mental health interventions, and tentatively suggests that intervention effectiveness may vary with intervention factors. The protocol for this study is registered on PROSPERO (ID=CRD42020168883). Population, intervention, comparison, outcomes, and study design (PICOS) criteria defined eligible studies as cluster randomised studies (n=12) containing PSSP and measuring suicide outcomes. The EBSCOhost aggregate databases, Web of Science, and the Cochrane Central Register of Controlled Trials were searched. Cochrane bias tools for cluster randomised studies showed that half of the studies were rated as low risk of bias. Egger's regression test adapted for multi-level modelling indicated that publication bias was not an issue (all ps > .05). Crude and corresponding adjusted pooled log odds ratios (OR) were computed using the metafor package in R, yielding 12 SA and 19 SI effects. Multi-level random-effects models accounting for dependencies of effects from the same study revealed that, in crude models, interventions were significantly associated with 13% (OR=0.87, 95% confidence interval (CI) [0.78,0.96], Q18=15.41, p=0.63) and 34% (OR=0.66, 95%CI [0.47,0.91], Q10=16.31, p=0.13) lower odds of SI and SA, respectively, compared to controls.
Adjusted models showed similar odds reductions of 15% (OR=0.85, 95%CI [0.75,0.95], Q18=10.04, p=0.93) and 28% (OR=0.72, 95%CI [0.59,0.87], Q10=10.46, p=0.49) for SI and SA, respectively. Within-cluster heterogeneity ranged from none to low for SA across crude and adjusted models (0-9%). No heterogeneity was identified for SI across crude and adjusted models (0%). Pre-specified univariate moderator analyses were not significant for SA (all ps > 0.05), although variations in average pooled SA odds reductions across categories of various intervention characteristics were observed, which preliminarily suggests that the effectiveness of interventions may vary across intervention factors. These findings have practical implications for researchers, clinicians, educators, and decision-makers. Further investigation of important logical, theoretical, and empirical moderators of PSSP intervention effectiveness is recommended to establish how and when PSSP interventions best reduce adolescent suicidal behaviour.
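The pooled odds ratios reported above rest on inverse-variance weighting of study-level log odds ratios; a fixed-effect version of that pooling (the multi-level random-effects models in the paper add between-study and within-cluster variance components) can be sketched as follows, with made-up study data for illustration:

```python
import math

def pool_log_odds(effects):
    """Fixed-effect inverse-variance pooling of (log OR, SE) pairs.
    Returns the pooled OR and its 95% confidence interval."""
    weights = [1.0 / se ** 2 for _, se in effects]
    pooled = sum(w * lor for (lor, _), w in zip(effects, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Illustrative studies as (log OR, standard error) pairs -- not the
# meta-analysis data.
studies = [(math.log(0.80), 0.10), (math.log(0.90), 0.15), (math.log(0.75), 0.20)]
or_pooled, (lo, hi) = pool_log_odds(studies)
print(round(or_pooled, 2), round(lo, 2), round(hi, 2))
```

Pooling on the log scale keeps the estimator unbiased under the normal approximation; precise studies (small SE) dominate the weighted mean, and exponentiating at the end returns the result to the OR scale reported in the abstract.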

Keywords: adolescents, contextual factors, post-primary school-based suicide prevention, suicide ideation, suicide attempts

Procedia PDF Downloads 97
29034 Sensitivity Analysis of the Heat Exchanger Design in Net Power Oxy-Combustion Cycle for Carbon Capture

Authors: Hirbod Varasteh, Hamidreza Gohari Darabkhani

Abstract:

Global warming and its impact on climate change is one of the main challenges of the current century. Global warming is mainly due to the emission of greenhouse gases (GHG), and carbon dioxide (CO2) is known to be the major contributor to the GHG emission profile. Whilst the energy sector is the primary source of CO2 emissions, Carbon Capture and Storage (CCS) is believed to be the solution for controlling them. Oxyfuel combustion (oxy-combustion) is one of the major technologies for capturing CO2 from power plants. For gas turbines, several oxy-combustion power cycles (oxyturbine cycles) have been investigated by means of thermodynamic analysis. The NetPower cycle is one of the leading oxyturbine power cycles, with almost full carbon capture capability from a natural-gas-fired power plant. In this manuscript, a sensitivity analysis of the heat exchanger design in the NetPower cycle is completed by means of process modelling. The heat capacity variation and supercritical CO2 with gaseous admixtures are considered in a multi-zone analysis with Aspen Plus software. It is found that the heat exchanger design plays a major role in increasing the efficiency of the NetPower cycle. A pinch-point analysis is done to extract the composite and grand composite curves for the heat exchanger. The relationship between the cycle efficiency and the minimum approach temperature (∆Tmin) of the heat exchanger is also evaluated. An increase in ∆Tmin causes a decrease in the temperature of the recycled flue gases (RFG) and an overall decrease in the power required by the recycle gas compressor. The main challenge in the design of heat exchangers in power plants is the trade-off between capital and operational costs: to achieve a lower ∆Tmin, a larger heat exchanger is required, which means a higher capital cost but better heat recovery and a lower operational cost.
Accordingly, ∆Tmin is selected at the minimum point of the combined capital and operational cost curves. This study provides insight into the NetPower oxy-combustion cycle's performance and operating conditions based on its heat exchanger design.
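The pinch point itself is simply the minimum vertical gap between the hot and cold composite curves when both are expressed over the same cumulative heat-duty axis. A minimal sketch follows; the stream temperature profiles are illustrative, not the NetPower cycle's actual composite curves:

```python
# Hot and cold composite curve temperatures (degrees C) sampled at the same
# cumulative heat-duty points; values are illustrative only.
hot_T  = [600.0, 450.0, 300.0, 150.0, 80.0]
cold_T = [520.0, 420.0, 250.0, 115.0, 40.0]

def pinch(hot, cold):
    """Return (delta_T_min, sample index) over aligned composite-curve samples."""
    gaps = [h - c for h, c in zip(hot, cold)]
    dt_min = min(gaps)
    return dt_min, gaps.index(dt_min)

dt_min, i = pinch(hot_T, cold_T)
print(dt_min, i)  # 30.0 at sample 1 (450 C hot against 420 C cold)
```

In the full analysis the curves are piecewise-linear in duty, so the search also checks the kink points of each segment, but the design logic is the same: the sample where the gap bottoms out is the pinch, and shifting the cold curve changes ∆Tmin, exchanger area, and heat recovery together.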

Keywords: carbon capture and storage, oxy-combustion, netpower cycle, oxy turbine cycles, zero emission, heat exchanger design, supercritical carbon dioxide, oxy-fuel power plant, pinch point analysis

Procedia PDF Downloads 201
29033 Business Domain Modelling Using an Integrated Framework

Authors: Mohammed Hasan Salahat, Stave Wade

Abstract:

This paper presents an application of a “Systematic Soft Domain Driven Design Framework” as a soft systems approach to the domain-driven design of information systems development. The framework combines techniques from Soft Systems Methodology (SSM), the Unified Modeling Language (UML), and an implementation pattern known as ‘Naked Objects’. It has been used in action research projects that involved the investigation and modeling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, SSM is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous work, and a real case study, ‘Information Retrieval System for Academic Research’, is used in this paper to show further practice and evaluation of the framework in a different business domain. We argue that there are advantages to combining and using techniques from different methodologies in this way for business domain modeling. The framework is overviewed and justified as a multi-methodology using Mingers' multi-methodology ideas.

Keywords: SSM, UML, domain-driven design, soft domain-driven design, naked objects, soft language, information retrieval, multimethodology

Procedia PDF Downloads 555
29032 A High Reliable Space-Borne File System with Applications of Device Partition and Intra-Channel Pipeline in Nand Flash

Authors: Xin Li, Ji-Yang Yu, Yue-Hua Niu, Lu-Yuan Wang

Abstract:

As an inevitable link in the space data acquisition chain, space-borne storage systems based on NAND flash have gradually been implemented in spacecraft. Faced with massive, parallel, and varied data on board, efficient data management becomes an important issue in storage research. To meet the requirements of high performance and reliability in NAND flash storage systems, a combined hardware and file system design can drastically increase system dependability, even for missions of very long duration. More sophisticated flash storage concepts with advanced operating systems have been researched to improve the reliability of NAND flash storage on satellites. In this paper, an architecture for a file system with multi-channel data acquisition and storage on board is proposed, which obtains large capacity and high performance through the combination of intra-channel pipelining and device partitioning in NAND flash. Multi-channel data at different rates are stored as independent files in the device partitions with a parallel-storage scheme, which assures high-throughput and reliable file handling. For massive and high-speed data storage, an efficiency assessment model is established to derive the bandwidth formula of the intra-channel pipeline. Information tables held in Magnetoresistive RAM (MRAM) manage the bad blocks in NAND flash and the arrangement of file system addresses for high-reliability data storage. During the full-load test, the throughput of the 3D PLUS Module 160Gb NAND flash reaches 120Mbps for both storage and playback, which efficiently satisfies the requirement of multi-channel data acquisition on the satellite. Compared with previous literature, the experimental results verify the advantages of the proposed system.
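The benefit of intra-channel pipelining can be sketched with a simple bandwidth model: without pipelining, each page pays both the bus-transfer time and the flash program time, while overlapping the two makes only the slower of them the bottleneck. The page size and timings below are illustrative NAND parameters, not the paper's measured figures:

```python
# Illustrative NAND parameters (not the paper's hardware figures).
PAGE_KB   = 8.0      # page size, KiB
T_XFER_US = 40.0     # bus transfer time per page, microseconds
T_PROG_US = 200.0    # flash array program time per page, microseconds

def bandwidth_mbps(page_kb, t_us):
    """Throughput in Mbit/s when one page completes every t_us microseconds."""
    return page_kb * 8 * 1024 / (t_us * 1e-6) / 1e6

# Serial: transfer then program, back to back. Pipelined: the next page's
# transfer overlaps the current page's program, so the max() dominates.
serial    = bandwidth_mbps(PAGE_KB, T_XFER_US + T_PROG_US)
pipelined = bandwidth_mbps(PAGE_KB, max(T_XFER_US, T_PROG_US))
print(round(serial, 1), round(pipelined, 1))
```

Under these assumed timings the overlap raises per-channel throughput by roughly a fifth; combined with device partitioning across channels, this is the kind of gain the efficiency assessment model quantifies.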

Keywords: device partition architecture, intra-channel pipelining, nand flash, parallel storage

Procedia PDF Downloads 288
29031 3D Vision Transformer for Cervical Spine Fracture Detection and Classification

Authors: Obulesh Avuku, Satwik Sunnam, Sri Charan Mohan Janthuka, Keerthi Yalamaddi

Abstract:

In the United States alone, there are over 1.5 million spine fractures per year, resulting in about 17,730 spinal cord injuries. Fractures of the spine occur most frequently in the cervical spine. The prevalence of spinal fractures in the elderly has increased, and in this population fractures may be harder to see on imaging because of coexisting degenerative illness and osteoporosis. Nowadays, computed tomography (CT) has almost completely replaced radiography (x-rays) for the imaging diagnosis of adult spine fractures. To prevent neurologic deterioration and paralysis following trauma, it is vital to detect vertebral fractures as early as possible. Many approaches have been proposed for the classification of the cervical spine using 2D models. In this paper we try to push these bounds by using vision transformers, a state-of-the-art model in image classification: we make the minimal changes possible to the ViT architecture to obtain a 3D-enabled variant, which is evaluated using a weighted multi-label logarithmic loss. The problem statement is taken from a previously held Kaggle competition, RSNA 2022 Cervical Spine Fracture Detection.
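The weighted multi-label logarithmic loss used for evaluation can be sketched as a weighted mean of per-label binary log losses; the label layout and weights below are illustrative assumptions, not the competition's official weighting:

```python
import math

def weighted_multilabel_log_loss(y_true, y_pred, weights, eps=1e-15):
    """Weighted mean of per-label binary log losses.
    y_true: 0/1 labels; y_pred: predicted probabilities; weights: per-label."""
    total, wsum = 0.0, 0.0
    for t, p, w in zip(y_true, y_pred, weights):
        p = min(max(p, eps), 1 - eps)   # clip to avoid log(0)
        total += w * -(t * math.log(p) + (1 - t) * math.log(1 - p))
        wsum += w
    return total / wsum

# Illustrative layout: seven per-vertebra labels plus an overall label that is
# weighted twice as heavily (an assumption for this sketch).
y_true  = [0, 0, 1, 0, 0, 0, 0, 1]
y_pred  = [0.1, 0.1, 0.8, 0.2, 0.1, 0.1, 0.1, 0.7]
weights = [1, 1, 1, 1, 1, 1, 1, 2]
loss = weighted_multilabel_log_loss(y_true, y_pred, weights)
print(round(loss, 4))  # 0.1874
```

Training the 3D ViT against this loss means the model is penalised per label, so confident wrong predictions on heavily weighted labels dominate the score.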

Keywords: cervical spine, spinal fractures, osteoporosis, computed tomography, 2d-models, ViT, multi-label logarithmic loss, Kaggle, public score, private score

Procedia PDF Downloads 108