Search results for: generalized Douglas-Weyl (GDW) metric
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1036


316 Predicting Dose Level and Length of Time for Radiation Exposure Using Gene Expression

Authors: Chao Sima, Shanaz Ghandhi, Sally A. Amundson, Michael L. Bittner, David J. Brenner

Abstract:

In a large-scale radiologic emergency, the potentially affected population needs to be triaged efficiently using various biomarkers, since personal dosimeters are not likely to be worn by the individuals. It has long been established that radiation injury can be estimated effectively using panels of genetic biomarkers. Furthermore, the rate of radiation, in addition to the dose, plays a major role in determining biological responses. A better and more accurate triage therefore involves estimating both the dose level of the exposure and the length of time of that exposure. To that end, a large in vivo study was carried out on mice with the internal emitter caesium-137 (¹³⁷Cs). Four different injection doses of ¹³⁷Cs were used: 157.5 μCi, 191 μCi, 214.5 μCi, and 259 μCi. Cohorts of 6-7 mice from the control arm and each of the dose levels were sacrificed, and blood was collected 2, 3, 5, 7 and 14 days after injection for microarray RNA gene expression analysis. Using a generalized linear model with penalized maximum likelihood, a panel of 244 genes was established, and both the dose of injection and the number of days after injection were accurately predicted for all 155 subjects using this panel. This demonstrates that microarray gene expression can be used effectively in radiation biodosimetry to predict both the dose level and the length of exposure, which provides a more holistic view of radiation exposure and helps improve radiation damage assessment and treatment.
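The multi-response prediction step described above can be sketched with a ridge-penalized multi-output linear regression, a simpler stand-in for the penalized-maximum-likelihood GLM named in the abstract. Everything below is synthetic and illustrative: only the dimensions (155 subjects, 244 genes, two responses) follow the abstract; the data, penalty value, and closed-form solver are assumptions, not the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for expression data: 155 subjects x 244 genes
n_subjects, n_genes = 155, 244
X = rng.normal(size=(n_subjects, n_genes))

# Two responses per subject: injected dose (uCi) and days since injection
true_W = rng.normal(size=(n_genes, 2)) * 0.5
Y = X @ true_W + rng.normal(scale=0.1, size=(n_subjects, 2))

def ridge_multi(X, Y, lam=1.0):
    """Closed-form ridge regression fitting all responses jointly."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)

W_hat = ridge_multi(X, Y, lam=1.0)
Y_hat = X @ W_hat

# Both columns (dose and time) are predicted from the same gene panel
rmse = np.sqrt(((Y - Y_hat) ** 2).mean(axis=0))
print("per-response RMSE:", rmse)
```

In practice, the penalty would be tuned by cross-validation and the panel selected by the penalty shrinking irrelevant genes to zero (as in lasso/elastic-net variants).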

Keywords: caesium-137, gene expression microarray, multivariate responses prediction, radiation biodosimetry

Procedia PDF Downloads 186
315 Effect of PMMA Shield on the Patient Dose Equivalent from Photoneutrons Produced by High Energy Medical Linacs

Authors: Seyed Mehdi Hashemi, Gholamreza Raisali, Mehran Taheri

Abstract:

One of the important problems of using high energy linacs in IMRT is the production of photoneutrons. Besides the clinically useful photon beams, high-energy photon beams from medical linacs produce secondary neutrons. These photoneutrons increase the patient dose and may cause secondary malignancies. This study investigates the effect of a shield on the reduction of the photoneutron dose equivalent produced by a high energy medical linac at the patient plane. To determine the photoneutron dose equivalent received by the patient, a Varian linac operating in 18 MV photon mode was investigated. The photoneutron dose equivalent was measured with 0.25 mm thick polycarbonate (PC) films placed at distances of 0, 10, 20, and 50 cm from the center of the X-ray field on the patient couch. The results show that the photoneutron dose equivalent decreases rapidly with increasing distance from the center of the X-ray beam towards the periphery, for both open and shielded fields, and that inserting the shield in the path of the X-ray beam clearly decreased the photoneutron dose equivalent compared to the open field. The results show that the shield significantly reduces the photoneutron dose equivalent to the patient and can be readily generalized to other models of medical linacs. It may be concluded that using this kind of shield can help make the employment of high energy linacs in radiotherapy and IMRT safer, less expensive, and more efficient.

Keywords: photoneutron, Linac, PMMA shield, equivalent dose

Procedia PDF Downloads 479
314 Budd-Chiari Syndrome: Common Presentation, Rare Disease

Authors: Aadil Khan, Yasser Chomayil, P. P. Venugopalan

Abstract:

Background: Budd-Chiari syndrome is caused by thrombosis of the hepatic veins and/or thrombosis of the intrahepatic or suprahepatic IVC. The etiology remains idiopathic in 16%-35% of cases. Malignancy, rheumatological disorders, myeloproliferative disease, inheritable coagulopathy, infection or a hyperestrogenic state can be identified in many cases. Methodology: Review of the case records of a patient who presented to the Emergency Department of Aster Medcity, Cochin. Introduction: A 17-year-old female presented to the ED with fever, jaundice and abdominal distention of 1 week's duration. On examination: pallor and icterus were present; the abdomen showed gross distension, shifting dullness, and generalized anasarca. USG of the abdomen showed hepatomegaly with a mildly coarse echotexture and moderate to gross ascites. CT of the abdomen and chest showed hepatomegaly with thrombosis of all three hepatic veins and moderate ascites, suggestive of Budd-Chiari syndrome. The patient was taken for catheter-directed venous thrombolysis. A venogram done the next day revealed more than 50% opening of the right hepatic vein. Concurrent Doppler showed colour and Doppler signals in the middle hepatic vein. She gradually improved and was discharged home on anticoagulants and advised regular follow up. Conclusion: Being a rare disease in this young population, a high index of suspicion is required when evaluating young patients with abdominal pain and jaundice.

Keywords: Budd-Chiari syndrome, rare disease, abdominal pain, India

Procedia PDF Downloads 263
313 Rising Velocity of Non-Newtonian Liquids in Capillary Tubes

Authors: Reza Sabbagh, Linda Hasanovich, Aleksey Baldygin, David S. Nobes, Prashant R. Waghmare

Abstract:

The capillary filling process is important to study for numerous applications, such as the underfilling of material in electronic packaging or the seepage of liquid hydrocarbons through porous structures. The approximation of the fluid as Newtonian, i.e., a linear relationship between shear stress and deformation rate, cannot be justified in cases where the extent of non-Newtonian behavior of the liquid governs the surface-driven transport, i.e., capillary action. In this study, the capillary action of a non-Newtonian fluid is not only analyzed, but a modified, generalized theoretical analysis for capillary transport is also proposed. The three commonly observed regimes are identified: surface-force dominant (travelling air-liquid interface), developing flow (viscous forces dominant), and developed (interfacial, inertial and viscous forces comparable). The velocity field in each regime is quantified for Newtonian and non-Newtonian fluids in a square, vertically oriented channel. The theoretical understanding of the capillary imbibition process, particularly in the case of Newtonian fluids, relies on the simplifying assumption of a fully developed velocity profile, which is revisited here to develop a modified theory for the capillary transport of non-Newtonian fluids. Furthermore, the development of the velocity profile from the entrance regime to the developed regime, for different power-law fluids, is also investigated theoretically and experimentally.
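The effect of the power-law index on filling dynamics can be seen by integrating a simplified rise equation. The sketch below is a deliberately reduced model, not the paper's analysis: it assumes fully developed Ostwald-de Waele (power-law) flow in a circular tube in the viscous-dominated regime, neglects inertia and gravity, and uses invented fluid properties. Under those assumptions, the capillary pressure 2*sigma*cos(theta)/R balances the power-law viscous pressure drop, giving dh/dt = (n/(3n+1)) * R * (sigma*cos(theta)/(K*h))^(1/n).

```python
import numpy as np

def capillary_rise(n, K, sigma=0.03, theta=0.0, R=1e-4,
                   h0=1e-4, t_end=1.0, steps=20000):
    """Euler-integrate the rise height h(t) of a power-law fluid
    (consistency K, index n) in a circular capillary of radius R.

    Viscous-dominated balance: capillary pressure drives fully
    developed power-law flow; inertia and gravity are neglected.
    """
    dt = t_end / steps
    h = h0
    heights = [h]
    drive = sigma * np.cos(theta)   # equals tau_w * h at the wall
    for _ in range(steps):
        dhdt = (n / (3 * n + 1)) * R * (drive / (K * h)) ** (1 / n)
        h += dhdt * dt
        heights.append(h)
    return np.array(heights)

# Newtonian (n = 1) vs. shear-thinning (n = 0.7), illustrative K values
h_newt = capillary_rise(n=1.0, K=1e-3)
h_thin = capillary_rise(n=0.7, K=1e-3)
print("final heights (m):", h_newt[-1], h_thin[-1])
```

For n = 1 this recovers the classical Lucas-Washburn h proportional to sqrt(t) behavior; other indices give h proportional to t^(n/(n+1)).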

Keywords: capillary, non-Newtonian flow, shadowgraphy, rising velocity

Procedia PDF Downloads 194
312 The Effect of Aerobic Exercise Training on the Improvement of Nursing Staff's Sleep Quality: A Randomized Controlled Study

Authors: Niu Shu Fen

Abstract:

Sleep disturbance is highly prevalent among shift-working nurses. We aimed to evaluate whether aerobic exercise (i.e., walking combined with jogging) improves objective sleep parameters among female nurses at the end of an 8-week exercise program and 4 weeks after study completion. This single-blinded, parallel-design, randomized controlled trial was conducted in the floor classroom of a would-be medical center in northern Taiwan. Sixty eligible female nurses were randomly assigned to either the aerobic exercise (n = 30) or usual care (n = 30) group. The moderate-intensity aerobic exercise program was performed over 5 days (60 min per day) a week for 8 weeks after work hours. Objective sleep outcomes, including total sleep time (TST), sleep onset latency (SOL), wake after sleep onset (WASO), and sleep efficiency (SE), were retrieved using an Actigraph device. A generalized estimating equation model was used for data analyses. The aerobic exercise group had significant improvements in TST and SE at 4 weeks and 8 weeks compared with the baseline evaluation (TST: B = 70.49 and 55.96, both p < 0.001; SE: B = 5.21 and 3.98, p < 0.001 and 0.002). Significant between-group differences were observed in SOL and WASO at 4 weeks but not 8 weeks compared with the baseline evaluation (SOL: B = −7.18, p = 0.03; WASO: B = −11.38, p = 0.008). The positive lasting effects for TST were observed only until the 4-week follow-up. To improve sleep quality and quantity, we encourage female nurses to regularly perform moderate-intensity aerobic exercise.
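A generalized estimating equation with an independence working correlation reduces to least squares with cluster-robust (sandwich) standard errors, which makes the idea easy to sketch. The data below are entirely synthetic: the group sizes echo the trial (60 nurses), but the visit structure, effect sizes, and noise levels are invented for illustration, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic repeated measures: 60 nurses x 3 visits (baseline, 4 wk, 8 wk)
n_subj, n_visits = 60, 3
subj = np.repeat(np.arange(n_subj), n_visits)
exercise = (np.arange(n_subj) < 30).astype(float)[subj]   # group indicator
visit = np.tile(np.arange(n_visits, dtype=float), n_subj)

# A subject-level random effect induces within-cluster correlation
u = rng.normal(scale=20.0, size=n_subj)[subj]
tst = 380 + 30.0 * exercise * (visit > 0) + u + rng.normal(scale=10.0, size=subj.size)

# Design: intercept, group, visit, group x post-baseline interaction
X = np.column_stack([np.ones_like(tst), exercise, visit, exercise * (visit > 0)])

beta = np.linalg.lstsq(X, tst, rcond=None)[0]   # GEE point estimate (independence)
resid = tst - X @ beta

# Cluster-robust sandwich variance: sum outer products of per-cluster scores
bread = np.linalg.inv(X.T @ X)
meat = np.zeros((4, 4))
for s in range(n_subj):
    idx = subj == s
    g = X[idx].T @ resid[idx]
    meat += np.outer(g, g)
se = np.sqrt(np.diag(bread @ meat @ bread))
print("coef:", beta.round(2))
print("cluster-robust SE:", se.round(2))
```

Production analyses (e.g. statsmodels' GEE) would additionally support exchangeable or autoregressive working correlation structures.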

Keywords: sleep quality, aerobic exercise, nurses, shift work

Procedia PDF Downloads 133
311 Assessing Influence of End-Boundary Conditions on Stability and Second-Order Lateral Stiffness of Beam-Column Elements Embedded in Non-Homogeneous Soil

Authors: Carlos A. Vega-Posada, Jeisson Alejandro Higuita-Villa, Julio C. Saldarriaga-Molina

Abstract:

This paper presents a simplified analytical approach to conduct elastic stability and second-order lateral stiffness analyses of beam-column elements (i.e., piles) with generalized end-boundary conditions embedded in a homogeneous or non-homogeneous Pasternak foundation. The solution is derived using the well-known Differential Transformation Method (DTM), and it consists simply of solving a system of two linear algebraic equations. Using other conventional approaches to solve the governing differential equation of the proposed element can be cumbersome and the solution challenging to implement, especially when the non-homogeneity of the soil is considered. The proposed formulation includes the effects of i) any rotational or lateral transverse spring at the ends of the pile, ii) any external transverse load acting along the pile, iii) soil non-homogeneity, and iv) the second parameter of the elastic foundation (i.e., the shear layer connecting the springs at the top). A parametric study is conducted to investigate the effects of different moduli of subgrade reaction, degrees of non-homogeneity, and intermediate end-boundary conditions on the pile response. The same set of equations can be used to conduct both elastic stability and static analyses. Comprehensive examples are presented to show the simplicity and practicability of the proposed method.
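For intuition about the governing problem, the buckling eigenproblem for a pinned-pinned pile on a two-parameter (Pasternak) foundation, EI w'''' + k(x) w - Gp w'' + P w'' = 0, can be discretized by finite differences. This is a deliberately simplified numerical check with illustrative property values, not the paper's DTM formulation or its generalized boundary conditions.

```python
import numpy as np

def buckling_load(EI=1.0, L=1.0, k=0.0, Gp=0.0, N=200):
    """Smallest buckling load P of a pinned-pinned pile on a Pasternak
    foundation, via finite differences on the N-1 interior nodes.

    Solves (EI*D4 + k*I + Gp*C2) w = P * C2 w, with C2 = -D2.
    k may be a scalar or an array of length N-1 (non-homogeneous soil).
    """
    h = L / N
    n = N - 1
    I = np.eye(n)
    # Second-difference operator C2 = -D2 (positive definite)
    C2 = (2.0 * I - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    # Fourth-difference pentadiagonal operator
    D4 = (6.0 * I - 4.0 * np.eye(n, k=1) - 4.0 * np.eye(n, k=-1)
          + np.eye(n, k=2) + np.eye(n, k=-2)) / h**4
    # Pinned ends: w'' = 0 gives ghost nodes w(-1) = -w(1), w(N+1) = -w(N-1)
    D4[0, 0] -= 1.0 / h**4
    D4[-1, -1] -= 1.0 / h**4
    A = EI * D4 + np.diag(np.ones(n) * k) + Gp * C2
    lam = np.linalg.eigvals(np.linalg.solve(C2, A))
    return float(np.min(lam.real))

P_euler = buckling_load()                        # bare column, exact: pi^2 * EI / L^2
print("Euler load estimate :", P_euler)
print("with Winkler springs:", buckling_load(k=100.0))
print("with shear layer    :", buckling_load(k=100.0, Gp=0.5))
```

With k = Gp = 0 the smallest eigenvalue converges to the Euler load pi^2 EI/L^2; both foundation parameters raise the critical load, as expected.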

Keywords: elastic stability, second-order lateral stiffness, soil-non-homogeneity, pile analysis

Procedia PDF Downloads 200
310 Fintech Credit and Bank Efficiency Two-way Relationship: A Comparison Study Across Country Groupings

Authors: Tan Swee Liang

Abstract:

This paper studies the two-way relationship between fintech credit and banking efficiency using the panel Generalized Method of Moments (GMM) estimation in structural equation modeling (SEM). Banking system efficiency, defined as the ability to produce the existing level of outputs with minimal inputs, is measured using input-oriented data envelopment analysis (DEA), where the whole banking system of an economy is treated as a single DMU. Banks are considered intermediaries between depositors and borrowers, utilizing inputs (deposits and overhead costs) to provide outputs (credit to the private sector and earnings). The interrelationship between fintech credit and bank efficiency is analyzed to determine the impact in different country groupings (ASEAN, Asia and OECD), in particular the banking system's response to fintech credit platforms. Our preliminary results show that banks do respond to the greater pressure caused by fintech platforms by enhancing their efficiency, but differently across the different groups. The author's earlier research identifying ASEAN-5's high bank overhead costs (as a share of total assets) as a determinant of economic growth suggests that expenses may not have been channeled efficiently to income-generating activities. One practical implication of the findings is that policymakers should enable alternative financing, such as fintech credit, as a warning or encouragement for banks to improve their efficiency.
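The input-oriented DEA step can be sketched as a small linear program per DMU (the CCR model): minimize the input contraction factor theta subject to a convex-cone combination of observed DMUs producing at least the evaluated DMU's outputs. The banking-system numbers below are made up for illustration; the paper's actual inputs, outputs, and country samples differ.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical systems: inputs (deposits, overhead), outputs (credit, earnings)
X = np.array([[100.0, 10.0], [120.0, 14.0], [90.0, 12.0], [150.0, 20.0]])
Y = np.array([[80.0, 8.0], [85.0, 9.0], [78.0, 7.5], [90.0, 9.5]])

def dea_input_oriented(X, Y, j0):
    """CCR efficiency of DMU j0: min theta s.t. lam@X <= theta*x0, lam@Y >= y0."""
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: [theta, lam_1 .. lam_n]
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Input rows:  sum_j lam_j X[j,i] - theta * X[j0,i] <= 0
    A_in = np.hstack([-X[j0].reshape(-1, 1), X.T])
    # Output rows: -sum_j lam_j Y[j,r] <= -Y[j0,r]
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[j0]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

scores = [dea_input_oriented(X, Y, j) for j in range(len(X))]
print("efficiency scores:", np.round(scores, 3))
```

Efficient DMUs score 1; inefficient ones score below 1, the fraction of their inputs that would suffice on the efficient frontier.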

Keywords: fintech lending, banking efficiency, data envelopment analysis, structural equation modeling

Procedia PDF Downloads 76
309 Evaluation of Video Quality Metrics and Performance Comparison on Contents Taken from Most Commonly Used Devices

Authors: Pratik Dhabal Deo, Manoj P.

Abstract:

With the increasing number of social media users, the amount of video content available has also significantly increased. The number of smartphone users is currently at its peak, and many increasingly use their smartphones as their main photography and recording devices. There have been many developments in the field of Video Quality Assessment (VQA), and metrics such as VMAF and SSIM are said to be among the best performing, but the evaluation of these metrics is predominantly done on professionally produced video content using professional tools, lighting conditions, etc. No study has specifically examined the performance of these metrics on content taken by users with commonly available devices. Datasets containing a huge number of videos from different high-end devices make it difficult to analyze the performance of the metrics on content from the most used devices, even when they include content taken in poor lighting conditions on lower-end devices; such devices suffer many distortions because the spectrum of content recorded on them is huge. In this paper, we present an analysis of objective VQA metrics, focusing on full-reference metrics, on content taken only from the most used devices. To carry out this research, we created a custom dataset containing a total of 90 videos taken from the three most commonly used devices: an Android smartphone, an iOS smartphone and a DSLR. To the videos taken on each of these devices, the six most common types of distortion that users encounter were applied, in addition to the already existing H.264 compression, based on four reference videos. Each of the six distortions has three levels of degradation. The five most popular VQA metrics were evaluated on this dataset, and the highest and lowest values of each metric on the distortions were recorded. Blur was found to be the artifact on which most of the metrics did not perform well. To understand the results better, the amount of blur in the dataset was calculated, and an additional evaluation of the metrics was done using the HEVC codec, the successor of H.264 compression, on the camera that proved to be the sharpest among the devices. The results show that as the resolution increases, the performance of the metrics tends to become more accurate. The best performing metric is VQM, with very few inconsistencies and inaccurate results when the compression applied is H.264; when the compression applied is HEVC, SSIM and VMAF performed significantly better.
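Full-reference comparison of the kind evaluated here can be illustrated with PSNR and a simplified SSIM. Note the hedge: standard SSIM averages over local Gaussian windows, while the sketch below uses a single global window, and the test frames are random noise images, not real video content.

```python
import numpy as np

def psnr(ref, dist, peak=255.0):
    """Peak signal-to-noise ratio (dB) between reference and distorted frames."""
    mse = np.mean((ref.astype(float) - dist.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def ssim_global(ref, dist, peak=255.0):
    """Single-window SSIM over the whole frame (real SSIM averages local windows)."""
    x, y = ref.astype(float), dist.astype(float)
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(float)
mild = np.clip(ref + rng.normal(scale=5.0, size=ref.shape), 0, 255)
harsh = np.clip(ref + rng.normal(scale=25.0, size=ref.shape), 0, 255)
print("PSNR mild/harsh:", round(psnr(ref, mild), 2), round(psnr(ref, harsh), 2))
print("SSIM mild/harsh:", round(ssim_global(ref, mild), 3), round(ssim_global(ref, harsh), 3))
```

A usable metric must at minimum rank the harsher distortion lower, which is exactly the monotonicity the paper probes across distortion levels.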

Keywords: distortion, metrics, performance, resolution, video quality assessment

Procedia PDF Downloads 193
308 Tenants Use Less Input on Rented Plots: Evidence from Northern Ethiopia

Authors: Desta Brhanu Gebrehiwot

Abstract:

The study investigates the impact of land tenure arrangements on fertilizer use per hectare in Northern Ethiopia, using household- and plot-level data. Land tenure contracts such as sharecropping and fixed-rent arrangements are endogenous: different unobservable characteristics may affect renting-out decisions. Thus, the appropriate method of analysis is instrumental variable estimation. The family of instrumental variable estimators used includes two-stage least squares (2SLS), the generalized method of moments (GMM), limited information maximum likelihood (LIML), and instrumental variable Tobit (IV-Tobit). In addition, a two-step method for handling a binary endogenous variable is applied: in the first step, a probit model includes the instruments, and in the second step maximum likelihood estimation (MLE) is used (the "etregress" command in Stata 14). Fertilizer use per hectare was lower on sharecropped and fixed-rent plots relative to owner-operated plots, a result that supports the Marshallian inefficiency principle in sharecropping. The difference in fertilizer use per hectare could be explained by a lack of incentivized, detailed contract forms: because most sharecropping arrangements share output equally between tenants and landlords, giving a larger share of the output to the tenant would motivate the use of more fertilizer on rented plots to maximize production.
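The endogeneity problem and its instrumental-variable fix can be sketched with a textbook 2SLS estimator on simulated data. All numbers below are invented: an unobserved "ability" drives both the tenure decision and fertilizer use, biasing OLS, while an instrument that shifts tenure only recovers the true effect. This illustrates the method, not the study's actual instruments or estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Unobserved household quality drives both renting-out and fertilizer use
ability = rng.normal(size=n)
z = rng.normal(size=n)                     # instrument: shifts tenure only
tenure = (0.8 * z + ability + rng.normal(size=n) > 0).astype(float)
fert = 2.0 - 1.0 * tenure + 1.5 * ability + rng.normal(size=n)  # true effect: -1.0

X = np.column_stack([np.ones(n), tenure])
Z = np.column_stack([np.ones(n), z])

# Naive OLS: biased upward because ability raises both tenure and fertilizer
beta_ols = np.linalg.lstsq(X, fert, rcond=None)[0]

# 2SLS: project the endogenous regressor on the instruments, then regress
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
beta_2sls = np.linalg.lstsq(X_hat, fert, rcond=None)[0]

print("OLS tenure effect :", round(beta_ols[1], 2))
print("2SLS tenure effect:", round(beta_2sls[1], 2))
```

GMM and LIML would use the same moment conditions with different weighting; in the just-identified case above, they all coincide with 2SLS.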

Keywords: tenure-contracts, endogeneity, plot-level data, Ethiopia, fertilizer

Procedia PDF Downloads 72
307 Waterborne Platooning: Cost and Logistic Analysis of Vessel Trains

Authors: Alina P. Colling, Robert G. Hekkenberg

Abstract:

Recent years have seen extensive technological advancement in truck platooning, as reflected in the literature. Its main benefits are the improvement of traffic stability and the reduction of air drag, resulting in less fuel consumption in comparison to using individual trucks. Platooning is now being adapted to the waterborne transport sector in the NOVIMAR project through the development of a Vessel Train (VT) concept. The main focus of VTs, as opposed to truck platoons, is the decrease in manning on board, ultimately working towards autonomous vessel operations. This crew reduction can prove to be an important selling point in achieving economic competitiveness of the waterborne approach compared to alternative modes of transport. This paper discusses the expected benefits and drawbacks of the VT concept in terms of technical logistic performance and generalized costs. More specifically, VTs can provide flexibility in destination choices for shippers but also add complexity when performing special manoeuvres in VT formation. In order to quantify the costs and performance, a model is developed and simulations are carried out for various case studies. These compare the application of VTs in short sea and inland water transport, with specific sailing regimes and technologies installed on board to allow different levels of autonomy. The results enable the identification of the most important boundary conditions for the successful operation of the waterborne platooning concept. These findings serve as a framework for future business applications of the VT.

Keywords: autonomous vessels, NOVIMAR, vessel trains, waterborne platooning

Procedia PDF Downloads 207
306 The Impact of Human Intervention on Net Primary Productivity for the South-Central Zone of Chile

Authors: Yannay Casas-Ledon, Cinthya A. Andrade, Camila E. Salazar, Mauricio Aguayo

Abstract:

The sustainable management of available natural resources is a crucial question for policy-makers, economists, and the research community. Among these resources, land is one of the most critical, being intensively appropriated by human activities that produce ecological stresses and reduce ecosystem services. In this context, net primary production (NPP) has been considered a feasible proxy indicator for estimating the impact of human intervention on land-use intensity. Accordingly, the human appropriation of NPP (HANPP) was calculated for the south-central regions of Chile between 2007 and 2014. HANPP was defined as the difference between the potential NPP of the naturally produced vegetation (NPP0, i.e., the vegetation that would exist without any human interference) and the NPP remaining in the field after harvest (NPPeco), expressed in gC/m² yr. Other NPP flows taken into account in the HANPP estimation were the harvested NPP (NPPh) and the losses of NPP through land conversion (NPPluc). The ArcGIS 10.4 software was used for assessing the spatial and temporal HANPP changes. HANPP as a percentage of NPP0 was estimated for each landcover type, with 2007 and 2014 as the reference years. The spatial results depicted a negative impact on land-use efficiency between 2007 and 2014, showing negative HANPP changes for the whole region. The harvest and land-conversion components are the leading causes of the loss of land-use efficiency. Furthermore, the study found a higher HANPP in 2014 than in 2007, representing 50% of NPP0 across all landcover classes relative to 2007. This performance was mainly related to the larger volume of biomass harvested for agriculture; consequently, cropland showed the highest HANPP, followed by plantations, highlighting the strong connection with the economic activities developed in the region. This finding constitutes the basis for a better understanding of the main driving forces influencing biomass productivity and a powerful metric for supporting the sustainable management of land use.
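Once the NPP flows are on a common grid, the HANPP accounting defined above is simple arithmetic: HANPP = NPP0 − NPPeco, where NPPeco = NPP0 − NPPh − NPPluc. The per-cover values below are invented for illustration only; they are not the Chilean estimates.

```python
import numpy as np

# Illustrative NPP flows in gC/m^2/yr for three land-cover classes
covers  = ["cropland", "plantation", "native forest"]
npp0    = np.array([900.0, 1000.0, 1100.0])  # potential NPP of natural vegetation
npp_h   = np.array([450.0,  300.0,    0.0])  # harvested biomass
npp_luc = np.array([150.0,  200.0,    0.0])  # losses through land conversion
npp_eco = npp0 - npp_h - npp_luc             # NPP remaining in the ecosystem

hanpp = npp0 - npp_eco                       # equals npp_h + npp_luc
hanpp_pct = 100.0 * hanpp / npp0             # HANPP as % of NPP0

for c, p in zip(covers, hanpp_pct):
    print(f"{c:>13}: HANPP = {p:.1f}% of NPP0")
```

In a GIS workflow the same arithmetic is applied cell by cell and then aggregated by land-cover class.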

Keywords: human appropriation, land-use changes, land-use impact, net primary productivity

Procedia PDF Downloads 126
305 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers

Authors: Pietro D'Ambrosio, Roberta D'Ambrosio

Abstract:

The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and refunctionalization of the structures. Such interventions have complexities linked to the need to take into account the urban and social context in which the structure is inserted, as well as its intrinsic characteristics, such as historical and artistic value. To these, of course, must be added the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole re-socialization project. Particular difficulties are encountered during the pre-assessment of costs, since it is often impossible to perform analytical surveys and structural tests, both because of the structural conditions and because of obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonably reliable economic evaluations of the interventions to be carried out. In addition, the specific features of the approach, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to yield, as an indirect result, a shared database that can be used on a generalized basis to estimate other, similar projects. This makes the methodology particularly suitable in cases where massive interventions across entire areas of historic city centers are expected. The methodology has been partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.

Keywords: evaluation, methodology, restoration, reuse

Procedia PDF Downloads 167
304 Dual Duality for Unifying Spacetime and Internal Symmetry

Authors: David C. Ni

Abstract:

Current efforts toward a Grand Unification Theory (GUT) can be classified into General Relativity, Quantum Mechanics, String Theory and the related formalisms. The geometric approaches that extend General Relativity seek to establish global and local invariance embedded into metric formalisms, whereby additional dimensions are constructed to unify canonical formulations, such as the Hamiltonian and Lagrangian formulations. The approaches extending Quantum Mechanics adopt the symmetry principle to formulate algebra-group theories, which evolved from the Maxwell formulation to the Yang-Mills non-abelian gauge formulation and thereafter manifested the Standard Model. This thread of efforts has been constructing super-symmetry for mapping fermion and boson as well as gluon and graviton. The efforts of String Theory are currently evolving toward so-called gauge/gravity correspondence, particularly the equivalence between type IIB string theory compactified on AdS5 × S5 and N = 4 supersymmetric Yang-Mills theory. Other efforts adopt cross-breeding approaches of the above three formalisms as well as competing formalisms; nevertheless, the related symmetries, dualities, and correspondences are outlined as principles and techniques, even though these terminologies are defined diversely and often generally coined as duality. In this paper, we first classify these dualities from the perspective of physics and then examine the hierarchical structure of the classes from a mathematical perspective, referring to the Coleman-Mandula theorem, Hidden Local Symmetry, Groupoid-Categorization and others. Based on the Fundamental Theorem of Algebra, we argue that, rather than imposing effective constraints on different algebras and the related extensions, which are mainly constructed by self-breeding or self-mapping methodologies for sustaining invariance, a new duality should be added: momentum-angular momentum duality, at the level of electromagnetic duality, for rationalizing the duality algebras. We then characterize this duality numerically in an attempt to address some unsolved problems in physics and astrophysics.

Keywords: general relativity, quantum mechanics, string theory, duality, symmetry, correspondence, algebra, momentum-angular-momentum

Procedia PDF Downloads 387
303 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain

Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma

Abstract:

In this paper, we propose an implementation of optimization-based Extreme Learning Machine (ELM) for watermarking the B-channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, works on generalized single-hidden-layer feed-forward neural networks (SLFNs); however, the hidden layer parameters, generally called the feature mapping in the context of ELM, need not be tuned every time. This paper shows the embedding and extraction processes of the watermark with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and DWT is applied to each block to transform it into the low-frequency sub-band domain. Essentially, ELM provides a unified learning platform with a feature mapping, that is, a mapping between the hidden layer and the output layer of SLFNs, which is applied for watermark embedding and extraction in a cover image. ELM has widespread applications, ranging from binary and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance with very low complexity. The efficacy of the optimization-based ELM algorithm is measured using quantitative and qualitative parameters on a watermarked image, even when the image is subjected to different types of geometrical and conventional attacks.
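The core of ELM is that only the output weights are fit, in closed form, on top of a random, untuned hidden layer. The sketch below learns a toy function rather than watermark bits, and the layer size, activation, and regularization constant C are illustrative choices, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

def elm_train(X, T, n_hidden=200, C=1e3, rng=rng):
    """Regularized ELM: random hidden layer, closed-form output weights."""
    d = X.shape[1]
    W = rng.normal(size=(d, n_hidden))   # random input weights, never tuned
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)               # hidden-layer feature mapping
    # beta = (H'H + I/C)^-1 H'T  (ridge-regularized least squares)
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression target standing in for watermark-bit regression
X = rng.uniform(-1, 1, size=(500, 2))
T = np.sin(np.pi * X[:, 0]) * X[:, 1]

W, b, beta = elm_train(X, T)
err = np.sqrt(np.mean((elm_predict(X, W, b, beta) - T) ** 2))
print("training RMSE:", round(err, 4))
```

Because training is a single linear solve, ELM avoids the iterative optimization that makes SVM-style training costly, which is the complexity argument the abstract makes.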

Keywords: BER, DWT, extreme learning machine (ELM), PSNR

Procedia PDF Downloads 299
302 Investigating the Determinants and Growth of Financial Technology Depth of Penetration among the Heterogeneous Africa Economies

Authors: Tochukwu Timothy Okoli, Devi Datt Tewari

Abstract:

The high rate of fintech adoption has not translated into greater financial inclusion and development in Africa. This problem is attributed to poor fintech diversification and usefulness in the continent, a concept referred to in this study as the fintech depth of penetration. The study therefore assessed its determinants and growth process in a panel of three emerging, twenty-four frontier and five fragile African economies, disaggregated with dummies, over the period 2004-2018 to allow for heterogeneity between groups. The system Generalized Method of Moments (GMM) technique reveals that the average depth of mobile banking and automated teller machine (ATM) use is a dynamic, heterogeneous process. Moreover, users' previous experience/compatibility, trialability/income, and financial development were the major factors that raise its usefulness, whereas perceived risk, financial openness, and the inflation rate significantly limit it. The growth rates of mobile banking, ATM, and Internet banking in 2018 are, on average, 41.82, 0.4, and 20.8 per cent greater, respectively, than their average rates in 2004. These greater averages after the 2009 financial crisis suggest that countries resort to fintech as a risk-mitigating tool. This study therefore recommends greater fintech diversification through improved literacy, institutional development, financial liberalization, and continuous innovation.

Keywords: depth of fintech, emerging Africa, financial technology, internet banking, mobile banking

Procedia PDF Downloads 118
301 Leveraging Natural Language Processing for Legal Artificial Intelligence: A Longformer Approach for Taiwanese Legal Cases

Authors: Hsin Lee, Hsuan Lee

Abstract:

Legal artificial intelligence (LegalAI) has seen increasing application within legal systems, propelled by advancements in natural language processing (NLP). Compared with general documents, legal case documents are typically long text sequences with intrinsic logical structures, and most existing language models have difficulty understanding the long-distance dependencies between different structures. Another unique challenge is that, while the Judiciary of Taiwan has released legal judgments from various levels of courts over the years, there remains a significant obstacle in the lack of labeled datasets. This deficiency makes it difficult to train models with strong generalization capabilities, as well as to accurately evaluate model performance; to date, models in Taiwan have yet to be specifically trained on judgment data. Given these challenges, this research proposes a Longformer-based pre-trained language model explicitly devised for retrieving similar judgments in Taiwanese legal documents. The model is trained on a self-constructed dataset, which this research has independently labeled to measure judgment similarities, thereby addressing the void left by the lack of an existing labeled dataset for Taiwanese judgments. This research adopts strategies such as early stopping and gradient clipping to prevent overfitting and manage gradient explosion, respectively, thereby enhancing the model's performance. The model is evaluated using both the dataset and the Average Entropy of Offense-charged Clustering (AEOC) metric, which utilizes the notion of similar case scenarios within the same type of legal cases. Our experimental results illustrate the model's significant advancements in handling similarity comparisons within extensive legal judgments.
By enabling more efficient retrieval and analysis of legal case documents, our model holds the potential to facilitate legal research, aid legal decision-making, and contribute to the further development of LegalAI in Taiwan.
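Once judgments are embedded by a pre-trained model, retrieving similar cases reduces to nearest-neighbour search in embedding space. The vectors below are random stand-ins for real Longformer embeddings (the 768 dimension is merely a typical transformer hidden size, an assumption, not a detail from the paper).

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-ins for pooled Longformer embeddings of 1000 judgments
corpus = rng.normal(size=(1000, 768))
query = corpus[42] + 0.1 * rng.normal(size=768)   # near-duplicate of judgment 42

def top_k_similar(query, corpus, k=5):
    """Rank judgments by cosine similarity to the query embedding."""
    qn = query / np.linalg.norm(query)
    cn = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    sims = cn @ qn
    top = np.argsort(-sims)[:k]
    return top, sims[top]

idx, sims = top_k_similar(query, corpus)
print("most similar judgment ids:", idx)
```

A real system would replace the random matrix with embeddings produced by the fine-tuned model and typically use an approximate-nearest-neighbour index at corpus scale.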

Keywords: legal artificial intelligence, computation and language, language model, Taiwanese legal cases

Procedia PDF Downloads 62
300 Towards the Need of Resilient Design and Its Assessment in South China

Authors: Alan Lai, Wilson Yik

Abstract:

With rapid urbanization, there has been a dramatic increase in global urban population in Asia and over half of population in Asia will live in urban regions in the near future. Facing with increasing exposure to climate-related stresses and shocks, most of the Asian cities will very likely to experience more frequent heat waves and flooding with rising sea levels, particularly the coastal cities will grapple for intense typhoons and storm surges. These climate changes have severe impacts in urban areas at the costs of infrastructure and population, for example, human health, wellbeing and high risks of dengue fever, malaria and diarrheal disease. With the increasing prominence of adaptation to climate changes, there have been changes in corresponding policies. Smaller cities have greater potentials for integrating the concept of resilience into their infrastructure as well as keeping pace with their rapid growths in population. It is therefore important to explore the potentials of Asian cities adapting to climate change and the opportunities of building climate resilience in urban planning and building design. Furthermore, previous studies have mainly attempted at exploiting the potential of resilience on a macro-level within urban planning rather than that on micro-level within the context of individual building. The resilience of individual building as a research field has not yet been much explored. Nonetheless, recent studies define that the resilience of an individual building is the one which is able to respond to physical damage and recover from such damage in a quickly and cost-effectively manner, while maintain its primary functions. There is also a need to develop an assessment tool to evaluate the resilience on building scale which is still largely uninvestigated as it should be regarded as a basic function of a building. 
Due to the lack of literature reporting metrics for assessing building resilience together with sustainability, the research is designed as a case study to provide insight into the issue. The aim of this research project is to encourage and assist in developing neighborhood climate resilience design strategies for Hong Kong, so as to bridge the gap between different scales and between theory and practice.

Keywords: resilience cities, building resilience, resilient buildings and infrastructure, climate resilience, hot and humid southeast area, high-density cities

Procedia PDF Downloads 154
299 An Investigation of Performance Versus Security in Cognitive Radio Networks with Supporting Cloud Platforms

Authors: Kurniawan D. Irianto, Demetres D. Kouvatsos

Abstract:

The growth of wireless devices affects the availability of limited frequencies, or spectrum bands, since spectrum is a natural resource that cannot be expanded. Many studies of spectrum availability have been carried out, and they show that licensed frequencies are idle most of the time. Cognitive radio is one solution to this problem. Cognitive radio is a promising technology that allows unlicensed users, known as secondary users (SUs), to access licensed bands without causing interference to licensed users, or primary users (PUs). As cloud computing has become popular in recent years, cognitive radio networks (CRNs) can be integrated with cloud platforms. One important issue in CRNs is security. It is a problem because CRNs use radio frequencies as the transmission medium and therefore share the same issues as wireless communication systems. Another critical issue in CRNs is performance. Security has an adverse effect on performance, and there are trade-offs between them. The goal of this paper is to investigate the performance versus security trade-off in CRNs with supporting cloud platforms. Furthermore, queueing network models with preemptive-resume and preemptive-repeat-identical priority are applied in this project to measure the impact of security on performance in CRNs with and without a cloud platform. The generalized exponential (GE) type distribution is used to reflect the bursty inter-arrival and service times at the servers. The results show that the best performance is obtained when security is disabled and the cloud platform is enabled.
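For background, a GE-type random variable can be drawn as a two-phase mixture. The sketch below uses one common parameterization (a hypothetical helper, not the paper's code): an inter-event time is zero with probability 1 − p (a batch arrival) and exponential otherwise, with p = 2/(C² + 1), which reproduces the requested mean and squared coefficient of variation.

```python
import random

rng = random.Random(42)

def ge_sample(mean, scv):
    """Draw one inter-event time from a GE distribution with the given
    mean and squared coefficient of variation (scv >= 1).
    One common parameterization: with probability 1 - p the inter-event
    time is zero (a batch arrival); otherwise it is exponential with
    rate p / mean, where p = 2 / (scv + 1)."""
    p = 2.0 / (scv + 1.0)
    if rng.random() >= p:
        return 0.0
    return rng.expovariate(p / mean)

# Sanity check: the sample mean should approach the requested mean.
samples = [ge_sample(2.0, 4.0) for _ in range(200000)]
est_mean = sum(samples) / len(samples)
```

The zero-time branch is what makes GE arrivals bursty: customers arrive in geometrically sized batches separated by exponential gaps.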

Keywords: performance vs. security, cognitive radio networks, cloud platforms, GE-type distribution

Procedia PDF Downloads 333
298 Micro-Channel Flows Simulation Based on Nonlinear Coupled Constitutive Model

Authors: Qijiao He

Abstract:

Micro-Electro-Mechanical Systems (MEMS) are one of the most rapidly developing frontier research fields, in both theory and applied technology. The micro-channel is a key component of MEMS. With the research and development of MEMS, the sizes of micro-devices and micro-channels become ever smaller. Compared with macroscale flow, the flow characteristics of gas in a micro-channel change, and the rarefaction effect appears clearly. However, for rarefied gas and microscale flows, the Navier-Stokes-Fourier (NSF) equations are no longer appropriate due to the breakdown of the continuum hypothesis. A Nonlinear Coupled Constitutive Model (NCCM) has been derived from the Boltzmann equation to describe the characteristics of both continuum and rarefied gas flows. We apply the present scheme to simulate continuum and rarefied gas flows in a micro-channel structure. For comparison, we also apply other widely used methods based on particle simulation or direct solution of the distribution function, namely Direct Simulation Monte Carlo (DSMC), the Unified Gas-Kinetic Scheme (UGKS) and the Lattice Boltzmann Method (LBM). The results show that the present solution is in better agreement with the experimental data and with the DSMC, UGKS and LBM results than the NSF results in rarefied cases, and in good agreement with the NSF results in continuum cases. Some characteristics of both continuum and rarefied gas flows are observed and analyzed.
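The breakdown of the continuum hypothesis that motivates the NCCM is usually quantified by the Knudsen number Kn = λ/L (mean free path over characteristic length). A minimal sketch with the commonly quoted regime boundaries (exact thresholds vary across the literature, so treat them as illustrative):

```python
def flow_regime(mean_free_path, channel_length):
    """Classify a gas flow by Knudsen number Kn = lambda / L.
    The regime boundaries below are the commonly quoted ones;
    exact thresholds vary somewhat across the literature."""
    kn = mean_free_path / channel_length
    if kn < 1e-3:
        return kn, "continuum (NSF valid)"
    if kn < 0.1:
        return kn, "slip flow"
    if kn < 10.0:
        return kn, "transition (NCCM/DSMC/UGKS needed)"
    return kn, "free molecular"

# Air at ambient conditions (~68 nm mean free path) in a 1 mm duct
# versus a 1 micron micro-channel:
kn_macro, regime_macro = flow_regime(68e-9, 1e-3)
kn_micro, regime_micro = flow_regime(68e-9, 1e-6)
```

Shrinking the channel from millimeters to microns moves the same gas from the continuum regime into slip flow, which is exactly where NSF begins to fail.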

Keywords: continuum and rarefied gas flows, discontinuous Galerkin method, generalized hydrodynamic equations, numerical simulation

Procedia PDF Downloads 155
297 A Varicella Outbreak in a Highly Vaccinated School Population in the Voluntary 2-Dose Era in Beijing, China

Authors: Chengbin Wang, Li Lu, Luodan Suo, Qinghai Wang, Fan Yang, Xu Wang, Mona Marin

Abstract:

Background: Two-dose varicella vaccination has been recommended in Beijing since November 2012. We investigated a varicella outbreak in a highly vaccinated elementary school population to examine transmission patterns and risk factors for vaccine failure. Methods: A varicella case was defined as an acute generalized maculopapulovesicular rash without other apparent cause in a student attending the school from March 28 to May 17, 2015. Breakthrough varicella was defined as varicella >42 days after last vaccine dose. Vaccination information was collected from immunization records. Information on prior disease and clinical presentation was collected via survey of students’ parents. Results: Of the 1056 school students, 1028 (97.3%) reported no varicella history, of whom 364 (35.4%) had received 1-dose and 650 (63.2%) had received 2-dose varicella vaccine, for 98.6% school-wide vaccination coverage with ≥ 1 dose before the outbreak. A total of 20 cases were identified for an overall attack rate of 1.9%. The index case was in a 2-dose vaccinated student who was not isolated. The majority of cases were breakthrough (19/20, 95%) with attack rates of 7.1% (1/14), 1.6% (6/364) and 2.0% (13/650) among unvaccinated, 1-dose, and 2-dose students, respectively. Most cases had < 50 lesions (18/20, 90%). No difference was found between 1-dose and 2-dose breakthrough cases in disease severity or sociodemographic factors. Conclusion: Moderate 2-dose varicella vaccine coverage was insufficient to prevent a varicella outbreak. Two-dose breakthrough varicella is still contagious. High 2-dose varicella vaccine coverage and timely isolation of ill persons might be needed for varicella outbreak control in the 2-dose era.
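The attack rates quoted above follow directly from the case and cohort counts in the abstract; a minimal arithmetic check:

```python
def attack_rate(cases, population):
    """Attack rate as a percentage of the susceptible group."""
    return 100.0 * cases / population

# Figures from the abstract: cases / susceptible students by vaccination status.
overall = attack_rate(20, 1056)       # all students
unvaccinated = attack_rate(1, 14)     # no varicella vaccine
one_dose = attack_rate(6, 364)        # 1-dose recipients
two_dose = attack_rate(13, 650)       # 2-dose recipients
```

Note that the unvaccinated group, though tiny, has the highest attack rate, while breakthrough cases dominate in absolute numbers simply because vaccinated students make up nearly the whole school.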

Keywords: varicella, outbreak, breakthrough varicella, vaccination

Procedia PDF Downloads 317
296 Preliminary Experience in Multiple Green Health Hospital Construction

Authors: Ming-Jyh Chen, Wen-Ming Huang, Yi-Chu Liu, Li-Hui Yang

Abstract:

Introduction: Social responsibility is the key to sustainable organizational development. Under the Green Health Hospital Declaration signed by our superintendent, we have launched comprehensive energy conservation management in medical services, the community, and staff life. To pursue environment-friendly promotion with robust strategies, we are building a low-carbon medical system and community, promoting smart green public construction, and intensifying energy conservation education and communication. Purpose/Methods: With the support of the board and the superintendent, we established an energy management team, commencing with an environment-friendly system, management, education, and the ISO 50001 energy management system; we have improved energy performance and efficiency and continue to do so. Results: In 2021, we achieved multiple goals. The energy management system efficiently controls diesel, natural gas, and electricity usage; about 5% of consumption was saved in 2021 compared with 2018. Our institution develops intelligent services and promotes various paperless electronic operations to provide people with a vibrant and environmentally friendly lifestyle; the goal is to save 68.6% on printing and photocopying by reducing paper use by 35.15 million sheets yearly. We strengthen the concept of waste classification for environmental protection among colleagues: in the past two years, the amount of resources recycled has exceeded 650 tons, the resource recycling rate has reached 70%, and waste recycling grows by about 28 metric tons annually. 
Conclusions: To build a green medical system that is "high efficacy, high value, low carbon, low reliance," we take energy stewardship, economic prosperity, and social responsibility as our principles when formulating energy conservation management strategies: converting limited resources to efficient use, developing clean energy, and continuing toward sustainable energy.

Keywords: energy efficiency, environmental education, green hospital, sustainable development

Procedia PDF Downloads 67
295 Application of Nonparametric Geographically Weighted Regression to Evaluate the Unemployment Rate in East Java

Authors: Sifriyani Sifriyani, I Nyoman Budiantara, Sri Haryatmi, Gunardi Gunardi

Abstract:

East Java Province ranks first in Indonesia in its number of regencies and cities and has the largest population. In 2015, the population reached 38,847,561, a figure that reflects very high population growth. High population growth is feared to increase levels of unemployment. In this study, the researchers mapped and modeled the unemployment rate with six variables presumed to influence it. Modeling was done by the nonparametric geographically weighted regression method with a truncated spline approach. This method was chosen because the spline method is flexible: such models tend to find their own estimation from the data. The modeling involves knot points, the points at which the behavior of the data changes. The optimal knot points were selected by minimizing the Generalized Cross Validation (GCV) value. Based on the research, six variables were found to affect the level of unemployment in East Java: the percentage of the population educated above high school level, the rate of economic growth, the population density, the ratio of investment to the total labor force, the regional minimum wage, and the ratio of large- and medium-scale industry to the work force. The nonparametric geographically weighted regression model with the truncated spline approach had a coefficient of determination of 98.95% and an MSE of 0.0047.
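The knot-selection idea, minimizing GCV over candidate knots, can be sketched for an ordinary (non-geographically-weighted) truncated linear spline; the cited study applies a weighted, multi-knot version of the same criterion:

```python
import numpy as np

def gcv_for_knot(x, y, knot):
    """Fit a truncated linear spline with a single knot by least squares
    and return the Generalized Cross Validation score
    GCV = n * RSS / (n - tr(H))^2, where H is the hat matrix.
    Illustrative sketch only."""
    X = np.column_stack([np.ones_like(x), x, np.maximum(x - knot, 0.0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    rss = float(resid @ resid)
    df = np.linalg.matrix_rank(X)   # tr(H) = number of independent columns
    n = len(y)
    return n * rss / (n - df) ** 2

# Data with a true slope change at x = 5: GCV should be minimized near it.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.where(x < 5, x, 5 + 3 * (x - 5)) + rng.normal(0, 0.1, x.size)
best_knot = min(np.linspace(1, 9, 81), key=lambda k: gcv_for_knot(x, y, k))
```

Sweeping the candidate knots and keeping the GCV minimizer recovers the location where the data's slope changes, which is the role the knot points play in the abstract.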

Keywords: East Java, nonparametric geographically weighted regression, spatial, spline approach, unemployment rate

Procedia PDF Downloads 307
294 A Multi-Release Software Reliability Growth Model Incorporating Imperfect Debugging and Change-Point under a Simulated Testing Environment and Software Release Time

Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar

Abstract:

The testing process during software development is a crucial step, as it makes the software more efficient and dependable. To estimate software reliability through the mean value function, many software reliability growth models (SRGMs) were developed under the assumption that the operating and testing environments are the same. Practically, this is not true, because when the software works in its natural field environment, its reliability differs. This article discusses an SRGM comprising change-point and imperfect debugging in a simulated testing environment, later extended in a multi-release direction. Initially, software is released to the market with a few features; according to market demand, the software company upgrades the current version by adding new features as time passes. Therefore, we have proposed a generalized multi-release SRGM in which change-point and imperfect debugging are addressed in a simulated testing environment. The failure-increasing-rate concept has been adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets. The results demonstrate that the proposed model fits the datasets better. We also discuss the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
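For intuition, a change-point NHPP mean value function of Goel-Okumoto type can be sketched as follows. This is an illustrative form only; the proposed model additionally folds in imperfect debugging, environmental factors, and multiple releases:

```python
import math

def mean_value(t, a, b1, b2, tau):
    """Expected cumulative number of failures by time t for an NHPP of
    Goel-Okumoto type whose fault-detection rate changes from b1 to b2
    at the change-point tau. 'a' is the expected total fault content.
    Illustrative sketch, not the paper's full model."""
    if t <= tau:
        return a * (1.0 - math.exp(-b1 * t))
    # Continue the exponential decay with the post-change-point rate b2.
    return a * (1.0 - math.exp(-b1 * tau - b2 * (t - tau)))

# The curve is continuous at tau and saturates at the fault content a.
m_at_tau = mean_value(5.0, 100.0, 0.1, 0.3, 5.0)
m_late = mean_value(50.0, 100.0, 0.1, 0.3, 5.0)
```

The change-point lets the detection rate jump when the testing regime changes, while the curve still saturates at the total fault content, which is the behavior the multi-release extension stitches together across versions.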

Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors

Procedia PDF Downloads 63
293 Effect of Particle Aspect Ratio and Shape Factor on Air Flow inside Pulmonary Region

Authors: Pratibha, Jyoti Kori

Abstract:

Particles in industry, harvesting, coal mines, etc., are not necessarily spherical in shape; in general, it is difficult to find a perfectly spherical particle. The prediction of movement and deposition of non-spherical particles in distinct airway generations is much more difficult than for spherical particles. Moreover, there is extensive variability in deposition between ducts of a particular generation and inside every alveolar duct, since particle concentrations can be much larger than the mean acinar concentration. Consequently, a large number of particles fail to be exhaled during expiration. This study presents a mathematical model for the movement and deposition of such non-spherical particles using particle aspect ratio and shape factor. We analyse the pulsatile behavior under sinusoidal wall oscillation due to periodic breathing through a non-Darcian porous medium representing the pulmonary region. Since the fluid is viscous and Newtonian, the generalized Navier-Stokes equations in a two-dimensional coordinate system (r, z) are used with boundary-layer theory. Results are obtained for various values of the Reynolds number, Womersley number, Forchheimer number, particle aspect ratio and shape factor. Numerical computation is done using a finite difference scheme on a very fine mesh in MATLAB. It is found that the overall air velocity is significantly increased by changes in aerodynamic diameter, aspect ratio, alveoli size, Reynolds number and pulse rate, while velocity decreases with increasing Forchheimer number.
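The role of the shape factor can be illustrated with the standard aerosol relation between the volume-equivalent and aerodynamic diameters (slip correction neglected; a textbook sketch, not the paper's full model):

```python
import math

def aerodynamic_diameter(volume_equiv_diameter, particle_density, shape_factor):
    """Aerodynamic diameter d_ae = d_ve * sqrt(rho_p / (chi * rho_0)),
    with rho_0 = 1000 kg/m^3 (unit density) and chi the dynamic shape
    factor (chi = 1 for a sphere, > 1 for irregular particles).
    Slip correction is neglected in this sketch."""
    rho_0 = 1000.0
    return volume_equiv_diameter * math.sqrt(
        particle_density / (shape_factor * rho_0))

# A 2 micron particle of density 2000 kg/m^3, spherical vs. fiber-like:
d_sphere = aerodynamic_diameter(2e-6, 2000.0, 1.0)
d_fiber = aerodynamic_diameter(2e-6, 2000.0, 1.5)
```

For the same volume-equivalent size, a larger shape factor lowers the aerodynamic diameter, so elongated particles settle and deposit differently from spheres, which is why aspect ratio and shape factor enter the deposition model.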

Keywords: deposition, interstitial lung diseases, non-Darcian medium, numerical simulation, shape factor

Procedia PDF Downloads 165
292 Enhanced Tensor Tomographic Reconstruction: Integrating Absorption, Refraction and Temporal Effects

Authors: Lukas Vierus, Thomas Schuster

Abstract:

A general framework is examined for dynamic tensor field tomography within an inhomogeneous medium characterized by refraction and absorption, treated as an inverse source problem concerning the associated transport equation. Guided by Fermat’s principle, the Riemannian metric within the specified domain is determined by the medium's refractive index. While considerable literature exists on the inverse problem of reconstructing a tensor field from its longitudinal ray transform within a static Euclidean environment, limited inversion formulas and algorithms are available for general Riemannian metrics and time-varying tensor fields. It is established that tensor field tomography, akin to an inverse source problem for a transport equation, persists in dynamic scenarios. Framing dynamic tensor tomography as an inverse source problem embodies a comprehensive perspective within this domain. Ensuring well-defined forward mappings necessitates establishing existence and uniqueness for the underlying transport equations. However, the bilinear forms of the associated weak formulations fail to meet the coercivity condition. Consequently, recourse to viscosity solutions is taken, demonstrating their unique existence within suitable Sobolev spaces (in the static case) and Sobolev-Bochner spaces (in the dynamic case), under a specific assumption restricting variations in the refractive index. Notably, the adjoint problem can also be reformulated as a transport equation, with analogous results regarding uniqueness. Analytical solutions are expressed as integrals over geodesics, facilitating more efficient evaluation of forward and adjoint operators compared to solving partial differential equations. 
Numerical experiments are conducted using a Nesterov-accelerated Landweber method, encompassing various fields, absorption coefficients, and refractive indices, thereby illustrating the enhanced reconstruction achieved through this holistic modeling approach.
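The Nesterov-accelerated Landweber iteration can be sketched on a generic linear inverse problem, with a dense matrix standing in for the forward ray transform and its adjoint (which, in the paper's setting, are evaluated as integrals along geodesics):

```python
import numpy as np

def nesterov_landweber(A, b, omega, n_iter):
    """Nesterov-accelerated Landweber iteration for A x = b:
        x_{k+1} = z_k - omega * A^T (A z_k - b)
        z_{k+1} = x_{k+1} + k/(k+3) * (x_{k+1} - x_k)
    Sketch on a dense matrix; omega must satisfy 0 < omega < 2/||A||^2."""
    x = np.zeros(A.shape[1])
    z = x.copy()
    for k in range(n_iter):
        x_new = z - omega * A.T @ (A @ z - b)
        z = x_new + (k / (k + 3)) * (x_new - x)
        x = x_new
    return x

# Recover a known vector from noiseless data on a well-posed toy problem.
rng = np.random.default_rng(1)
A = rng.normal(size=(30, 10))
x_true = rng.normal(size=10)
b = A @ x_true
omega = 1.0 / np.linalg.norm(A, 2) ** 2
x_rec = nesterov_landweber(A, b, omega, 500)
err = np.linalg.norm(x_rec - x_true)
```

The momentum term k/(k+3) is what distinguishes the accelerated scheme from plain Landweber; both only require applications of the forward and adjoint operators, which is why avoiding PDE solves for those evaluations pays off.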

Keywords: attenuated refractive dynamic ray transform of tensor fields, geodesics, transport equation, viscosity solutions

Procedia PDF Downloads 33
291 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs-Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs-Beck test and the multiple Grubbs-Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson Type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs-Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the original Grubbs-Beck test with the LP3 distribution. Between these two methods, the differences in flood quantile estimates are up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs-Beck test provide quite similar results in most cases; however, a difference of up to 38% was noted for flood quantiles at an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
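For illustration, the original Grubbs-Beck low-outlier screen can be sketched with the standard Bulletin 17B-style polynomial approximation for the 10% critical value; this is a simplified stand-in for FLIKE's implementation, and the multiple version applies the same idea sweepingly over candidate sets of low flows:

```python
import math

def grubbs_beck_threshold(flows):
    """Original Grubbs-Beck low-outlier threshold (10% level) in the
    Bulletin 17B style: flows below exp(mean - K_N * std) of the
    log-flows are flagged as potentially influential low flows.
    K_N uses the standard polynomial approximation in log10(N)."""
    logs = [math.log(q) for q in flows]
    n = len(logs)
    mean = sum(logs) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in logs) / (n - 1))
    k_n = -0.9043 + 3.345 * math.sqrt(math.log10(n)) - 0.4046 * math.log10(n)
    return math.exp(mean - k_n * std)

# Hypothetical annual maxima (m^3/s) with one suspiciously small value:
flows = [120, 95, 210, 160, 140, 185, 99, 175, 130, 150,
         165, 145, 115, 190, 200, 2]
threshold = grubbs_beck_threshold(flows)
low_outliers = [q for q in flows if q < threshold]
```

Flagged flows are typically censored or treated separately when fitting the LP3 distribution, which is why the choice of outlier test propagates into the flood quantile estimates.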

Keywords: floods, FLIKE, probability distributions, flood frequency, outlier

Procedia PDF Downloads 436
290 A Topology-Based Dynamic Repair Strategy for Enhancing Urban Road Network Resilience under Flooding

Authors: Xuhui Lin, Qiuchen Lu, Yi An, Tao Yang

Abstract:

As global climate change intensifies, extreme weather events such as floods increasingly threaten urban infrastructure, making the vulnerability of urban road networks a pressing issue. Existing static repair strategies fail to adapt to the rapid changes in road network conditions during flood events, leading to inefficient resource allocation and suboptimal recovery. The main research gap lies in the lack of repair strategies that consider both the dynamic characteristics of networks and the progression of flood propagation. This paper proposes a topology-based dynamic repair strategy that adjusts repair priorities based on real-time changes in flood propagation and traffic demand. Specifically, a novel method is developed to assess and enhance the resilience of urban road networks during flood events. The method combines road network topological analysis, flood propagation modelling, and traffic flow simulation, introducing a local importance metric to dynamically evaluate the significance of road segments across different spatial and temporal scales. Using London's road network and rainfall data as a case study, the effectiveness of this dynamic strategy is compared to traditional and Transport for London (TFL) strategies. The most significant highlight of the research is that the dynamic strategy substantially reduced the number of stranded vehicles across different traffic demand periods, improving efficiency by up to 35.2%. The advantage of this method lies in its ability to adapt in real-time to changes in network conditions, enabling more precise resource allocation and more efficient repair processes. This dynamic strategy offers significant value to urban planners, traffic management departments, and emergency response teams, helping them better respond to extreme weather events like floods, enhance overall urban resilience, and reduce economic losses and social impacts.
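The idea of dynamically re-evaluating repair priorities can be sketched on a toy network, using recovered reachability as a crude stand-in for the paper's local importance metric (which additionally weighs flood propagation and time-varying traffic demand):

```python
from collections import deque

def bfs_dist(adj, src, removed):
    """Hop distances from src, ignoring edges listed in `removed`."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if frozenset((u, v)) in removed or v in dist:
                continue
            dist[v] = dist[u] + 1
            q.append(v)
    return dist

def connectivity(adj, removed):
    """Number of ordered node pairs that can still reach each other."""
    return sum(len(bfs_dist(adj, s, removed)) - 1 for s in adj)

def dynamic_repair_order(adj, damaged):
    """Greedy dynamic strategy: at each step repair the damaged edge whose
    restoration recovers the most reachable pairs, then re-evaluate.
    Toy stand-in for the paper's local-importance-based strategy."""
    removed, order = set(damaged), []
    while removed:
        best = max(removed, key=lambda e: connectivity(adj, removed - {e}))
        removed.discard(best)
        order.append(best)
    return order

# A path network a-b-c-d with a spur c-e; flooding cuts b-c and c-e.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d", "e"],
       "d": ["c"], "e": ["c"]}
damaged = [frozenset(("b", "c")), frozenset(("c", "e"))]
order = dynamic_repair_order(adj, damaged)
```

Re-ranking after every repair is what distinguishes the dynamic strategy from a static priority list: here the trunk link b-c is restored before the spur c-e because it reconnects more of the network.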

Keywords: urban resilience, road networks, flood response, dynamic repair strategy, topological analysis

Procedia PDF Downloads 15
289 Numerical Investigation of a New Two-Fluid Model for Semi-Dilute Polymer Solutions

Authors: Soroush Hooshyar, Mohamadali Masoudian, Natalie Germann

Abstract:

Many soft materials such as polymer solutions can develop localized bands with different shear rates, which are known as shear bands. Using the generalized bracket approach of nonequilibrium thermodynamics, we recently developed a new two-fluid model to study shear banding in semi-dilute polymer solutions. The two-fluid approach is an appropriate means of describing diffusion processes such as Fickian diffusion and stress-induced migration. In this approach, it is assumed that the local gradients in concentration and, if accounted for, also stress generate a nontrivial velocity difference between the components. Since the differential velocity is treated as a state variable in our model, the implementation of the boundary conditions arising from the derivative diffusive terms is straightforward. Our model is a good candidate for benchmark simulations because of its simplicity. We analyzed its behavior in cylindrical Couette flow, a rectilinear channel flow, and a 4:1 planar contraction flow. The latter problem was solved using the OpenFOAM finite volume package, and the impact of shear banding on the lip and salient vortices was investigated. For the other, smooth geometries, we employed a standard Chebyshev pseudospectral collocation method. The results showed that the steady-state solution is unique with respect to initial conditions, deformation history, and the value of the diffusivity constant. However, the smaller the value of the diffusivity constant, the longer it takes to reach the steady state.
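The Chebyshev pseudospectral collocation method mentioned above rests on a differentiation matrix over the Gauss-Lobatto points; a minimal sketch of the standard (Trefethen-style) construction, not the authors' code:

```python
import numpy as np

def cheb(n):
    """Chebyshev pseudospectral differentiation matrix D and nodes x on
    the n+1 Gauss-Lobatto points x_j = cos(j*pi/n) (standard
    construction): (D u) approximates u'(x) spectrally for smooth u."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    # Endpoint weights 2, interior weights 1, with alternating signs.
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
    D -= np.diag(D.sum(axis=1))   # diagonal via negative row sums
    return D, x

# Spectral accuracy: differentiating sin(x) should reproduce cos(x).
D, x = cheb(20)
max_err = np.max(np.abs(D @ np.sin(x) - np.cos(x)))
```

With only 21 nodes the derivative is accurate to near machine precision for smooth profiles, which is why the method suits the smooth Couette and channel geometries.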

Keywords: nonequilibrium thermodynamics, planar contraction, polymer solutions, shear banding, two-fluid approach

Procedia PDF Downloads 315
288 Foreign Investment, Technological Diffusion and Competitiveness of Exports: A Case for Textile Industry in Pakistan

Authors: Syed Toqueer Akhter, Muhammad Awais

Abstract:

Pakistan is a country gifted with naturally abundant resources, which provide the foundation for a prosperous and developed country. Pakistan is the fourth largest exporter of textiles in the world, yet with the passage of time the competitiveness of these exports has declined. With many international players in the textile world, such as China, Bangladesh, India, and Sri Lanka, Pakistan needs to put in considerable effort to compete with these countries. This research paper determines the impact of Foreign Direct Investment on technological diffusion and how significantly both may affect the export performance of the country. It also demonstrates that with an increase in Foreign Direct Investment, technological diffusion, strong property rights, and the use of different policy tools, the export competitiveness of the country could be improved. The research was carried out using time series data from 1995 to 2013, and the results were estimated using competing econometric models, such as robust regression and generalized least squares, so as to consolidate comprehensively the impact of foreign investment and technological diffusion on export competitiveness. A distributed lag model was also used to encompass the lagged effect of the policy tool variables used by the government. The model estimates indicate that FDI and technological diffusion do have a significant impact on the competitiveness of Pakistan's exports. It may also be inferred that the competitiveness of the textile sector requires an integrated policy framework, primarily including a reduction in interest rates, the provision of subsidies, and the manufacturing of value-added products.
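The contrast between ordinary and robust estimation can be sketched with a toy Huber-weighted IRLS regression (illustrative only; the study estimates a distributed-lag export model with these competing estimators, not this toy version):

```python
import numpy as np

def huber_irls(X, y, delta=1.345, n_iter=50):
    """Robust linear regression with Huber weights via iteratively
    reweighted least squares: observations with large scaled residuals
    are down-weighted, so gross outliers barely move the fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745   # robust scale (MAD)
        if s == 0:
            s = 1.0
        u = np.abs(r) / s
        w = np.where(u <= delta, 1.0, delta / np.maximum(u, 1e-12))
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# True line y = 1 + 2x contaminated with a few gross outliers.
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 60)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(0, 0.2, x.size)
y[::10] += 30.0                      # every 10th point is an outlier
ols = np.linalg.lstsq(X, y, rcond=None)[0]
robust = huber_irls(X, y)
```

The OLS fit is dragged away from the true line by the contaminated points, while the Huber fit stays close to it, which is the motivation for pairing robust regression with GLS in the study.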

Keywords: high technology export, robust regression, patents, technological diffusion, export competitiveness

Procedia PDF Downloads 488
287 Acute Severe Hyponatremia in Patient with Psychogenic Polydipsia, Learning Disability and Epilepsy

Authors: Anisa Suraya Ab Razak, Izza Hayat

Abstract:

Introduction: The diagnosis and management of severe hyponatremia in neuropsychiatric patients present a significant challenge to physicians. Several factors contribute, including diagnostic overshadowing and attributing abnormal behavior to intellectual disability or psychiatric conditions. Hyponatremia is the commonest electrolyte abnormality in the inpatient population, ranging from mild/asymptomatic and moderate levels to severe levels with life-threatening symptoms such as seizures, coma and death. There are several documented fatal case reports in the literature of severe hyponatremia secondary to psychogenic polydipsia, often diagnosed only at autopsy. This paper presents a case study of acute severe hyponatremia in a neuropsychiatric patient with early diagnosis and admission to intensive care. Case study: A 21-year-old Caucasian male with known epilepsy and learning disability was admitted from residential living with generalized tonic-clonic self-terminating seizures after refusing medications for several weeks. Evidence of superficial head injury was detected on physical examination. His laboratory data demonstrated mild hyponatremia (125 mmol/L). Computed tomography imaging of his brain demonstrated no acute bleed or space-occupying lesion. He exhibited abnormal behavior: restlessness, drinking water from bathroom taps, inability to engage, paranoia, and hypersexuality. No collateral history was available to establish his baseline behavior. He was loaded with intravenous sodium valproate and levetiracetam. Three hours later, he developed vomiting and a generalized tonic-clonic seizure lasting forty seconds. He remained drowsy for several hours and made only a minimal recovery of consciousness. A repeat set of blood tests demonstrated profound hyponatremia (117 mmol/L). Outcomes: He was referred to intensive care for peripheral intravenous infusion of 2.7% sodium chloride solution with two-hourly laboratory monitoring of sodium concentration. 
Laboratory monitoring identified dangerously rapid correction of serum sodium concentration, and the hypertonic saline was switched to a 5% dextrose solution to reduce the risk of acute large-volume fluid shifts from the cerebral intracellular compartment to the extracellular compartment. He underwent urethral catheterization and produced 8 liters of urine over 24 hours. Serum sodium concentration remained stable after 24 hours of corrective fluids. His GCS recovered to baseline after 48 hours, with improvement in behavior: he engaged with healthcare professionals, understood the importance of taking medications, and admitted to illicit drug use and drinking massive amounts of water. He was transferred from high-dependency care to ward level and was initiated on multiple trials of anti-epileptics before achieving seizure-free days two weeks after resolution of the acute hyponatremia. Conclusion: Psychogenic polydipsia is often found in young patients with intellectual disability or psychiatric disorders. Patients drink large volumes of water daily, ranging from ten to forty liters, resulting in acute severe hyponatremia with mortality rates as high as 20%. Poor outcomes are due to the challenges physicians face in making an early diagnosis and treating acute hyponatremia safely. A high index of suspicion for water intoxication is required in this population, including patients with known epilepsy. Monitoring urine output proved to be clinically effective in aiding diagnosis. Early referral and admission to intensive care should be considered for safe correction of sodium concentration while minimizing the risk of fatal complications, e.g., central pontine myelinolysis.
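The rate-of-correction concern described above can be sketched as a simple projected-rate check (an illustrative decision aid with an assumed 8 mmol/L per 24 h limit, one commonly cited figure; not clinical guidance):

```python
def correction_rate_ok(na_start, na_now, hours, limit_per_24h=8.0):
    """Project the observed rise in serum sodium over 24 hours and flag
    whether it stays within an assumed safety limit (8-10 mmol/L per
    24 h is commonly cited for avoiding osmotic demyelination).
    Illustrative decision aid only, not clinical guidance."""
    projected = (na_now - na_start) * 24.0 / hours
    return projected <= limit_per_24h, projected

# A hypothetical rise from 117 to 121 mmol/L within 6 h projects to
# 16 mmol/L per 24 h, i.e. dangerously fast:
ok, projected = correction_rate_ok(117, 121, 6)
```

A check of this kind captures why frequent (here, two-hourly) sodium monitoring matters: a rise that looks small in absolute terms can project to an unsafe 24-hour rate.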

Keywords: epilepsy, psychogenic polydipsia, seizure, severe hyponatremia

Procedia PDF Downloads 115