Search results for: applied stochastic model
21929 Establishment of the Regression Uncertainty of the Critical Heat Flux Power Correlation for an Advanced Fuel Bundle
Authors: L. Q. Yuan, J. Yang, A. Siddiqui
Abstract:
A new regression uncertainty analysis methodology was applied to determine the uncertainties of the critical heat flux (CHF) power correlation for an advanced 43-element bundle design, which was developed by Canadian Nuclear Laboratories (CNL) to achieve improved economics, resource utilization and energy sustainability. The new methodology is considered more appropriate than the traditional methodology in the assessment of the experimental uncertainty associated with regressions. The methodology was first assessed using both the Monte Carlo Method (MCM) and the Taylor Series Method (TSM) for a simple linear regression model, and then extended successfully to a non-linear CHF power regression model (CHF power as a function of inlet temperature, outlet pressure and mass flow rate). The regression uncertainty assessed by MCM agrees well with that by TSM. An equation to evaluate the CHF power regression uncertainty was developed and expressed as a function of independent variables that determine the CHF power.
Keywords: CHF experiment, CHF correlation, regression uncertainty, Monte Carlo Method, Taylor Series Method
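To illustrate how the two uncertainty propagation approaches named in the abstract compare on a simple linear regression, the sketch below propagates a known measurement uncertainty through an ordinary least-squares fit with a Monte Carlo Method and with a first-order Taylor Series (delta) method. The data, uncertainty value and prediction point are placeholders, not the CHF experiment's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic calibration data: y measured with a known standard uncertainty u_y (assumed values)
x = np.linspace(0.0, 10.0, 20)
true_a, true_b, u_y = 2.0, 0.8, 0.3
y = true_a + true_b * x + rng.normal(0.0, u_y, x.size)

def fit(xv, yv):
    # ordinary least squares for y = a + b*x
    A = np.column_stack([np.ones_like(xv), xv])
    coef, *_ = np.linalg.lstsq(A, yv, rcond=None)
    return coef

x_new = 7.5                         # point at which the regression uncertainty is wanted

# Monte Carlo Method: perturb the measured y within u_y, refit, collect predictions
preds = []
for _ in range(5000):
    a_mc, b_mc = fit(x, y + rng.normal(0.0, u_y, y.size))
    preds.append(a_mc + b_mc * x_new)
u_mcm = np.std(preds, ddof=1)

# Taylor Series Method: first-order propagation through the OLS solution
A = np.column_stack([np.ones_like(x), x])
cov_coef = u_y ** 2 * np.linalg.inv(A.T @ A)    # covariance of the fitted (a, b)
g = np.array([1.0, x_new])                      # sensitivity of the prediction to (a, b)
u_tsm = float(np.sqrt(g @ cov_coef @ g))

print(f"regression uncertainty at x = {x_new}: MCM {u_mcm:.4f}, TSM {u_tsm:.4f}")
```

The two estimates should agree closely, mirroring the MCM/TSM agreement reported for the CHF power correlation.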
Procedia PDF Downloads 416
21928 Influence of the Paint Coating Thickness in Digital Image Correlation Experiments
Authors: Jesús A. Pérez, Sam Coppieters, Dimitri Debruyne
Abstract:
In the past decade, the use of digital image correlation (DIC) techniques has increased significantly in the area of experimental mechanics, especially for materials behavior characterization. This non-contact tool enables full field displacement and strain measurements over a complete region of interest. The DIC algorithm requires a random contrast pattern on the surface of the specimen in order to perform properly. To create this pattern, the specimen is usually first coated using a white matt paint. Next, a black random speckle pattern is applied using any suitable method. If the applied paint coating is too thick, its top surface may not be able to exactly follow the deformation of the specimen, and consequently, the strain measurement might be underestimated. In the present article, a study of the influence of the paint thickness on the strain underestimation is performed for different strain levels. The results are then compared to typical paint coating thicknesses applied by experienced DIC users. A slight strain underestimation was observed for paint coatings thicker than about 30μm. On the other hand, this value was found to be uncommonly high compared to coating thicknesses applied by DIC users.
Keywords: digital image correlation, paint coating thickness, strain
Procedia PDF Downloads 515
21927 Impact of Modern Beehive on Income of Rural Households: Evidence from Bugina District of Northern Ethiopia
Authors: Wondmnew Derebe Yohannis
Abstract:
The enhanced utilization of modern beehives holds significant potential to improve the livelihoods of smallholder farmers who heavily rely on mixed crop-livestock farming for their income. Recognizing this, the distribution of improved beehives has been implemented across various regions in Ethiopia, including the Bugina district. However, the precise impact of these improved beehives on farmers' income has received limited attention. To address this gap, this study aims to assess the influence of adopting upgraded beehives on rural households' income and asset accumulation. To conduct this research, survey data was gathered from a sample of 350 households selected through random sampling. The collected data was then analyzed using an endogenous switching regression model (ESRM) approach. The findings reveal that the adoption of improved beehives has resulted in higher annual income and asset growth for beekeepers. On average, those who adopted the improved beehives earned approximately 6,077 Ethiopian Birr (ETB) more than their counterparts who did not adopt these beehives. However, it is worth noting that the impact of adoption would have been even greater for non-adopters, as evidenced by the negative transitional heterogeneity effect of 1,792 ETB. Furthermore, the analysis indicates that the decision to adopt or not adopt improved beehives was driven by individual self-selection. The adoption of improved beehives also led to an increase in fixed assets for households, establishing it as a viable strategy for poverty reduction. Overall, this study underscores the positive effect of adopting improved beehives on rural households' income and asset holdings, showcasing its potential to uplift smallholder farmers and serve as an alternative mechanism for reducing poverty.
Keywords: impact, adoption, endogenous switching regression, income, improved beehives
Procedia PDF Downloads 54
21926 Mathematical Model for Interaction Energy of Toroidal Molecules and Other Nanostructures
Authors: Pakhapoom Sarapat, James M. Hill, Duangkamon Baowan
Abstract:
Carbon nanotori provide several properties such as high tensile strength and heat resistance. They promise to be ideal structures for encapsulation, and their encapsulation ability can be determined by the interaction energy between the carbon nanotori and the encapsulated nanostructures. Such interaction energy is evaluated using the Lennard-Jones potential and the continuum approximation. Here, four problems relating to toroidal molecules are examined in order to find the most stable configuration. Firstly, the interaction energy between a carbon nanotorus and an atom is examined. The second problem relates to the energy of a fullerene encapsulated inside a carbon nanotorus. Next, the interaction energy between two symmetrically situated and parallel nanotori is considered. Finally, classical mechanics is applied to model the interaction energy between the toroidal structure of cyclodextrin and spherical DNA molecules. These mathematical models might be exploited to study a number of promising devices for future developments in bio- and nanotechnology.
Keywords: carbon nanotori, continuum approximation, interaction energy, Lennard-Jones potential, nanotechnology
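As a rough illustration of the continuum approximation used with the Lennard-Jones potential, the sketch below integrates the 6-12 potential between a single atom and a circular ring of atoms, the simplest building block of a torus-atom calculation. The constants, ring radius and line density are assumed illustrative values, not those of the paper.

```python
import numpy as np

# 6-12 Lennard-Jones form E = -A/rho^6 + B/rho^12; constants assumed
# (typical carbon-carbon values quoted in the literature)
A_att, B_rep = 17.4, 2.9e4        # eV*Angstrom^6, eV*Angstrom^12
eta = 0.7                         # assumed line density of atoms along the ring, atoms/Angstrom
R = 20.0                          # ring radius, Angstrom

def ring_atom_energy(offset, n=2000):
    """Continuum-approximation LJ energy between one atom placed in the plane
    of the ring at distance `offset` from its centre and the whole ring."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    px, py = R * np.cos(theta), R * np.sin(theta)
    rho2 = (px - offset) ** 2 + py ** 2
    integrand = -A_att / rho2 ** 3 + B_rep / rho2 ** 6
    return eta * np.sum(integrand) * R * (2.0 * np.pi / n)   # line integral over the ring

# scan the atom position to find the most stable (minimum-energy) configuration
offsets = np.linspace(0.0, R - 2.5, 400)
energies = np.array([ring_atom_energy(d) for d in offsets])
best = int(np.argmin(energies))
print(f"minimum energy {energies[best]:.4f} eV at {offsets[best]:.2f} Angstrom from the ring centre")
```

Scanning the atom position in this way is the same "find the most stable configuration" step the abstract describes, only reduced from a full torus surface to a single ring.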
Procedia PDF Downloads 148
21925 Starlink Satellite Collision Probability Simulation Based on Simplified Geometry Model
Authors: Toby Li, Julian Zhu
Abstract:
In this paper, a model based on a simplified geometry is introduced to give a very conservative collision probability prediction for the Starlink satellite in its most densely clustered region. Under the model in this paper, the probability of collision for a Starlink satellite in its most densely clustered region is found to be 8.484 ∗ 10^−4. It is found that the predicted collision probability increases nonlinearly with the safety distance set. This simple model provides evidence that the continuous development of maneuver avoidance systems is necessary for the future orbital safety of satellites in the increasingly harsh Low Earth Orbit environment.
Keywords: Starlink, collision probability, debris, geometry model
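The paper's simplified geometry model is not reproduced here, but a common back-of-the-envelope way to see why the collision probability grows nonlinearly with the safety distance is a kinetic-theory estimate, in which encounters form a Poisson process with rate equal to object density times collision cross-section times relative velocity. All numbers below are assumed for illustration only.

```python
import numpy as np

# assumed illustrative numbers, not the paper's data
n_density = 2e-9          # objects per km^3 in the densest shell (assumed)
v_rel = 10.0              # mean relative velocity, km/s (assumed)
T = 365 * 86400           # exposure time: one year, in seconds

def collision_probability(safety_distance_km):
    """Kinetic-theory style estimate: encounters are Poisson with rate n * sigma * v."""
    sigma = np.pi * safety_distance_km ** 2       # collision cross-section, km^2
    rate = n_density * sigma * v_rel              # expected encounters per second
    return 1.0 - np.exp(-rate * T)

for d in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"safety distance {d:4.1f} km -> P(collision in 1 yr) = {collision_probability(d):.3e}")
```

Because the cross-section scales with the square of the safety distance and enters through an exponential, the estimated probability rises nonlinearly as the safety distance is widened, consistent with the trend reported in the abstract.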
Procedia PDF Downloads 83
21924 Coupled Hydro-Geomechanical Modeling of Oil Reservoir Considering Non-Newtonian Fluid through a Fracture
Authors: Juan Huang, Hugo Ninanya
Abstract:
Oil has been used as a source of energy and as a supply for making materials, such as asphalt or rubber, for many years. This is the reason why new technologies have been implemented over time. However, research still needs to continue advancing due to the new challenges engineers face every day, such as unconventional reservoirs. Various numerical methodologies have been applied in petroleum engineering as tools to optimize the production of reservoirs before drilling a wellbore, although not all of them have the same efficiency when it comes to studying fracture propagation. Analytical methods, like those based on linear elastic fracture mechanics, fail to give a reasonable prediction when simulating fracture propagation in ductile materials, whereas numerical methods based on the cohesive zone method (CZM) make it possible to represent the elastoplastic behavior of a reservoir based on a constitutive model; therefore, predictions in terms of displacements and pressure will be more reliable. In this work, a hydro-geomechanical coupled model of horizontal wells in fractured rock was developed using ABAQUS; both the extended finite element method and cohesive elements were used to represent predefined fractures in a model (2-D). A power law was considered to represent the rheological behavior of the fluid (shear-thinning, power index < 1) through the fractures, together with a leak-off rate permeating into the matrix. Results are shown in terms of the aperture and length of the fracture, the pressure within the fracture, and the fluid loss. A higher infiltration rate into the matrix was observed as the power index decreases. Finally, a sensitivity analysis is performed to identify the most influential factor in fluid loss.
Keywords: fracture, hydro-geomechanical model, non-Newtonian fluid, numerical analysis, sensitivity analysis
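The shear-thinning fracturing fluid mentioned in the abstract follows a power-law (Ostwald-de Waele) rheology; a minimal sketch of the apparent viscosity it implies is shown below, with placeholder consistency and power-index values.

```python
import numpy as np

def apparent_viscosity(shear_rate, K=1.0, n=0.6):
    """Power-law (Ostwald-de Waele) fluid: tau = K * gamma_dot**n, so the
    apparent viscosity is K * gamma_dot**(n - 1); n < 1 gives the
    shear-thinning behaviour used for the fracturing fluid."""
    return K * shear_rate ** (n - 1.0)

for g in np.logspace(-1, 3, 5):          # shear rates in 1/s
    print(f"gamma_dot = {g:8.2f} 1/s -> mu_app = {apparent_viscosity(g):.4f} Pa*s")
```

A lower power index makes the fluid thinner at the high shear rates found inside the fracture, which is one way to read the reported increase in leak-off into the matrix as the index decreases.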
Procedia PDF Downloads 206
21923 Modeling User Context Using CEAR Diagram
Authors: Ravindra Dastikop, G. S. Thyagaraju, U. P. Kulkarni
Abstract:
Even though the number of context-aware applications and their users is increasing day by day, there is still no generic programming paradigm for context-aware applications. This situation could be remedied by designing and developing an appropriate context modeling and programming paradigm for context-aware applications. In this paper, we propose a static context model and metrics for validating the expressiveness and understandability of the model. The proposed context modeling is a way of describing the situation of a user using context entities, attributes and relationships. The model, which is an extended and hybrid version of the ER model, the ontology model and the graphical model, is specifically meant for expressing and understanding the user situation in a context-aware environment. The model is useful for understanding context-aware problems, preparing documentation and designing programs and databases. The model makes use of a context entity attribute relationship (CEAR) diagram for the representation of associations between the context entities and attributes. We have identified a new set of graphical notations for improving the expressiveness and understandability of context from the end-user perspective.
Keywords: user context, context entity, context entity attributes, situation, sensors, devices, relationships, actors, expressiveness, understandability
Procedia PDF Downloads 344
21922 Spatially Downscaling Land Surface Temperature with a Non-Linear Model
Authors: Kai Liu
Abstract:
Remote sensing-derived land surface temperature (LST) can provide an indication of the temporal and spatial patterns of surface evapotranspiration (ET). However, the spatial resolution of commonly used satellite products is ~1 km, which remains too coarse for ET estimation. This paper proposes a model that can disaggregate coarse-resolution MODIS LST at the 1 km scale to a finer spatial resolution of 250 m. Our approach attempts to weaken the impacts of soil moisture and growing status on LST variations. The proposed model spatially disaggregates the coarse thermal data by using a non-linear model involving the Bowen ratio, the normalized difference vegetation index (NDVI) and the photochemical reflectance index (PRI). This LST disaggregation model was tested on two heterogeneous landscapes, in central Iowa, USA and the Heihe River, China, during the growing seasons. Statistical results demonstrated that our model performed better than two classical methods (DisTrad and TsHARP). Furthermore, using a surface energy balance model, it was observed that the ET estimates using the disaggregated LST from our model were more accurate than those using the disaggregated LST from DisTrad and TsHARP.
Keywords: Bowen ratio, downscaling, evapotranspiration, land surface temperature
Procedia PDF Downloads 329
21921 Fuzzy Set Qualitative Comparative Analysis in Business Models' Study
Authors: K. Debkowska
Abstract:
The aim of this article is to present the possibilities of using Fuzzy Set Qualitative Comparative Analysis (fsQCA) in research concerning the business models of enterprises. FsQCA is a bridge between quantitative and qualitative research. Its potential can be used in the analysis and evaluation of business models. The article presents the results of a study conducted on enterprises belonging to different sectors: transport and logistics, industry, building construction, and trade. The enterprises were researched taking into account the components of their business models and the financial condition of the companies. Business models are areas of a complex and heterogeneous nature. The use of fsQCA made it possible to answer the following question: which components of a business model, and in which configuration, lead to a better financial condition of enterprises. The analysis was performed separately for particular sectors. This made it possible to compare the combinations of business models' components which actively influence the financial condition of enterprises in the analyzed sectors. The following components of business models were analyzed for the purposes of the study: Key Partners, Key Activities, Key Resources, Value Proposition, Channels, Cost Structure, Revenue Streams, Customer Segments and Customer Relationships. These components constituted the variables shaping the financial results of the enterprises. The results of the study lead us to believe that fsQCA can help in analyzing and evaluating a business model, which is important in terms of making a business decision about the business model used or its change. In addition, the results obtained by fsQCA can be applied by all stakeholders connected with the company.
Keywords: business models, components of business models, data analysis, fsQCA
Procedia PDF Downloads 171
21920 Thermal-Mechanical Analysis of a Bridge Deck to Determine Residual Weld Stresses
Authors: Evy Van Puymbroeck, Wim Nagy, Ken Schotte, Heng Fang, Hans De Backer
Abstract:
The knowledge of residual stresses for welded bridge components is essential to determine the effect of the residual stresses on the fatigue life behavior. The residual stresses of an orthotropic bridge deck are determined by simulating the welding process with finite element modelling. The stiffener is placed on top of the deck plate before welding. A chained thermal-mechanical analysis is set up to determine the distribution of residual stresses for the bridge deck. First, a thermal analysis is used to determine the temperatures of the orthotropic deck for different time steps during the welding process. Twin wire submerged arc welding is used to construct the orthotropic plate. A double ellipsoidal volume heat source model is used to describe the heat flow through a material for a moving heat source. The heat input is used to determine the heat flux, which is applied as a thermal load during the thermal analysis. The heat flux for each element is calculated for different time steps to simulate the passage of the welding torch with the considered welding speed. This results in a time-dependent heat flux that is applied as a thermal loading. Thermal material behavior is specified by assigning the properties of the material as a function of the high temperatures during welding. Isotropic hardening behavior is included in the model. The thermal analysis simulates the heat introduced in the two plates of the orthotropic deck and calculates the temperatures during the welding process. After the calculation of the temperatures introduced during the welding process in the thermal analysis, a subsequent mechanical analysis is performed. For the boundary conditions of the mechanical analysis, the actual welding conditions are considered. Before welding, the stiffener is connected to the deck plate by using tack welds. These tack welds are implemented in the model. The deck plate is allowed to expand freely in an upwards direction while it rests on a firm and flat surface. This behavior is modelled by using grounded springs. Furthermore, symmetry points and lines are used to prevent the model from moving freely in other directions. In the mechanical analysis, a mechanical material model is used. The temperatures calculated during the thermal analysis are introduced during the mechanical analysis as a time-dependent load. The connection of the elements of the two plates in the fusion zone is realized with a glued connection, which is activated when the welding temperature is reached. The mechanical analysis results in a distribution of the residual stresses. The distribution of the residual stresses of the orthotropic bridge deck is compared with results from literature. Literature proposes uniform tensile yield stresses in the weld, while the finite element modelling showed tensile yield stresses at a short distance from the weld root or the weld toe. The chained thermal-mechanical analysis results in a distribution of residual weld stresses for an orthotropic bridge deck. In future research, the effect of these residual stresses on the fatigue life behavior of welded bridge components can be studied.
Keywords: finite element modelling, residual stresses, thermal-mechanical analysis, welding simulation
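The double ellipsoidal volume heat source referred to in the abstract is commonly written in the Goldak form; a small sketch of evaluating that flux for a torch moving along the weld line is given below. The welding power, speed and ellipsoid semi-axes are assumed illustrative values, not the parameters of this study.

```python
import numpy as np

def goldak_flux(x, y, z, t, Q, v, a_f, a_r, b, c, f_f=0.6, f_r=1.4):
    """Double ellipsoidal (Goldak) volumetric heat flux [W/m^3] for a torch
    moving along x with speed v; f_f + f_r = 2 splits the power between the
    front and rear ellipsoid quadrants."""
    xi = x - v * t                          # coordinate relative to the moving torch
    a, f = (a_f, f_f) if xi >= 0 else (a_r, f_r)
    coeff = 6.0 * np.sqrt(3.0) * f * Q / (a * b * c * np.pi * np.sqrt(np.pi))
    return coeff * np.exp(-3.0 * xi**2 / a**2 - 3.0 * y**2 / b**2 - 3.0 * z**2 / c**2)

# illustrative parameters (assumed, not the study's welding data)
Q = 0.85 * 30.0 * 500.0                     # arc efficiency * voltage * current -> W
flux = goldak_flux(x=0.002, y=0.0, z=0.001, t=0.5, Q=Q, v=0.01,
                   a_f=0.004, a_r=0.008, b=0.004, c=0.004)
print(f"heat flux at the sampled point: {flux:.3e} W/m^3")
```

Evaluating this expression element by element for successive torch positions gives the time-dependent heat flux that the abstract describes applying as the thermal loading.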
Procedia PDF Downloads 171
21919 On Supporting a Meta-Design Approach in Socio-Technical Ontology Engineering
Authors: Mesnan Silalahi, Dana Indra Sensuse, Indra Budi
Abstract:
Much research has revealed the complexity of the ontology building process and the need for a new approach that addresses the socio-technical aspects of collaborating to reach a consensus. The meta-design approach is considered applicable as a method in the methodological model of socio-technical ontology engineering. Principles of the meta-design framework are applied in the construction phases of the ontology. A portal is developed to support the requirements of the meta-design principles. To validate the methodological model, semantic web applications were developed, integrated into the portal, and used as a way to show the usefulness of the ontology. The knowledge-based system will be filled with data on Indonesian medicinal plants. By showing the usefulness of the developed ontology in a semantic web application, we motivate all stakeholders to participate in the development of a knowledge-based system of medicinal plants in Indonesia.
Keywords: socio-technical, metadesign, ontology engineering methodology, semantic web application
Procedia PDF Downloads 438
21918 Selection of Pichia kudriavzevii Strain for the Production of Single-Cell Protein from Cassava Processing Waste
Authors: Phakamas Rachamontree, Theerawut Phusantisampan, Natthakorn Woravutthikul, Peerapong Pornwongthong, Malinee Sriariyanun
Abstract:
A total of 115 yeast strains isolated from local cassava processing wastes were measured for crude protein content. Among these strains, the strain MSY-2 possessed the highest protein concentration (>3.5 mg protein/mL). By using molecular identification tools, it was identified to be a strain of Pichia kudriavzevii based on the similarity of the D1/D2 domain of the 26S rDNA region. In this study, to optimize the protein production by the MSY-2 strain, Response Surface Methodology (RSM) was applied. The tested parameters were the carbon content, nitrogen content, and incubation time. Here, the regression coefficient (R2) of the model was 0.7194, which is high enough to support the significance of the model. Under the optimal condition, up to 3.77 g of protein per L of culture was produced, and the MSY-2 strain contained 66.8 g of protein per 100 g of cell dry weight. These results revealed the plausibility of applying the novel strain of yeast in single-cell protein production.
Keywords: single cell protein, response surface methodology, yeast, cassava processing waste
Procedia PDF Downloads 403
21917 Sensitivity Analysis of Movable Bed Roughness Formula in Sandy Rivers
Authors: Mehdi Fuladipanah
Abstract:
Sensitivity analysis is a technique applied to determine the influential input factors on a model output. The variance-based sensitivity analysis method has wider application compared to other methods because it covers both linear and non-linear models. In this paper, van Rijn's movable bed roughness formula was selected for evaluation because of its reasonable results in sandy rivers. This equation contains four variables: flow depth, sediment size, bed form height and bed form length. The importance of these variables was determined using the first order of the Fourier Amplitude Sensitivity Test (FAST). A sensitivity index was applied to evaluate the importance of the factors. The first-order FAST-based sensitivity indices explain 90% of the total variance, which indicates that the acceptance criterion for applying FAST is met. A higher value of this index indicates a more important variable. Results show that bed form height, bed form length, sediment size and flow depth are the most influential factors, with sensitivity indices of 32%, 24%, 19% and 15%, respectively.
Keywords: sensitivity analysis, variance, movable bed roughness formula, sandy rivers
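As an illustration of variance-based first-order sensitivity indices, the sketch below uses a Monte Carlo (Saltelli-type) estimator rather than the FAST algorithm itself, applied to a stand-in roughness-to-Chezy relation with the same four inputs; the functional form, coefficients and input ranges are assumptions for illustration, not values taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def chezy_roughness(h, d90, delta, lam):
    """Stand-in movable-bed roughness relation of the van Rijn type
    (illustrative form only): grain roughness plus a bed-form roughness term,
    converted to a Chezy coefficient as the model output."""
    ks = 3.0 * d90 + 1.1 * delta * (1.0 - np.exp(-25.0 * delta / lam))
    return 18.0 * np.log10(12.0 * h / ks)

# assumed uniform ranges: flow depth [m], sediment size [m], bed-form height [m], bed-form length [m]
bounds = np.array([[1.0, 5.0], [2e-4, 2e-3], [0.05, 0.5], [1.0, 10.0]])

def sample(n):
    u = rng.random((n, 4))
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

def evaluate(X):
    return chezy_roughness(X[:, 0], X[:, 1], X[:, 2], X[:, 3])

N = 50_000
A, B = sample(N), sample(N)
yA, yB = evaluate(A), evaluate(B)
var_y = np.var(np.concatenate([yA, yB]))

for i, name in enumerate(["flow depth", "sediment size", "bed-form height", "bed-form length"]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # vary only factor i between the two sample sets
    S1 = np.mean(yB * (evaluate(ABi) - yA)) / var_y   # Saltelli (2010) first-order estimator
    print(f"first-order sensitivity index of {name}: {S1:.2f}")
```

The indices sum to roughly one for a nearly additive model, which is the same acceptance check the abstract reports for the first-order FAST indices.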
Procedia PDF Downloads 261
21916 A Strategic Partner Evaluation Model for the Project Based Enterprises
Authors: Woosik Jang, Seung H. Han
Abstract:
Optimal partner selection is one of the most important factors in pursuing a project's success. However, in practice, there are gaps in the perception of success depending on the role of the enterprises in the projects. This frequently makes the relation between the partner evaluation results and the project's final performance insufficient. To meet these challenges, this study proposes a strategic partner evaluation model considering the perception gaps between enterprises. A total of three surveys were performed: factor selection, perception gap analysis, and case application. Then, a total of eight factors were extracted using the independent-sample t-test and the Borich model to set up the evaluation model. Finally, through the case applications, only 16 of the 22 enterprises graded "Good" by the existing model were re-evaluated as "Good". Conversely, 12 of the 19 enterprises graded "Bad" by the existing model were re-evaluated as "Good". Consequently, the perception-gap-based evaluation model is expected to improve decision-making quality and also enhance the probability of project success.
Keywords: partner evaluation model, project based enterprise, decision making, perception gap, project performance
Procedia PDF Downloads 157
21915 Application of Artificial Neural Network for Prediction of Retention Times of Some Secoestrane Derivatives
Authors: Nataša Kalajdžija, Strahinja Kovačević, Davor Lončar, Sanja Podunavac Kuzmanović, Lidija Jevrić
Abstract:
In order to investigate the relationship between retention and structure, a quantitative structure-retention relationship (QSRR) study was applied for the prediction of the retention times of a set of 23 secoestrane derivatives in reversed-phase thin-layer chromatography. After the calculation of molecular descriptors, a suitable set of molecular descriptors was selected by using stepwise multiple linear regression. The Artificial Neural Network (ANN) method was employed to model the nonlinear structure-retention relationships. The ANN technique resulted in a 5-6-1 ANN model with a correlation coefficient of 0.98. We found that the following descriptors have a significant effect on the retention times: critical pressure, total energy, protease inhibition, distribution coefficient (LogD) and the lipophilicity parameter (miLogP). The prediction results are in very good agreement with the experimental ones. This approach provides a new and effective method for predicting the chromatographic retention index for the secoestrane derivatives investigated.
Keywords: lipophilicity, QSRR, RP TLC retention, secoestranes
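A 5-6-1 feed-forward network of the kind described (five descriptor inputs, one hidden layer of six neurons, one retention-time output) can be sketched with scikit-learn as below; the descriptor matrix and retention values are random placeholders standing in for the calculated descriptors and measured RP-TLC data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# placeholders: 23 derivatives x 5 selected descriptors, plus synthetic retention values
X = rng.normal(size=(23, 5))
y = X @ np.array([0.4, -0.2, 0.3, 0.1, -0.5]) + rng.normal(0, 0.05, 23)

# 5-6-1 architecture: 5 descriptor inputs, one hidden layer of 6 neurons, 1 output
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(6,), activation="tanh",
                 solver="lbfgs", max_iter=5000, random_state=0),
)
model.fit(X, y)
r = np.corrcoef(y, model.predict(X))[0, 1]
print(f"training correlation coefficient: {r:.3f}")
```

With only 23 compounds, a small hidden layer and cross-validation (not shown) are what keep such a QSRR model from simply memorizing the training set.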
Procedia PDF Downloads 456
21914 A Super-Efficiency Model for Evaluating Efficiency in the Presence of Time Lag Effect
Authors: Yanshuang Zhang, Byungho Jeong
Abstract:
In many cases, there is a time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered in evaluating the performance of organizations. Recently, a couple of DEA models were developed for considering the time lag effect in the efficiency evaluation of research activities. The Multi-period Input (MpI) and Multi-period Output (MpO) models are integrated models that calculate simple efficiency considering the time lag effect. However, these models cannot discriminate between efficient DMUs because of the nature of the basic DEA model, in which efficiency scores are limited to '1'. That is, efficient DMUs cannot be discriminated because their efficiency scores are the same. Thus, this paper suggests a super-efficiency model for efficiency evaluation under consideration of the time lag effect, based on the MpO model. A case example using a long-term research project is given to compare the suggested model with the MpO model.
Keywords: DEA, super-efficiency, time lag, multi-periods input
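To make the super-efficiency idea concrete, the sketch below solves an input-oriented CCR super-efficiency linear program with SciPy, excluding the evaluated DMU from its own reference set so that efficient units can score above 1 and be ranked. It is a plain single-period sketch with toy data; the multi-period output (time lag) treatment of the MpO-based model proposed in the paper is not included.

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, k):
    """Input-oriented CCR super-efficiency score of DMU k.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). DMU k is removed from its own
    reference set, so efficient units can score above 1 and be ranked."""
    n, m = X.shape
    s = Y.shape[1]
    others = [j for j in range(n) if j != k]
    # decision variables: [theta, lambda_j for j != k]
    c = np.zeros(1 + len(others)); c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):                       # sum_j lambda_j * x_ij <= theta * x_ik
        A_ub.append([-X[k, i]] + [X[j, i] for j in others]); b_ub.append(0.0)
    for r in range(s):                       # sum_j lambda_j * y_rj >= y_rk
        A_ub.append([0.0] + [-Y[j, r] for j in others]); b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + len(others)), method="highs")
    return res.x[0]

# toy data: 5 DMUs, 2 inputs, 1 output (illustrative only)
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
for k in range(len(X)):
    print(f"DMU {k}: super-efficiency = {super_efficiency(X, Y, k):.3f}")
```

Units that would all tie at 1.0 under the basic model receive distinct scores here, which is exactly the discrimination problem the abstract sets out to solve.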
Procedia PDF Downloads 474
21913 The Social Model of Disability and Disability Rights: Defending a Conceptual Alignment between the Social Model's Concept of Disability and the Nature of Rights and Duties
Authors: Adi Goldiner
Abstract:
Historically, the social model of disability has played a pivotal role in bringing rights discourse into the disability debate. Against this backdrop, the paper explores the conceptual alignment between the social model's account of disability and the nature of rights. Specifically, the paper examines the possibility that the social model conceptualizes disability in a way that aligns with the nature of rights and thus motivates the invocation of disability rights. Methodologically, the paper juxtaposes the literature on the social model of disability, primarily the work of the Union of the Physically Impaired Against Segregation in the UK and related scholarship, with theories of moral rights. By focusing on the interplay between the social model of disability and rights, the paper provides a conceptual explanation for the rise of disability rights. In addition, the paper sheds light on the nature of rights, their function and limitations, in the context of disability rights. The paper concludes that the social model's conceptualization of disability is hospitable to rights, because it opens up the possibility that there are duties that correlate with disability rights. Under the social model, disability is a condition that can be eliminated by the removal of social, structural, and attitudinal barriers. Accordingly, the social model dispels the idea that the actions of others towards disabled people will have a marginal impact on their interests in not being disabled. Equally important, the social model refutes the idea that in order to significantly serve people's interest in not being disabled, it is necessary to cure bodily impairments, which is not always possible. As rights correlate with duties that are possible to comply with, as well as those that significantly serve the interests of the right holders, the social model's conceptualization of disability invites the reframing of problems related to disability in terms of infringements of disability rights. A possible objection to the paper's argument is raised, according to which the social model is at odds with the invocation of disability rights because disability rights are ineffective in realizing the social model's goal of improving the lives of disabled by eliminating disability. The paper responds to this objection by drawing a distinction between 'moral rights,' which, conceptually, are not subject to criticism of ineffectiveness, and 'legal rights' which are.
Keywords: disability rights, duties, moral rights, social model
Procedia PDF Downloads 404
21912 Rheometer Enabled Study of Tissue/biomaterial Frequency-Dependent Properties
Authors: Polina Prokopovich
Abstract:
Despite the well-established dependence of cartilage mechanical properties on the frequency of the applied load, most research in the field is carried out in either load-free or constant-load conditions because of the complexity of the equipment required for the determination of time-dependent properties. These simpler analyses provide a limited representation of cartilage properties, thus greatly reducing the impact of the information gathered and hindering the understanding of the mechanisms involved in this tissue's replacement, development and pathology. More complex techniques could represent better investigative methods, but their uptake in cartilage research is limited by the highly specialised training required and the cost of the equipment. There is, therefore, a clear need for alternative experimental approaches to cartilage testing to be deployed in research and clinical settings using more user-friendly and financially accessible devices. Frequency-dependent material properties can be determined through rheometry, which is easy to use and requires a relatively inexpensive device; we present how a commercial rheometer can be adapted to determine the viscoelastic properties of articular cartilage. Frequency-sweep tests were run at various applied normal loads on immature, mature and trypsinised (as a model of osteoarthritis) cartilage samples to determine the dynamic shear moduli (G*, G′, G″) of the tissues. Moduli increased with increasing frequency and applied load; mature cartilage generally had the highest moduli and GAG-depleted samples the lowest. Hydraulic permeability (KH) was estimated from the rheological data and decreased with applied load; GAG-depleted cartilage exhibited higher hydraulic permeability than either immature or mature tissues. The rheometer-based methodology developed was validated by the close comparison of the rheometer-obtained cartilage characteristics (G*, G′, G″, KH) with results obtained with more complex testing techniques available in the literature. Rheometry is relatively simple, does not require highly capital-intensive machinery, and staff training is more accessible; thus, the use of a rheometer would represent a cost-effective approach for the determination of the frequency-dependent properties of cartilage, giving more comprehensive and impactful results for both healthcare professionals and R&D.
Keywords: tissue, rheometer, biomaterial, cartilage
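The dynamic shear moduli reported from the frequency sweeps follow directly from the stress and strain amplitudes and the phase angle at each frequency; a minimal sketch of that conversion is shown below with assumed readings, not cartilage data.

```python
import numpy as np

def dynamic_moduli(strain_amp, stress_amp, phase_deg):
    """Storage and loss moduli from one frequency-sweep point:
    |G*| = stress amplitude / strain amplitude, split by the phase angle delta."""
    g_star = stress_amp / strain_amp
    delta = np.radians(phase_deg)
    return g_star * np.cos(delta), g_star * np.sin(delta)   # G', G''

# illustrative oscillatory shear readings (frequency [Hz], strain amp, stress amp [Pa], phase [deg])
for f, gamma0, tau0, d in [(0.1, 0.01, 3.0e3, 15.0), (1.0, 0.01, 5.5e3, 12.0), (10.0, 0.01, 8.0e3, 10.0)]:
    Gp, Gpp = dynamic_moduli(gamma0, tau0, d)
    print(f"{f:5.1f} Hz: G' = {Gp / 1e3:6.1f} kPa, G'' = {Gpp / 1e3:6.1f} kPa")
```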
Procedia PDF Downloads 81
21911 A New Fuzzy Fractional Order Model of Transmission of Covid-19 With Quarantine Class
Authors: Asma Hanif, A. I. K. Butt, Shabir Ahmad, Rahim Ud Din, Mustafa Inc
Abstract:
This paper is devoted to a study of a fuzzy fractional mathematical model reviewing the transmission dynamics of the infectious disease Covid-19. The proposed dynamical model consists of susceptible, exposed, symptomatic, asymptomatic, quarantine, hospitalized and recovered compartments. In this study, we deal with the fuzzy fractional model defined in Caputo's sense. We show the positivity of the state variables, i.e., that all the state variables representing the different compartments of the model are positive. Using the Gronwall inequality, we show that the solution of the model is bounded. Using the notion of the next-generation matrix, we find the basic reproduction number of the model. We demonstrate the local and global stability of the equilibrium point by using the concept of Castillo-Chavez and Lyapunov theory with the LaSalle invariance principle, respectively. We present results that reveal the existence and uniqueness of the solution of the considered model through the fixed-point theorems of Schauder and Banach. Using the fuzzy hybrid Laplace method, we acquire the approximate solution of the proposed model. The results are graphically presented via MATLAB-17.
Keywords: Caputo fractional derivative, existence and uniqueness, Gronwall inequality, Lyapunov theory
Procedia PDF Downloads 105
21910 Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan
Authors: April Hueimin Lu, Liangj-Ju Yao, Jun-Tin Lin, Susan Siru Liu
Abstract:
This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing the basic data of cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping work. In this paper, taking the old Paiwan as an example, an aerial survey of the settlement of about 5 hectares and 3D laser scanning of a slate house were carried out. The obtained orthophoto image was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D scan of the building provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation. The information from the 3D scanning of the slate house can also be used for further digital archives and data queries through network resources. The results of this study show that, in large-scale settlement surveys, aerial surveying technology can be used to construct the topography of settlements together with building and landscape spatial information, while 3D scanning can be applied to small-scale records of individual buildings. This application of 3D technology, greatly increasing the efficiency and accuracy of the survey and mapping work of aboriginal settlements, is very helpful for further preservation planning and the rejuvenation of aboriginal cultural heritage.
Keywords: aerial survey, 3D scanning, aboriginal settlement, settlement architecture cluster, ecological landscape area, old Paiwan settlements, slate house, photogrammetry, SfM, MVS, point cloud, SIFT, DSM, 3D model
Procedia PDF Downloads 169
21909 An Integrated Real-Time Hydrodynamic and Coastal Risk Assessment Model
Authors: M. Reza Hashemi, Chris Small, Scott Hayward
Abstract:
The Northeast Coast of the US faces damaging effects of coastal flooding and winds due to Atlantic tropical and extratropical storms each year. Historically, several large storm events have produced substantial levels of damage to the region, the most notable of which were the Great Atlantic Hurricane of 1938, Hurricane Carol, Hurricane Bob, and recently Hurricane Sandy (2012). The objective of this study was to develop an integrated modeling system that could be used as a forecasting/hindcasting tool to evaluate and communicate the risk coastal communities face from these coastal storms. This modeling system utilizes the ADvanced CIRCulation (ADCIRC) model for storm surge predictions and the Simulating Waves Nearshore (SWAN) model for the wave environment. These models were coupled, passing information to each other and computing over the same unstructured domain, allowing for the most accurate representation of the physical storm processes. The coupled SWAN-ADCIRC model was validated and has been set up to perform real-time forecast simulations (as well as hindcast). Modeled storm parameters were then passed to a coastal risk assessment tool. This tool, which is generic and universally applicable, generates spatial structural damage estimate maps on an individual structure basis for an area of interest. The required inputs for the coastal risk model included detailed information about the individual structures, inundation levels, and wave heights for the selected region. Additionally, the calculation of wind damage to structures was incorporated. The integrated coastal risk assessment system was then tested and applied to Charlestown, a small vulnerable coastal town along the southern shore of Rhode Island. The modeling system was applied to Hurricane Sandy and a synthetic storm. In both storm cases, the effect of natural dunes on coastal risk was investigated. The resulting damage maps for the area (Charlestown) clearly showed that the dune-eroded scenarios affected more structures and increased the estimated damage. The system was also tested in forecast mode for a large Nor'easter: Stella (March 2017). The results showed a good performance of the coupled model in forecast mode when compared to observations. Finally, the nearshore model XBeach was nested within this regional grid (ADCIRC-SWAN) to simulate nearshore sediment transport processes and coastal erosion. Hurricane Irene (2011) was used to validate XBeach, on the basis of a unique beach profile dataset for the region. XBeach showed a relatively good performance, being able to estimate eroded volumes along the beach transects with a mean error of 16%. The validated model was then used to analyze the effectiveness of several erosion mitigation methods that were recommended in a recent study of coastal erosion in New England: beach nourishment, coastal banks (engineered core), and submerged breakwaters, as well as an artificial surfing reef. It was shown that beach nourishment and coastal banks perform better in mitigating shoreline retreat and coastal erosion.
Keywords: ADCIRC, coastal flooding, storm surge, coastal risk assessment, living shorelines
Procedia PDF Downloads 116
21908 A New Car-Following Model with Consideration of the Brake Light
Authors: Zhiyuan Tang, Ju Zhang, Wenyuan Wu
Abstract:
In this research, a car-following model with consideration of the status of the brake light is proposed. The numerical results show that the stability of the traffic flow is improved. The ability of the brake light to reduce car accidents is also shown.
Keywords: brake light, car-following model, traffic flow, regional planning, transportation
Procedia PDF Downloads 579
21907 Collision Avoidance Based on Model Predictive Control for Nonlinear Octocopter Model
Authors: Doğan Yıldız, Aydan Müşerref Erkmen
Abstract:
The controller of an octocopter is mostly based on the PID controller. For complex maneuvers, such as collision avoidance, PID controllers have limited performance capability. When an octocopter needs to avoid an obstacle, it must instantly perform an agile maneuver. Also, this kind of maneuver is affected severely by the nonlinear characteristics of the octocopter. When these kinds of limitations are considered, the situation is highly challenging for a PID controller. In the proposed study, these challenges are addressed by using a model predictive controller (MPC) for collision avoidance with a nonlinear octocopter model. The aim is to show that MPC-based collision avoidance has the capability to deal with fast-varying conditions in case of obstacle detection and to diminish the nonlinear effects of the octocopter under varying disturbances.
Keywords: model predictive control, nonlinear octocopter model, collision avoidance, obstacle detection
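A minimal receding-horizon sketch of MPC-style obstacle avoidance is given below. It replaces the nonlinear octocopter model with a planar double integrator and enforces avoidance through a soft penalty, so it only illustrates the re-planning loop, not the controller developed in the study; all dynamics, horizon and obstacle values are assumed.

```python
import numpy as np
from scipy.optimize import minimize

dt, N = 0.1, 12                                   # control interval and prediction horizon
goal = np.array([10.0, 0.0])
obstacle, r_safe = np.array([5.0, 0.2]), 1.0      # circular obstacle (assumed scenario)

def rollout(state, controls):
    """Propagate a planar double integrator (a simple stand-in for the full
    nonlinear octocopter dynamics) over the prediction horizon."""
    p, v = state[:2].copy(), state[2:].copy()
    positions = []
    for u in controls.reshape(N, 2):
        v = v + u * dt
        p = p + v * dt
        positions.append(p.copy())
    return np.array(positions)

def cost(controls, state):
    traj = rollout(state, controls)
    tracking = np.sum(np.sum((traj - goal) ** 2, axis=1))
    effort = 0.1 * np.sum(controls ** 2)
    dist = np.sqrt(np.sum((traj - obstacle) ** 2, axis=1))
    avoidance = 200.0 * np.sum(np.maximum(r_safe - dist, 0.0) ** 2)   # soft constraint
    return tracking + effort + avoidance

state = np.array([0.0, 0.0, 0.0, 0.0])            # [px, py, vx, vy]
u_warm = np.zeros(2 * N)
clearances = []
for _ in range(80):                               # receding-horizon loop
    res = minimize(cost, u_warm, args=(state,), method="L-BFGS-B",
                   bounds=[(-3.0, 3.0)] * (2 * N))
    u0 = res.x[:2]                                # apply only the first control input
    state[2:] += u0 * dt
    state[:2] += state[2:] * dt
    u_warm = np.roll(res.x, -2)                   # warm start the next optimization
    clearances.append(np.linalg.norm(state[:2] - obstacle))

print(f"final position: {state[:2].round(2)}, minimum obstacle clearance: {min(clearances):.2f}")
```

Re-solving the finite-horizon problem at every step is what lets the controller react to newly detected obstacles, which is the fast-varying-conditions capability the abstract attributes to MPC.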
Procedia PDF Downloads 191
21906 Fair Value Accounting and Evolution of the Ohlson Model
Authors: Mohamed Zaher Bouaziz
Abstract:
Our study examines the Ohlson Model, which links a company's market value to its equity and net earnings, in the context of the evolution of the Canadian accounting model, characterized by more extensive use of fair value and a broader measure of performance after IFRS adoption. Our hypothesis is that if equity is reported at its fair value, this valuation is closely linked to market capitalization, so the weight of earnings weakens or even disappears in the Ohlson Model. Drawing on Canada's adoption of the International Financial Reporting Standards (IFRS), our results support our hypothesis that equity appears to include most of the relevant information for investors, while earnings have become less important. However, the predictive power of earnings does not disappear.
Keywords: fair value accounting, Ohlson model, IFRS adoption, value-relevance of equity and earnings
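The Ohlson-type valuation regression behind the study links market value to book equity and net earnings; a sketch of estimating it by OLS is shown below, with simulated firm data in place of the Canadian sample.

```python
import numpy as np
import statsmodels.api as sm

# placeholder firm-level data (not the Canadian sample): market value of equity,
# book value of equity and net earnings per firm
rng = np.random.default_rng(42)
n = 300
book_equity = rng.lognormal(mean=5.0, sigma=0.6, size=n)
earnings = 0.1 * book_equity + rng.normal(0.0, 5.0, size=n)
market_value = 1.1 * book_equity + 4.0 * earnings + rng.normal(0.0, 20.0, size=n)

# Ohlson-style valuation regression: MV = b0 + b1 * book equity + b2 * earnings
X = sm.add_constant(np.column_stack([book_equity, earnings]))
model = sm.OLS(market_value, X).fit()
print(model.params)        # weights on book equity and earnings
print(model.rsquared)      # value relevance of the combined information

# the paper's hypothesis can be checked by comparing the earnings coefficient
# (and its incremental R-squared) estimated on pre-IFRS vs post-IFRS subsamples
```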
Procedia PDF Downloads 189
21905 Small Text Extraction from Documents and Chart Images
Authors: Rominkumar Busa, Shahira K. C., Lijiya A.
Abstract:
Text recognition is an important area in computer vision which deals with detecting and recognising text from an image. Optical Character Recognition (OCR) is a saturated area these days, with very good text recognition accuracy. However, when the same OCR methods are applied to text with small font sizes, such as the text in chart images, the recognition rate is less than 30%. This work aims to extract small text in images using a deep learning model, CRNN with CTC loss. The text recognition accuracy is found to improve by applying image enhancement by super resolution prior to the CRNN model. We also observe that the text recognition rate further increases by 18% when applying the proposed method, which involves super resolution and character segmentation followed by CRNN with CTC loss. The efficiency of the proposed method shows that further pre-processing of chart image text and other small text images will improve the accuracy further, thereby helping text extraction from chart images.
Keywords: small text extraction, OCR, scene text recognition, CRNN
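A compact sketch of a CRNN trained with CTC loss, of the kind named in the abstract, is given below in PyTorch: a small CNN produces a feature sequence along the image width, a bidirectional LSTM models it, and nn.CTCLoss aligns the per-timestep predictions with unsegmented label sequences. The architecture sizes and the dummy batch are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class CRNN(nn.Module):
    def __init__(self, num_classes, img_h=32):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
            nn.Conv2d(128, 256, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d((2, 1), (2, 1)),          # halve height only, keep width steps
        )
        self.rnn = nn.LSTM(256 * (img_h // 8), 256, num_layers=2,
                           bidirectional=True, batch_first=False)
        self.fc = nn.Linear(512, num_classes)      # num_classes includes the CTC blank

    def forward(self, x):                          # x: (N, 1, H, W)
        f = self.cnn(x)                            # (N, 256, H/8, W/4)
        n, c, h, w = f.shape
        f = f.permute(3, 0, 1, 2).reshape(w, n, c * h)   # sequence along width: (T, N, C*H)
        out, _ = self.rnn(f)
        return self.fc(out).log_softmax(2)         # (T, N, num_classes)

# toy training step on random crops of small text
model = CRNN(num_classes=37)                       # e.g. 26 letters + 10 digits + blank
ctc = nn.CTCLoss(blank=0, zero_infinity=True)
imgs = torch.randn(4, 1, 32, 128)                  # batch of small-text image crops
targets = torch.randint(1, 37, (4, 6))             # dummy label index sequences
log_probs = model(imgs)
input_lengths = torch.full((4,), log_probs.size(0), dtype=torch.long)
target_lengths = torch.full((4,), 6, dtype=torch.long)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()
print(f"CTC loss on the dummy batch: {loss.item():.3f}")
```

In the pipeline described by the abstract, a super-resolution enhancement step would run on the small-text crops before they are fed to such a model.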
Procedia PDF Downloads 125
21904 Modeling of a Small Unmanned Aerial Vehicle
Authors: Ahmed Elsayed Ahmed, Ashraf Hafez, A. N. Ouda, Hossam Eldin Hussein Ahmed, Hala Mohamed ABD-Elkader
Abstract:
Unmanned Aircraft Systems (UAS) are playing increasingly prominent roles in defense programs and defense strategies around the world. Technology advancements have enabled their development to perform many tasks, such as reconnaissance, surveillance, battle fighting, and communications relay. Simulating small unmanned aerial vehicle (SUAV) dynamics and analyzing its behavior at the preflight stage is important and more efficient. The first step in UAV design is the mathematical modeling of the nonlinear equations of motion. In this paper, a standard method to obtain the full non-linear equations of motion is utilized, and then the linearization of the equations about a steady-state flight condition (trimming) is derived. This modeling technique is applied to an Ultrastick-25e fixed-wing UAV to obtain the linear longitudinal and lateral models. Finally, the model is checked by matching the behavior of the states of the non-linear UAV against the resulting linear model under doublet inputs at the control surfaces.
Keywords: UAV, equations of motion, modeling, linearization
Procedia PDF Downloads 743
21903 A Constitutive Model of Ligaments and Tendons Accounting for Fiber-Matrix Interaction
Authors: Ratchada Sopakayang, Gerhard A. Holzapfel
Abstract:
In this study, a new constitutive model is developed to describe the hyperelastic behavior of collagenous tissues with a parallel arrangement of collagen fibers such as ligaments and tendons. The model is formulated using a continuum approach incorporating the structural changes of the main tissue components: collagen fibers, proteoglycan-rich matrix and fiber-matrix interaction. The mechanical contribution of the interaction between the fibers and the matrix is simply expressed by a coupling term. The structural change of the collagen fibers is incorporated in the constitutive model to describe the activation of the fibers under tissue straining. Finally, the constitutive model can easily describe the stress-stretch nonlinearity which occurs when a ligament/tendon is axially stretched. This study shows that the interaction between the fibers and the matrix contributes to the mechanical tissue response. Therefore, the model may lead to a better understanding of the physiological mechanisms of ligaments and tendons under axial loading.
Keywords: constitutive model, fiber-matrix, hyperelasticity, interaction, ligament, tendon
Procedia PDF Downloads 299
21902 Countercyclical Capital Buffer in the Polish Banking System
Authors: Mateusz Mokrogulski, Piotr Śliwka
Abstract:
The aim of this paper is the identification of periods of excessive credit growth in the Polish banking sector in the years 2007-2014 using different methodologies. Due to the lack of precise guidance in CRD IV regarding methods of calculating the credit gap and related deviations from the long-term trends, a few filtering methods are applied, e.g. Hodrick-Prescott and Baxter-King. Solutions based on the switching model are also proposed. The next step represents computations of both the credit gap and the countercyclical capital buffer (CCB) rates on a quarterly basis. The calculations are carried out for the entire banking sector in Poland, as well as for its components (commercial and co-operative banks), and for different types of loans. The calculations show vividly that in the analysed period there were times of excessive credit growth. However, the results are different for the above-mentioned sub-sectors. Of paramount importance here are mortgage loans, where the outcomes are distorted by high exchange rate fluctuations. Research on the CCB is now going to gain popularity, as the buffer will soon become one of the tools of macroprudential policy under CRD IV. Although the presented method is focused on the Polish banking sector, it can also be applied to other member states, especially the Central and Eastern European countries, which are usually characterized by smaller banking sectors compared to the EU-15.
Keywords: countercyclical capital buffer, CRD IV, filtering methods, mortgage loans
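One of the filtering approaches named in the abstract, the Hodrick-Prescott filter with the Basel smoothing parameter of 400,000 for quarterly credit data, combined with the standard BCBS mapping from the credit-to-GDP gap to a buffer rate (0% below a 2 pp gap, rising linearly to 2.5% at a 10 pp gap), can be sketched as below. The credit-to-GDP series is a synthetic placeholder, not the Polish data, and the plain two-sided filter is used for brevity.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

# synthetic quarterly credit-to-GDP ratio in percent (placeholder, not the Polish series)
rng = np.random.default_rng(0)
quarters = pd.period_range("2007Q1", "2014Q4", freq="Q")
credit_to_gdp = pd.Series(40 + np.linspace(0, 15, len(quarters)) + rng.normal(0, 1.0, len(quarters)),
                          index=quarters)

# Hodrick-Prescott filter with the BCBS smoothing parameter for quarterly credit data
cycle, trend = hpfilter(credit_to_gdp, lamb=400_000)
credit_gap = cycle                               # deviation from the long-term trend, in pp

def ccb_rate(gap_pp, low=2.0, high=10.0, max_buffer=2.5):
    """BCBS mapping: 0% below a 2 pp gap, linear up to 2.5% at a 10 pp gap."""
    return float(np.clip((gap_pp - low) / (high - low), 0.0, 1.0) * max_buffer)

result = pd.DataFrame({"gap_pp": credit_gap.round(2),
                       "ccb_rate_pct": credit_gap.apply(ccb_rate).round(2)})
print(result.tail(8))
```

Swapping the HP filter for Baxter-King or a switching-model trend changes only the gap line; the mapping from gap to buffer rate stays the same.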
Procedia PDF Downloads 322
21901 Approach to Study the Workability of Concrete with the Fractal Model
Authors: Achouri Fatima, Chouicha Kaddour
Abstract:
The main parameters affecting workability are the water content, particle size, and the total surface of the grains, since the mixing water first wets the surface of the grains and then fills the voids between the grains to form entrapped water; the quantity of water remaining is called free water. The aim is to undertake a fractal approach through the relationship between the concrete formulation parameters and workability. To develop this approach, a series of concretes taken from the literature was investigated by varying formulation parameters such as G/S, the quantity of cement C and the quantity of mixing water E. We also call on other models, namely the water layer thickness model and the paste layer thickness model, to judge their relevance, with the following results: the water layer thickness model is considered relevant when there is a variation in the water quantity, while the paste layer thickness model is only applicable if we consider that the paste is made with the grain value Dmax = 2.85, the value from which a stable model is observed.
Keywords: concrete, fractal method, paste thickness, water thickness, workability
Procedia PDF Downloads 379
21900 Plant Leaf Recognition Using Deep Learning
Authors: Aadhya Kaul, Gautam Manocha, Preeti Nagrath
Abstract:
Our environment comprises a wide variety of plants that are similar to each other, and sometimes the similarity between the plants makes the identification process tedious, thus increasing the workload of botanists all over the world. Botanists cannot be accessible all the time for such laborious plant identification; therefore, there is a need for a quick classification model. Along with the identification of plants, it is also necessary to classify a plant as healthy or not, since for a good lifestyle humans require good food, and this food comes from healthy plants. A large number of techniques have been applied to classify plants as healthy or diseased in order to provide a solution. This paper proposes one such method, anomaly detection using autoencoders, applied to a collection of leaf images. In this method, an autoencoder model is built using Keras, the original leaf images are reconstructed, and a threshold loss is found in order to classify the plant leaves as healthy or diseased. A dataset of plant leaves is used to judge the reconstruction performance of the convolutional autoencoders, and the average accuracy obtained is 71.55%.
Keywords: convolutional autoencoder, anomaly detection, web application, FLASK
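A minimal Keras sketch of the reconstruction-error approach described in the abstract is given below: a convolutional autoencoder is trained on healthy-leaf images, a threshold is set from the reconstruction errors on that training set (the exact threshold rule here is an assumption), and test leaves whose error exceeds it are flagged as diseased. The random arrays stand in for the real leaf dataset.

```python
import numpy as np
from tensorflow.keras import layers, models

def build_autoencoder(shape=(64, 64, 3)):
    """Small convolutional autoencoder trained to reconstruct healthy leaves."""
    inp = layers.Input(shape=shape)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(16, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(16, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    out = layers.Conv2D(shape[-1], 3, activation="sigmoid", padding="same")(x)
    model = models.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

# placeholders for real leaf images scaled to [0, 1]; only healthy images are used for training
healthy_train = np.random.rand(200, 64, 64, 3).astype("float32")
leaves_test = np.random.rand(50, 64, 64, 3).astype("float32")

ae = build_autoencoder()
ae.fit(healthy_train, healthy_train, epochs=5, batch_size=32, verbose=0)

# reconstruction error on healthy images sets the anomaly threshold (assumed rule)
recon_err = np.mean((healthy_train - ae.predict(healthy_train)) ** 2, axis=(1, 2, 3))
threshold = recon_err.mean() + 2.0 * recon_err.std()

test_err = np.mean((leaves_test - ae.predict(leaves_test)) ** 2, axis=(1, 2, 3))
is_diseased = test_err > threshold                 # poorly reconstructed leaves are flagged
print(f"threshold = {threshold:.4f}, flagged {int(is_diseased.sum())} of {len(is_diseased)} test leaves")
```

Because the autoencoder only learns to reconstruct healthy leaves, diseased leaves tend to produce larger reconstruction errors, which is the classification signal the abstract relies on.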
Procedia PDF Downloads 163