Search results for: equation model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17829

10449 Design and Development of an Autonomous Beach Cleaning Vehicle

Authors: Mahdi Allaoua Seklab, Süleyman BaşTürk

Abstract:

In the quest to enhance coastal environmental health, this study introduces a fully autonomous beach cleaning machine, a breakthrough in leveraging green energy and advanced artificial intelligence for ecological preservation. Designed to operate independently, the machine is propelled by a solar-powered system, underscoring a commitment to sustainability and the use of renewable energy in autonomous robotics. The vehicle's autonomous navigation is achieved through a sophisticated integration of LIDAR and a camera system, utilizing an SSD MobileNet V2 object detection model for accurate and real-time trash identification. The SSD framework, renowned for its efficiency in detecting objects in various scenarios, is coupled with the lightweight and highly precise MobileNet V2 architecture, making it particularly suited for the computational constraints of on-board processing in mobile robotics. Training of the SSD MobileNet V2 model was conducted on Google Colab, harnessing cloud-based GPU resources to facilitate a rapid and cost-effective learning process. The model was refined with an extensive dataset of annotated beach debris, optimizing the parameters using the Adam optimizer and a cross-entropy loss function to achieve high-precision trash detection. This capability allows the machine to intelligently categorize and target waste, leading to more effective cleaning operations. This paper details the design and functionality of the beach cleaning machine, emphasizing its autonomous operational capabilities and the novel application of AI in environmental robotics. The results showcase the potential of such technology to fill existing gaps in beach maintenance, offering a scalable and eco-friendly solution to the growing problem of coastal pollution. The deployment of this machine represents a significant advancement in the field, setting a new standard for the integration of autonomous systems in the service of environmental stewardship.
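The abstract does not include training code, but the optimization step it names, minimizing a cross-entropy loss with the Adam optimizer, can be sketched on a toy linear classifier. The features, labels, and hyperparameters below are illustrative placeholders, not the authors' actual SSD MobileNet V2 pipeline:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # mean negative log-likelihood of the true class
    return -np.log(probs[np.arange(len(y)), y] + 1e-12).mean()

def adam_step(w, g, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    # one Adam update: biased moment estimates, then bias correction
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 4))        # toy stand-in for image features
y = (X[:, 0] > 0).astype(int)       # toy labels: debris vs. not debris
W = np.zeros((4, 2))
m = np.zeros_like(W)
v = np.zeros_like(W)
losses = []
for t in range(1, 201):
    p = softmax(X @ W)
    losses.append(cross_entropy(p, y))
    onehot = np.eye(2)[y]
    g = X.T @ (p - onehot) / len(y)  # gradient of mean cross-entropy
    W, m, v = adam_step(W, g, m, v, t)
```

In the real system this update would be applied to the full SSD MobileNet V2 parameter set by the training framework, not hand-rolled.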

Keywords: autonomous beach cleaning machine, renewable energy systems, coastal management, environmental robotics

Procedia PDF Downloads 27
10448 Cakrawala Baca Transformation Model into Social Enterprise: A Benchmark Approach from Socentra Agro Mandiri (SAM) and Agritektur

Authors: Syafinatul Fitri

Abstract:

Cakrawala Baca is a social organization in Indonesia that has decided to transform itself into a social enterprise in order to become more sustainable and to create a more sustainable social impact. Cakrawala Baca relies on a voluntary system and serves a passive social target group, funding its programs through fundraising activities that depend on donors or sponsors; the social activities it holds therefore do not create a sustainable social impact. This differs from a social enterprise, which is usually more independent, funding its activities through a social business, pursuing an active social target, and employing professional staff. A social enterprise can therefore sustain its organization and, in turn, create a sustainable social impact. Developing a transformation model from social movement into social enterprise is the focus of this study. To achieve this aim, a benchmark approach is employed, drawing on successful social enterprises in Indonesia that were originally formed as social movements. The benchmark is conducted through internal and external scanning, which yields an understanding of how these organizations transformed into social enterprises. After understanding the transformations of SAM and Agritektur, a transformation pattern is formulated based on the similarities between them. This pattern is then used to formulate a transformation plan for Cakrawala Baca to become a social enterprise.

Keywords: social movement/social organization, non-profit organization (NPO), social enterprise, transformation, benchmark approach

Procedia PDF Downloads 509
10447 Real Time Traffic Performance Study over MPLS VPNs with DiffServ

Authors: Naveed Ghani

Abstract:

With the arrival of higher-speed communication links and mature applications running over the internet, the requirement for reliable, efficient, and robust network designs is rising day by day. Multi-Protocol Label Switching (MPLS) Virtual Private Networks (VPNs) promise to provide optimal network services and are gaining popularity in industry. Enterprise customers are moving to service providers that offer MPLS VPNs, mainly because of the built-in security features and any-to-any connectivity that MPLS VPNs provide. MPLS VPNs improve network performance through fast label switching, as compared to traditional IP forwarding, but traffic classification and policing are still required on a per-hop basis to enhance the performance of real-time traffic, which is delay sensitive (particularly voice and video). QoS (Quality of Service) is the most important factor in prioritizing enterprise networks' real-time traffic such as voice and video. This thesis is focused on the study of QoS parameters (e.g., delay, jitter, and MOS (Mean Opinion Score)) for real-time traffic over MPLS VPNs. The DiffServ (Differentiated Services) QoS model is used over the MPLS VPN network to obtain end-to-end service quality.
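For readers unfamiliar with the metrics named above, a minimal sketch of how jitter and MOS are commonly computed follows, assuming the RFC 3550 smoothed interarrival-jitter estimator and the ITU-T G.107 E-model mapping from R-factor to MOS. The thesis itself may rely on simulator tooling rather than hand-computed metrics:

```python
def rfc3550_jitter(delays):
    """Smoothed interarrival jitter estimate (RFC 3550, Section 6.4.1).

    `delays` is a list of one-way packet delays; the estimator averages
    successive delay differences with a 1/16 gain.
    """
    j = 0.0
    for i in range(1, len(delays)):
        d = abs(delays[i] - delays[i - 1])
        j += (d - j) / 16.0
    return j

def mos_from_r(r):
    """Map an E-model R-factor to MOS (ITU-T G.107 Annex B formula)."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)
```

A default R-factor around 93.2 (no impairments) maps to a MOS of roughly 4.4, and constant delays yield zero jitter, which matches the intuition behind both metrics.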

Keywords: network, MPLS, VPN, DiffServ, MPLS VPN, DiffServ QoS, QoS Model, GNS2

Procedia PDF Downloads 426
10446 Improvement of Model for SIMMER Code for SFR Corium Relocation Studies

Authors: A. Bachrata, N. Marie, F. Bertrand, J. B. Droin

Abstract:

The in-depth understanding of severe accident propagation in Generation IV nuclear reactors is important so that appropriate risk management can be undertaken early in their design process. This paper is focused on model improvements in the SIMMER code in order to perform studies of severe accident mitigation in a Sodium Fast Reactor. During the design process of the mitigation devices dedicated to the extraction of molten fuel from the core region, the molten fuel propagation from the core to the core catcher has to be studied. To this end, analytical as well as complex thermo-hydraulic simulations with the SIMMER-III code are performed. The studies presented in this paper focus on the physical phenomena and associated physical models that influence the corium relocation. Firstly, the molten pool heat exchange with surrounding structures is analysed, since it directly influences the instant of rupture of the dedicated tubes favouring the corium relocation for mitigation purposes. After the corium penetrates the mitigation tubes, fuel-coolant interactions result in the formation of a debris bed. Analyses of debris bed fluidization as well as sinking into a fluid are also presented in this paper.

Keywords: corium, mitigation tubes, SIMMER-III, sodium fast reactor

Procedia PDF Downloads 388
10445 Evaluation of the Effect of Milk Recording Intervals on the Accuracy of an Empirical Model Fitted to Dairy Sheep Lactations

Authors: L. Guevara, L. S. Glória, E. E. Corea, M. A. Ramírez-Zamora, J. A. Salinas-Martinez, J. C. Angeles-Hernandez

Abstract:

Mathematical models are useful for identifying the characteristics of sheep lactation curves in order to develop and implement improved strategies. However, the accuracy of these models is influenced by factors such as the recording regime, mainly the intervals between test-day records (TDR). The current study aimed to evaluate the effect of different TDR intervals on the goodness of fit of the Wood model (WM) applied to dairy sheep lactations. A total of 4,494 weekly TDRs from 156 lactations of dairy crossbred sheep were analyzed. Three new databases were generated from the original weekly TDR data (7D), comprising intervals of 14 (14D), 21 (21D), and 28 (28D) days. The parameters of the WM were estimated using the "minpack.lm" package in the R software. The shape of the lactation curve (typical or atypical) was defined based on the WM parameters. The goodness of fit was evaluated using the mean square of prediction error (MSPE), the root of the MSPE (RMSPE), Akaike's Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the coefficient of correlation (r) between the actual and estimated total milk yield (TMY). The WM showed an adequate estimate of TMY regardless of the TDR interval (P=0.21) and the shape of the lactation curve (P=0.42). However, we found higher values of r for typical curves compared to atypical curves (0.9 vs. 0.74), with the highest values for the 28D interval (r=0.95). In the same way, we observed an overestimated peak yield (0.92 vs. 6.6 l) and an underestimated time of peak yield (21.5 vs. 1.46) in atypical curves. The best values of RMSPE were observed for the 28D interval for both lactation curve shapes. The significantly lowest values of AIC (P=0.001) and BIC (P=0.001) were shown by the 7D interval for typical and atypical curves. These results represent a first approach to defining an adequate recording interval for dairy sheep in Latin America and showed a better fit for the Wood model using a 7D interval. However, it is possible to obtain good estimates of TMY using a 28D interval, which reduces the sampling frequency and would save additional costs for dairy sheep producers.
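The Wood model referred to above is the incomplete-gamma curve y(t) = a·t^b·e^(−ct), which the authors fit nonlinearly with R's minpack.lm. Purely as an illustration, the same curve can be fitted by log-linear least squares; the synthetic, noise-free weekly records below are made up and not the study's data:

```python
import numpy as np

def fit_wood_loglinear(t, y):
    """Fit Wood's lactation curve y = a * t**b * exp(-c*t) by
    log-linear least squares: ln y = ln a + b ln t - c t."""
    A = np.column_stack([np.ones_like(t), np.log(t), -t])
    coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
    a, b, c = np.exp(coef[0]), coef[1], coef[2]
    return a, b, c

t = np.arange(7.0, 211.0, 7.0)          # weekly test-day records (a "7D" regime)
y = 2.0 * t**0.25 * np.exp(-0.01 * t)   # synthetic noise-free lactation curve
a, b, c = fit_wood_loglinear(t, y)
t_peak = b / c                          # time of peak yield for the Wood model
```

For real, noisy records a nonlinear fit such as minpack.lm is preferable, since log-transforming the yields distorts the error structure; the log-linear version is only a convenient starting point.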

Keywords: gamma incomplete, ewes, shape curves, modeling

Procedia PDF Downloads 78
10444 Localized Recharge Modeling of a Coastal Aquifer from a Dam Reservoir (Korba, Tunisia)

Authors: Nejmeddine Ouhichi, Fethi Lachaal, Radhouane Hamdi, Olivier Grunberger

Abstract:

Located in the Cap Bon peninsula (Tunisia), the Lebna dam was built in 1987 to counterbalance the salt intrusion taking place in the coastal aquifer of Korba. The first intention was to reduce coastal groundwater over-pumping by supplying surface water to a large irrigation system. An unexpected beneficial effect was recorded: a direct, localized recharge of the coastal aquifer by leakage through the geological material of the southern bank of the lake. The hydrological balance of the reservoir gave an estimate of the annual leakage volume, but dynamic processes and a sound quantification of the recharge inputs are still required to understand the localized effect of the recharge in terms of piezometry and quality. The present work focuses on simulating the recharge process to confirm this hypothesis, establish a sound quantification of the water supply to the coastal aquifer, and extend it to multi-annual effects. A spatial frame of 30 km² was used for modeling. Intensive outcrop and geophysical surveys based on 68 electrical resistivity soundings were used to characterize the 3D geometry of the aquifer and the limits of the Plio-Quaternary geological material involved in the underground flow paths. Permeabilities were determined using 17 pumping tests on wells and piezometers. Six seasonal piezometric surveys of 71 wells around the southern reservoir banks were performed during the 2019-2021 period. Eight monitoring boreholes providing high-frequency (15 min) piezometric data were used to examine dynamic aspects. Model boundary conditions were specified using the geophysical interpretations coupled with the piezometric maps. The dam-groundwater flow model was built with the Visual MODFLOW software. Firstly, a steady-state calibration based on the first piezometric map of February 2019 was established to estimate the permanent flow associated with the different reservoir levels. Secondly, the piezometric data for the 2019-2021 period were used for transient-state calibration and to confirm the robustness of the model. Preliminary results confirm the temporal link between the reservoir level and the localized recharge flow, with a strong threshold effect for levels below 16 m a.s.l. The good agreement between the computed flow through the recharge cells on the southern banks and the hydrological budget of the reservoir opens the path to future simulation scenarios of the dilution plume driven by the localized recharge. The simulation results also indicate a potential for storage of up to 17 mm/year in existing wells, under gravity-feed conditions, during reservoir level increases over the three years of operation. The Lebna dam groundwater flow model thus characterizes a spatiotemporal relation between groundwater and surface water.

Keywords: leakage, MODFLOW, saltwater intrusion, surface water-groundwater interaction

Procedia PDF Downloads 138
10443 Baseline Study on Human Trafficking Crimes: A Case Study of Mapping Human Trafficking Crimes in East Java Province, Indonesia

Authors: Ni Komang Desy Arya Pinatih

Abstract:

Transnational crime is a crime with a 'unique' feature: its activities exploit the lack of state monitoring at borders, so dealing with it cannot rely on conventional engagement alone but also requires joint operations with other countries. On the other hand, with the flow of globalization and the growth of information technology and transportation, states have become more vulnerable to transnational crime threats, especially human trafficking. This paper examines transnational crime activities, especially human trafficking, in Indonesia. Through a case study mapping human trafficking crimes in East Java province, Indonesia, this paper analyzes how human trafficking crime trends differ at the national and sub-national levels. The findings of this research are, first, that there is a difference in human trafficking crime trends: at the national level the trend is rising, while at the sub-national (province) level the trend is declining. Second, regarding this decline, it is interesting to examine the methods used to decrease human trafficking crime in East Java province and thereby reduce transnational crime in the region. These methods will hopefully become a model for engaging transnational crime in other regions in order to reduce human trafficking numbers as much as possible.

Keywords: transnational crime, human trafficking, southeast Asia, anticipation model on transnational crimes

Procedia PDF Downloads 304
10442 Effectiveness of Using Multiple Non-pharmacological Interventions to Prevent Delirium in the Hospitalized Elderly

Authors: Yi Shan Cheng, Ya Hui Yeh, Hsiao Wen Hsu

Abstract:

Delirium is an acute state of confusion, which is mainly the result of the interaction of many factors, including age over 65 years, comorbidity, cognitive and visual/auditory impairment, dehydration, pain, sleep disorders, pipeline retention, general anesthesia, and major surgery. Research shows that the prevalence of delirium in hospitalized elderly patients is over 50%. If it is not improved in time, it may cause cognitive decline or impairment, which not only prolongs the length of hospital stay but also increases mortality. Studies have shown that multiple non-pharmacological interventions are the most effective and common strategies for improving or preventing delirium in the hospitalized elderly; these include reorientation, early mobility, promoting sleep, and nutritional support (including water intake). In Taiwan, only one study has compared the delirium incidence of older patients who have received orthopedic surgery between multiple non-pharmacological interventions and general routine care. Therefore, the purpose of this study is to address the prevention or improvement of delirium incidence density in medical hospitalized elderly patients, provide clinical nurses with a reference for clinical implementation, and develop follow-up research. This study is a quasi-experimental design using purposive sampling. Samples are drawn from two wards, the geriatric ward and the general medicine ward, at a medical center in central Taiwan. The sample size is estimated at a minimum of 100, and the data will be collected through a self-administered structured questionnaire including demographic and professional evaluation items. Case recruitment began on May 13, 2023. The results will be analyzed with SPSS for Windows 22.0 software using descriptive statistics and inferential statistics: logistic regression, the Generalized Estimating Equation (GEE), and multivariate analysis of variance (MANOVA).

Keywords: multiple nonpharmacological interventions, hospitalized elderly, delirium incidence, delirium

Procedia PDF Downloads 78
10441 Characterization of Double Shockley Stacking Fault in 4H-SiC Epilayer

Authors: Zhe Li, Tao Ju, Liguo Zhang, Zehong Zhang, Baoshun Zhang

Abstract:

In-grown stacking faults (IGSFs) in 4H-SiC epilayers can cause increased leakage current and reduce the blocking voltage of 4H-SiC power devices. The double Shockley stacking fault (2SSF) is a common type of IGSF with double slips on the basal planes. In this study, a 2SSF in a 4H-SiC epilayer grown by chemical vapor deposition (CVD) is characterized. The nucleation site of the 2SSF is discussed, and a model for the 2SSF nucleation is proposed. Homo-epitaxial 4H-SiC is grown on a commercial 4-degree off-cut substrate by a home-built hot-wall CVD. Defect-selective etching (DSE) is conducted with melted KOH at 500 degrees Celsius for 1-2 min. Room-temperature cathodoluminescence (CL) is conducted at a 20 kV acceleration voltage. Low-temperature photoluminescence (LTPL) is conducted at 3.6 K with the 325 nm He-Cd laser line. In the CL image, a triangular area with bright contrast is observed. Two partial dislocations (PDs) with a 20-degree angle between them show linear dark contrast at the edges of the IGSF. CL and LTPL spectra are measured to verify the IGSF's type. The CL spectrum shows the maximum photoemission at 2.431 eV and negligible bandgap emission. In the LTPL spectrum, four phonon replicas are found at 2.468 eV, 2.438 eV, 2.420 eV, and 2.410 eV, respectively. The Egx is estimated to be 2.512 eV. A shoulder red-shifted from the main peak in CL, and a slight protrusion at the same wavelength in LTPL, are identified as the so-called Egx- lines. Based on the CL and LTPL results, the IGSF is identified as a 2SSF. Back etching by neutral loop discharge and DSE are conducted to track the origin of the 2SSF, and the nucleation site is found to be a threading screw dislocation (TSD) in this sample. A nucleation mechanism model is proposed for the formation of the 2SSF. The steps introduced by the off-cut and by the TSD on the surface are both suggested to be two C-Si bilayers in height. The intersections of these two types of steps lie along the [11-20] direction from the TSD, with a four-bilayer step at each intersection. The nucleation of the 2SSF during growth is proposed as follows. Firstly, the upper two bilayers of the four-bilayer step grow down and block the lower two at one intersection, and an IGSF is generated. Secondly, the step-flow successively grows over the IGSF and forms an AC/ABCABC/BA/BC stacking sequence. A 2SSF is thus formed and is extended by the step-flow growth. In conclusion, a triangular IGSF is characterized by the CL approach. Based on the CL and LTPL spectra, the estimated Egx is 2.512 eV, and the IGSF is identified as a 2SSF. By back etching, the 2SSF nucleation site is found to be a TSD. A model for 2SSF nucleation from an intersection of off-cut- and TSD-introduced steps is proposed.
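As a quick aside for readers, CL/PL peak energies in eV map to emission wavelengths via λ = hc/E. The one-line conversion below (constants only, not taken from the paper) shows that the 2.431 eV CL maximum corresponds to green emission near 510 nm:

```python
# hc expressed in eV*nm, so wavelengths come out directly in nanometres
HC_EV_NM = 1239.841984

def ev_to_nm(e_ev):
    """Convert a photon energy in eV to its vacuum wavelength in nm."""
    return HC_EV_NM / e_ev

peak_nm = ev_to_nm(2.431)  # CL maximum reported in the abstract
```

The same conversion applies to the LTPL phonon replicas, whose spacing below the estimated Egx of 2.512 eV reflects the phonon energies involved.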

Keywords: cathodoluminescence, defect-selective etching, double Shockley stacking fault, low-temperature photoluminescence, nucleation model, silicon carbide

Procedia PDF Downloads 316
10440 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option pricing by Black-Scholes models with jumps guarantees that market movement is considered. However, only numerical methods can solve this model, and not all numerical methods are efficient for solving these models because they have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
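The O(M²) to O(M log M) reduction mentioned above comes from applying the FFT to the structured (Toeplitz-like) matrices produced by discretizing the jump integral. A generic sketch of FFT-based Toeplitz matrix-vector multiplication via circulant embedding, not the authors' exact discretization, is:

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply a Toeplitz matrix (first column c, first row r, with
    c[0] == r[0]) by x in O(M log M) via circulant embedding and the FFT."""
    m = len(x)
    # First column of a 2m x 2m circulant whose top-left block is the Toeplitz matrix
    col = np.concatenate([c, [0.0], r[1:][::-1]])
    fx = np.fft.fft(np.concatenate([x, np.zeros(m)]))
    y = np.fft.ifft(np.fft.fft(col) * fx)[:m]
    return y.real

# Dense reference Toeplitz matrix to check the fast product against
rng = np.random.default_rng(1)
m = 8
c = rng.normal(size=m)
r = np.concatenate([[c[0]], rng.normal(size=m - 1)])
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(m)]
              for i in range(m)])
x = rng.normal(size=m)
fast = toeplitz_matvec(c, r, x)
```

Because a circulant matrix is diagonalized by the DFT, the embedded product costs three FFTs of length 2M instead of a dense M×M multiply.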

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 151
10439 Creating a Dementia-Friendly Community

Authors: Annika Kjallman Alm, Ove Hellzen, Malin Rising-Homlstrom

Abstract:

The concept of dementia-friendly communities focuses on the lived experience of people who have dementia and is most relevant to addressing their needs and the needs of those who live with and provide support for them. The goal of communities becoming dementia-friendly is for dementia to be normalized and recognized as a disabling condition. People with dementia find it important to be connected to self, to others, and to the environment through meaningful activities. According to the concept underlying dementia-friendly communities, people with dementia or cognitive decline can continue to live in the community if their residential community has sufficiently strong social capital. The aim of this study is to explore staff and leaders' experiences in implementing interventions to create a more inclusive, dementia-friendly community. A municipality in northern Sweden with a population of approximately 100,000 inhabitants decided to create a dementia-friendly municipality. As part of the initiative, a Centre for Support was established. The Centre offered support for both individuals and groups, made home visits, and provided information about dementia. Interviews were conducted with staff who had undergone training in a structured form of multidimensional support, the PER-model®, and worked at the Centre for Support. The staff consisted of registered nurses, occupational therapists, and specialized nurses who had worked there for more than five years, and all had training in dementia. All interviews were audio-recorded and transcribed verbatim. The transcribed data were analyzed using qualitative content analysis. Results suggest that implementing the PER-model® of support for persons in the early stages of dementia and their next of kin added a much-needed form of support and perceived possibilities to enhance daily life in the early stages of dementia. The staff appreciated that the structure of the PER-model® was evidence based. They also realized that they had never even considered that the person with dementia also needed support in the early stages, but that they now had tools for that as well. Creating a dementia-friendly municipality offering different kinds of support for all stages of dementia is a challenge. However, evidence-based tools and a broad spectrum of different types of support, whether individual or group, are needed to tailor support to everyone's needs. A conviction that all citizens are equal and should be involved in the community is a strong motivator.

Keywords: dementia, dementia-friendly, municipality, support

Procedia PDF Downloads 178
10438 Refinement of Existing Benzthiazole Lead Targeting Lysine Aminotransferase in Dormant Stage of Mycobacterium tuberculosis

Authors: R. Reshma srilakshmi, S. Shalini, P. Yogeeswari, D. Sriram

Abstract:

Lysine aminotransferase (LAT) is a crucial enzyme for dormancy in M. tuberculosis, involved in persistence and antibiotic resistance. In the present work, we attempted to develop benzthiazole derivatives as lysine aminotransferase inhibitors. In our attempts, we also unexpectedly arrived at an interesting compound, 21, (E)-4-(5-(2-(benzo[d]thiazol-2-yl)-2-cyanovinyl)thiophen-2-yl)benzoic acid, which, even though it has only moderate activity against the persistent phase of the mycobacterium, has significant potency against the active phase. In the entire series, compound 22, (E)-4-(5-(2-(benzo[d]thiazol-2-yl)-2-cyanovinyl)thiophen-2-yl)isophthalic acid, emerged as the most potent molecule, with a LAT IC50 of 2.62 µM. It showed significant log reductions of 2.9- and 2.3-fold against nutrient-starved and biofilm-forming mycobacteria, respectively. It was found to be inactive in the MABA assay and in the M. marinum-induced zebrafish model, and it is devoid of cytotoxicity. Compound 22 was also found to possess a bactericidal effect that is independent of concentration and time, and it was effective in combination with rifampicin in a 3D granuloma model. The results are very encouraging, as the hit molecule shows activity against active as well as persistent forms of tuberculosis. The identified hit needs further pharmacokinetic and pharmacodynamic screening before development as a new drug candidate.

Keywords: benzothiazole, latent tuberculosis, LAT, nutrient starvation

Procedia PDF Downloads 330
10437 The Use of a Miniature Bioreactor as Research Tool for Biotechnology Process Development

Authors: Muhammad Zainuddin Arriafdi, Hamudah Hakimah Abdullah, Mohd Helmi Sani, Wan Azlina Ahmad, Muhd Nazrul Hisham Zainal Alam

Abstract:

Biotechnology process development demands numerous experimental works. In a laboratory environment, this is typically carried out using a shake-flask platform. This paper presents the design and fabrication of a miniature bioreactor system as an alternative research tool for bioprocessing. The working volume of the reactor is 100 ml, and it is made of plastic. The main features of the reactor include stirring control, temperature control via an electrical heater, an aeration strategy based on a miniature air compressor, and online optical cell density (OD) sensing. All sensors and actuators integrated into the reactor were controlled using an Arduino microcontroller platform. In order to demonstrate the functionality of such a miniature bioreactor concept, a series of batch Saccharomyces cerevisiae fermentation experiments was performed under various glucose concentrations. Results obtained from the fermentation experiments were used to estimate the Monod equation constants, namely the saturation constant, Ks, and the maximum specific growth rate, μmax, to further highlight the usefulness of the device. The mixing capacity of the reactor was also evaluated. It was found that the results obtained from the miniature bioreactor prototype were comparable to results achieved using a shake flask. The unique feature of the device compared to a shake-flask platform is that the mixing condition in the reactor is much closer to a lab-scale bioreactor setup. The prototype is also integrated with an online OD sensor, so no sampling is needed to monitor the progress of the reaction. Operating cost and medium consumption are also low, making it much more economical for biotechnology process development than lab-scale bioreactors.
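The Monod constants mentioned above relate the specific growth rate to the substrate concentration as μ = μmax·S/(Ks + S). A minimal sketch of estimating Ks and μmax from growth-rate data follows, using a Lineweaver-Burk linearization on made-up glucose values; the authors' actual data and fitting method are not given in the abstract:

```python
import numpy as np

def fit_monod(S, mu):
    """Estimate Monod constants from (substrate, growth-rate) pairs via the
    Lineweaver-Burk linearization 1/mu = (Ks/mu_max)*(1/S) + 1/mu_max."""
    A = np.column_stack([1.0 / S, np.ones_like(S)])
    slope, intercept = np.linalg.lstsq(A, 1.0 / mu, rcond=None)[0]
    mu_max = 1.0 / intercept
    Ks = slope * mu_max
    return mu_max, Ks

S = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])  # toy glucose levels, g/L
mu = 0.45 * S / (1.2 + S)                       # synthetic noise-free rates, 1/h
mu_max, Ks = fit_monod(S, mu)
```

With noisy measurements, the double-reciprocal transform amplifies error at low S, so a direct nonlinear fit of the Monod curve is usually preferred in practice.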

Keywords: biotechnology, miniature bioreactor, research tools, Saccharomyces cerevisiae

Procedia PDF Downloads 117
10436 Optimizing Detection Methods for THz Bio-imaging Applications

Authors: C. Bolakis, I. S. Karanasiou, D. Grbovic, G. Karunasiri, N. Uzunoglu

Abstract:

A new approach for the efficient detection of THz radiation in biomedical imaging applications is proposed. A double-layered absorber, consisting of a 32 nm thick aluminum (Al) metallic layer located on a glass medium (SiO2) of 1 mm thickness, was fabricated and used to design a fine-tuned absorber through a theoretical and finite element modeling process. The results indicate that the proposed low-cost, double-layered absorber can be tuned based on the sheet resistance of the metal layer and the thickness of various glass media, taking advantage of the diversity of the absorption of metal films in the desired THz domain (6 to 10 THz). It was found that the composite absorber could absorb up to 86% (a percentage exceeding the 50% previously shown to be the highest achievable with a single thin metal layer) and reflect less than 1% of the incident THz power. This approach will enable monitoring of the transmission coefficient (the THz transmission "fingerprint") of the biosample with high accuracy, while also making the proposed double-layered absorber a good candidate for a microbolometer pixel's active element. Based on these promising results, a more sophisticated and effective double-layered absorber is under development. The glass medium has been substituted by diluted poly-Si, and the results were twofold: an absorption factor of 96% was reached, and high TCR properties were acquired. In addition, a generalization of these results and properties over the active frequency spectrum was achieved. Specifically, through the development of a theoretical equation taking as input any arbitrary frequency in the IR spectrum (0.3 to 405.4 THz) and giving as output the appropriate thickness of the poly-Si medium, the double-layered absorber retains the ability to absorb 96% and reflect less than 1% of the incident power. As a result, through this post-optimization process and the spread-spectrum frequency adjustment, the efficiency of the microbolometer detector could be further improved.

Keywords: bio-imaging, fine-tuned absorber, fingerprint, microbolometer

Procedia PDF Downloads 348
10435 A Hybrid Algorithm for Collaborative Transportation Planning among Carriers

Authors: Elham Jelodari Mamaghani, Christian Prins, Haoxun Chen

Abstract:

This paper concentrates on collaborative transportation planning (CTP) among multiple carriers with pickup and delivery requests and time windows. This problem is a vehicle routing problem with constraints from standard vehicle routing problems and new constraints from a real-world application. In the problem, each carrier has a finite number of vehicles, and each request is a pickup-and-delivery request with a time window. Moreover, each carrier has reserved requests, which must be served by itself, whereas its exchangeable requests can be outsourced to and served by other carriers. This collaboration among carriers can help them reduce their total transportation costs. A mixed integer programming model is proposed for the problem. To solve the model, a hybrid algorithm that combines a Genetic Algorithm and Simulated Annealing (GASA) is proposed. This algorithm takes advantage of the strengths of both GA and SA. After tuning the parameters of the algorithm with the Taguchi method, experiments are conducted and experimental results are provided for the hybrid algorithm. The results are compared with those obtained by a commercial solver, and the comparison indicates that GASA significantly outperforms it.
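The abstract does not give the GASA pseudocode; the sketch below shows one common way such a hybrid is structured, with GA selection and crossover providing global search and SA refining each offspring locally. It runs on a toy bit-string objective, and all names and parameters are illustrative, not the authors' routing encoding:

```python
import math
import random

def sa_improve(sol, cost, t0=1.0, alpha=0.9, steps=40):
    """Simulated-annealing local search on a bit string (single-bit flips)."""
    cur, cur_c, t = list(sol), cost(sol), t0
    for _ in range(steps):
        cand = list(cur)
        i = random.randrange(len(cand))
        cand[i] ^= 1                      # flip one bit
        d = cost(cand) - cur_c
        if d < 0 or random.random() < math.exp(-d / t):
            cur, cur_c = cand, cur_c + d  # accept improving or uphill move
        t *= alpha                        # geometric cooling
    return cur

def gasa(cost, n_bits=24, pop=20, gens=40, seed=7):
    """Toy GA + SA hybrid: keep an elite half, breed children by one-point
    crossover, and polish every child with the SA routine above."""
    random.seed(seed)
    P = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=cost)
        elite = P[: pop // 2]
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_bits)
            children.append(sa_improve(a[:cut] + b[cut:], cost))
        P = elite + children
    return min(P, key=cost)

# toy objective: the all-ones string has cost zero
cost = lambda s: len(s) - sum(s)
best = gasa(cost)
```

In the actual CTP setting, the chromosome would encode request-to-carrier assignments and routes, and the neighborhood moves would respect the time-window and reserved-request constraints.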

Keywords: centralized collaborative transportation, collaborative transportation with pickup and delivery, collaborative transportation with time windows, hybrid algorithm of GA and SA

Procedia PDF Downloads 392
10434 Detecting Port Maritime Communities in Spain with Complex Network Analysis

Authors: Nicanor Garcia Alvarez, Belarmino Adenso-Diaz, Laura Calzada Infante

Abstract:

In recent years, researchers have shown an interest in modelling maritime traffic as a complex network. In this paper, we propose a bipartite weighted network to model maritime traffic and detect port maritime communities. The bipartite weighted network considers two different types of nodes. The first represents Spanish ports, while the second represents the countries with which there is major import/export activity. The flow between both types of nodes is modeled by weighting the volume of product transported. To illustrate the model, the data is segmented by type of traffic. This allows fine-tuning and the creation of communities for each type of traffic, and therefore the identification of similar ports for a specific type of traffic, which will provide decision-makers with tools to search for alliances or identify their competitors. The traffic with the greatest impact on the Spanish gross domestic product is selected, and the evolution of the communities formed by the most important ports, and their differences between 2009 and 2019, is analyzed. Finally, the set of communities formed by the ports of the Spanish port system is inspected to determine global similarities between them, analyzing the sum of the memberships of the different ports in the communities formed for each type of traffic.
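The community detection in the paper uses Infomap on the bipartite port-country network. As a simple illustration of the underlying data structure only, the sketch below builds a weighted bipartite flow table and projects it onto the port side, linking two ports by their shared-country overlap; the port names, country codes, and volumes are invented:

```python
from collections import defaultdict
from itertools import combinations

# Toy bipartite flows for one traffic type: (port, country) -> tonnes
flows = {
    ("Gijon", "BR"): 120.0, ("Gijon", "US"): 30.0,
    ("Bilbao", "BR"): 100.0, ("Bilbao", "US"): 40.0,
    ("Valencia", "CN"): 200.0, ("Barcelona", "CN"): 180.0,
}

def project_ports(flows):
    """Weighted one-mode projection: two ports are linked with the sum of
    min(flow) over the countries they both trade with (a simple overlap
    measure; Infomap would instead run on the bipartite network itself)."""
    by_country = defaultdict(dict)
    for (port, country), w in flows.items():
        by_country[country][port] = w
    proj = defaultdict(float)
    for ports in by_country.values():
        for a, b in combinations(sorted(ports), 2):
            proj[(a, b)] += min(ports[a], ports[b])
    return dict(proj)

edges = project_ports(flows)
```

Running this per traffic type yields one projected similarity network per segment, which is the structure over which port communities can then be compared across years.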

Keywords: bipartite networks, competition, infomap, maritime traffic, port communities

Procedia PDF Downloads 148
10433 Numinous Luminosity: A Mixed Methods Study of Mystical Light Experiences

Authors: J. R. Dinsmore, R. W. Hood

Abstract:

Experiences of a divine or mystical light are frequently reported in religious/spiritual experiences today, most notably in the context of mystical and near-death experiences. Light of a transcendental nature, and experiences of it, are also widely present and highly valued in many religious and mystical traditions. Despite the significance of this luminosity to the topic of religious experience, efforts to study the phenomenon empirically have been minimal and scattered. This mixed methods study developed and validated a questionnaire for the measurement of numinous luminosity experience and investigated the dimensions and effects of this novel construct using both quantitative and qualitative methodologies. A sequential explanatory design (participant selection model) was used, involving a scale development phase, followed by a correlational study testing hypotheses derived from the literature about its effects on beliefs and well-being, and lastly a phenomenological study of a sample selected from the correlational phase results. The outcomes of the study are a unified theoretical model of numinous luminosity experience across multiple experiential contexts, initial correlational findings regarding the possible mechanism of its reported positive transformational effects, and a valid and reliable instrument for its further empirical study.

Keywords: religious experience, mystical experience, near-death experience, scale development, questionnaire, divine light, mystical light, mystical luminosity

Procedia PDF Downloads 95
10432 Numerical Predictions of Trajectory Stability of a High-Speed Water-Entry and Water-Exit Projectile

Authors: Lin Lu, Qiang Li, Tao Cai, Pengjun Zhang

Abstract:

In this study, a detailed numerical analysis of the trajectory stability and flow characteristics of a high-speed projectile during the water-entry and water-exit process has been carried out. The Zwart-Gerber-Belamri (Z-G-B) cavitation model and the SST k-ω turbulence model based on the Reynolds-Averaged Navier-Stokes (RANS) method are employed. The numerical methodology is validated by comparing experimental photographs of cavitation shape and the experimental underwater velocity with the numerical simulation results. Based on this methodology, the influences of the rotational speed and of the water-entry and water-exit angles of the projectile on trajectory stability and flow characteristics have been investigated in detail. The variations of projectile trajectory and total resistance have been analyzed, respectively. In addition, the cavitation characteristics of water entry and water exit have been presented and analyzed. Results show that rotating the projectile may not be a practical means of achieving stability during water entry and water exit. Furthermore, there ought to be a critical water-entry angle for the water-entry stability of a practical projectile. The impact of the water-exit angle on trajectory stability and cavity phenomena is not as remarkable as that of the water-entry angle.

Keywords: cavitation characteristics, high-speed projectile, numerical predictions, trajectory stability, water-entry, water-exit

Procedia PDF Downloads 136
10431 The Reliability and Shape of the Force-Power-Velocity Relationship of Strength-Trained Males Using an Instrumented Leg Press Machine

Authors: Mark Ashton Newman, Richard Blagrove, Jonathan Folland

Abstract:

The force-velocity (F-V) profile of an individual has been shown to influence success in ballistic movements, independent of the individual's maximal power output; therefore, effective and accurate evaluation of an individual's F-V characteristics, and not solely maximal power output, is important. The relatively narrow range of loads typically utilised during force-velocity profiling protocols, a consequence of the difficulty of obtaining force data at high velocities, may bring into question the accuracy of the F-V slope, along with predictions of the maximum force the system can produce at zero velocity (F₀) and the theoretical maximum velocity against no load (V₀). As such, the reliability of the slope of the force-velocity profile, as well as of V₀, has been shown to be relatively poor in comparison to F₀ and maximal power, and it has been recommended to assess velocity at loads closer to both F₀ and V₀. The aim of the present study was to assess the relative and absolute reliability of a novel instrumented leg press machine which enables the assessment of force and velocity data at loads equivalent to ≤ 10% of one repetition maximum (1RM) through to 1RM during a ballistic leg press movement. The reliability of maximal and mean force, velocity, and power, as well as the respective force-velocity and power-velocity relationships and the linearity of the force-velocity relationship, were evaluated. Sixteen strength-trained males (23.6 ± 4.1 years; 177.1 ± 7.0 cm; 80.0 ± 10.8 kg) attended four sessions; during the initial visit, participants were familiarised with the leg press, modified to include a mounted force plate (Type SP3949, Force Logic, Berkshire, UK) and a Micro-Epsilon WDS-2500-P96 linear positional transducer (LPT) (Micro-Epsilon, Merseyside, UK). Peak isometric force (IsoMax) and a dynamic 1RM, both from a starting position of 81% leg length, were recorded for the dominant leg.
During visits two to four, the participants performed the leg press movement at loads equivalent to ≤ 10%, 30%, 50%, 70%, and 90% 1RM. IsoMax was recorded during each testing visit prior to the dynamic F-V profiling repetitions. The novel leg press machine used in the present study appears to be a reliable tool for measuring force- and velocity-related variables across a range of loads, including velocities closer to V₀, when compared to some of the findings in the published literature. Both linear and polynomial models demonstrated good to excellent levels of reliability for the F-V slope and F₀, respectively, with reliability for V₀ being good using a linear model but poor using a 2nd-order polynomial model. As such, a polynomial regression model may be most appropriate when using a similar unilateral leg press setup to predict maximal force production capabilities, given only a 5% difference between F₀ and the obtained IsoMax values, while a linear model is best suited to predicting V₀.
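The linear F-V model underlying the profiling described above can be sketched briefly: fit a straight line to the measured (velocity, force) pairs, read F₀ off the intercept, V₀ off the zero-force crossing, and the apex of the resulting parabolic power-velocity curve as Pmax = F₀V₀/4. The function below is a generic sketch of that standard computation, not the study's analysis code.

```python
import numpy as np

def fv_profile(velocities, forces):
    """Fit the linear force-velocity model F(v) = F0 + SFV * v and derive
    F0 (force at zero velocity), V0 (velocity at zero force, -F0/SFV),
    and Pmax (apex of the parabolic P-V curve, F0 * V0 / 4)."""
    sfv, f0 = np.polyfit(velocities, forces, 1)   # slope, then intercept
    v0 = -f0 / sfv
    pmax = f0 * v0 / 4.0
    return f0, v0, sfv, pmax
```

Because F₀ and V₀ are extrapolations beyond the measured loads, their accuracy depends on how wide the load range is, which is exactly the concern the abstract raises about narrow profiling protocols.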

Keywords: force-velocity, leg-press, power-velocity, profiling, reliability

Procedia PDF Downloads 58
10430 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method

Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang

Abstract:

Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry and an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location in which it operates. Each product has its own sell-in and sell-out time series, which are forecasted on weekly and monthly scales for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution that uses machine learning models for forecasting. Similar products are combined so that there is one model for each product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is flexible enough to include any new product, or eliminate any existing product, in a product category as requirements change. We show how the machine learning development environment on Amazon Web Services (AWS) can be used to explore a set of forecasting models and create business intelligence dashboards that work with the existing demand planning tools at Nestlé. We explored recent deep neural network (DNN) architectures, which show promising results for a variety of time series forecasting problems. Specifically, we used a DeepAR autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach combining DeepAR with an XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information so that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%. The machine learning (ML) pipeline implemented in the cloud is currently being extended to other product categories and is being adopted by other geographical markets.
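One common way to blend two forecasters, shown below, is to weight each model inversely to its error on a held-out validation window. The paper's actual DeepAR/XGBoost ensemble and interlinking logic are not public, so this inverse-error weighting is only an illustrative stand-in for how two forecast streams might be combined.

```python
import numpy as np

def ensemble_forecast(f_a, f_b, y_valid, a_valid, b_valid):
    """Blend two forecasts (f_a, f_b) for the future horizon using weights
    inversely proportional to each model's mean absolute error (MAE) on a
    validation window (y_valid with per-model predictions a_valid, b_valid)."""
    mae_a = np.mean(np.abs(np.asarray(y_valid) - np.asarray(a_valid)))
    mae_b = np.mean(np.abs(np.asarray(y_valid) - np.asarray(b_valid)))
    w_a = (1.0 / mae_a) / (1.0 / mae_a + 1.0 / mae_b)   # weight of model A
    return w_a * np.asarray(f_a) + (1.0 - w_a) * np.asarray(f_b)
```

With this scheme, the model that tracked recent history more closely dominates the blend, while the weaker model still contributes, which tends to stabilise forecasts across product categories.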

Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series

Procedia PDF Downloads 274
10429 E-Governance: A Key for Improved Public Service Delivery

Authors: Ayesha Akbar

Abstract:

Public service delivery has witnessed significant improvement with the integration of information and communication technology (ICT). ICT not only improves management structures with advanced technology for monitoring service delivery but also provides evidence for informed decisions and policy. Pakistan's public sector organizations have largely been unable to produce good results in ensuring service delivery. Nevertheless, some public sector organizations in Pakistan have adopted modern technology and proved their worth by providing better standards of service delivery. These good indicators provide a sound basis for integrating technology into public sector organizations and for shifting policy towards evidence-based policy making. Rescue-1122 is a public sector organization that provides emergency services and has proved to be a successful model of service delivery, saving human lives and supporting human development in Pakistan. Information about the organization was gathered using a qualitative research methodology, drawing broadly on primary and secondary sources, including the Rescue-1122 website and official reports of organizations such as the UNDP (United Nations Development Programme) and the WHO (World Health Organization), as well as on 10 in-depth interviews with senior administrative staff working in the Lahore offices. This information has been incorporated into the study for a better understanding of the organization and its management procedures. Rescue-1122 represents a successful model of delivering services efficiently in disaster management. The management of Rescue-1122 has strategized its policies and procedures to develop a comprehensive model with integrated technology; this model provides efficient service delivery while maintaining the standards of the organization.
The service delivery model of Rescue-1122 works on two fronts: the front-office interface and the back-office interface. The back office defines operational procedures and ensures staff compliance, whereas the front office, equipped with the latest technology and good infrastructure, handles emergency calls. Both ends are integrated with satellite-based vehicle tracking, a wireless system, a fleet monitoring system, and IP cameras that monitor every move of the staff in order to provide better services and pinpoint distortions in service. The standard time for reaching an emergency spot is 7 minutes, and while a case is being handled, the driver's behavior, traffic volume, and the technical assistance being provided are monitored by the front office. All information is then uploaded from the provincial offices to the main dashboard at the Lahore headquarters. Rescue-1122 leverages this technology to deliver efficient services, to identify flaws where found, and to generate data for informed decision making. Other public sector organizations in Pakistan could develop similar models to integrate technology, improve service delivery, and build evidence for informed decisions and policy making.

Keywords: data, e-governance, evidence, policy

Procedia PDF Downloads 247
10428 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes lie in the high-frequency band, but GRACE can only provide reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals with Gaussian filtering at a 350 km radius, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity change predicted from all slip models became similar at the spatial resolution attainable by GRACE, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, fault models that give different slip amplitudes lead proportionally to different amplitudes in the predicted gravity changes.
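The Gaussian smoothing of SH coefficients mentioned above is conventionally implemented with per-degree weights computed by a recursion (after Jekeli, as popularised for GRACE by Wahr et al.). A minimal sketch follows; the exact convention (normalisation, truncation degree) used in the paper is an assumption here.

```python
import numpy as np

def gaussian_sh_weights(radius_km=350.0, lmax=60, a_km=6371.0):
    """Per-degree weights W_l of the Gaussian averaging kernel used to
    smooth GRACE spherical-harmonic coefficients; `radius_km` is the
    averaging half-width. The recursion is kept to a modest lmax because
    it becomes numerically unstable at high degree."""
    b = np.log(2.0) / (1.0 - np.cos(radius_km / a_km))
    w = np.empty(lmax + 1)
    w[0] = 1.0
    w[1] = (1.0 + np.exp(-2.0 * b)) / (1.0 - np.exp(-2.0 * b)) - 1.0 / b
    for l in range(1, lmax):
        w[l + 1] = -(2.0 * l + 1.0) / b * w[l] + w[l - 1]
    return w
```

Each Stokes coefficient of degree l is simply multiplied by W_l, so high-degree (short-wavelength) terms, where the slip models disagree most, are strongly damped, which is why the filtered predictions converge.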

Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution

Procedia PDF Downloads 355
10427 Finite Element Modeling of Ultrasonic Shot Peening Process Using Multiple Pin Impacts

Authors: Chao-xun Liu, Shi-hong Lu

Abstract:

In spite of its importance to the aerospace and automobile industries, little attention has been devoted to accurate modeling of the ultrasonic shot peening (USP) process. The purpose of this study is therefore to conduct finite element analysis of the process using a realistic multiple-pin impact model with the explicit solver of ABAQUS. We investigate the effect of several key parameters on the residual stress distribution within the target, including impact velocity, incident angle, friction coefficient between pins and target, and number of impacts. The results reveal that the impact velocity and the number of impacts have an obvious effect, and that impacting vertically produces the most favourable residual stress distribution. We then compare the results with data from USP experiments to verify the accuracy of the model. Analysis of the multiple-pin impact data reveals the relationships between peening process parameters and peening quality, which are useful for identifying the parameters that need to be controlled and regulated in order to produce a more beneficial compressive residual stress distribution within the target.

Keywords: ultrasonic shot peening, finite element, multiple pins, residual stress, numerical simulation

Procedia PDF Downloads 448
10426 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely latent Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables; variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves: one half is used for estimation, and the other serves as holdout data. Each model is evaluated by its log likelihood on the holdout data. Performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model (which is better than latent Dirichlet allocation) is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000.
Overall, the deep belief net performs best. We also interpret the hidden variables discovered by binary factor analysis, the restricted Boltzmann machine, and the deep belief net. Hidden variables, characterized by the product categories to which they are related, differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with larger basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing this research with appropriate extensions. Including predictors, especially marketing variables such as price, seems an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of product categories.
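A Bernoulli RBM of the kind described, one visible layer of binary purchases and one hidden layer, is conventionally trained with contrastive divergence. The sketch below implements one-step contrastive divergence (CD-1) in NumPy as a minimal illustration of the model class; it is not the paper's estimation setup, and the hyperparameters are arbitrary.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(baskets, n_hidden=4, lr=0.05, epochs=200, seed=0):
    """Bernoulli RBM trained with CD-1 on binary basket data
    (rows = baskets, columns = product categories)."""
    rng = np.random.default_rng(seed)
    v = np.asarray(baskets, dtype=float)
    n_vis = v.shape[1]
    W = 0.01 * rng.standard_normal((n_vis, n_hidden))
    b_v = np.zeros(n_vis)          # visible (category) biases
    b_h = np.zeros(n_hidden)       # hidden biases
    for _ in range(epochs):
        # Positive phase: hidden activations given the data
        ph = sigmoid(v @ W + b_h)
        h = (rng.random(ph.shape) < ph).astype(float)
        # Negative phase: one Gibbs step back to a reconstruction
        pv = sigmoid(h @ W.T + b_v)
        ph2 = sigmoid(pv @ W + b_h)
        # CD-1 gradient updates
        W += lr * (v.T @ ph - pv.T @ ph2) / len(v)
        b_v += lr * (v - pv).mean(axis=0)
        b_h += lr * (ph - ph2).mean(axis=0)
    return W, b_v, b_h

def reconstruct(v, W, b_v, b_h):
    """Reconstruction probabilities for each category of each basket."""
    ph = sigmoid(np.asarray(v, float) @ W + b_h)
    return sigmoid(ph @ W.T + b_v)
```

Inspecting the columns of W after training is how one would interpret hidden variables via the categories they load on, mirroring the interpretation step in the abstract.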

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 199
10425 Surgical Planning for the Removal of Cranial Spheno-orbital Meningioma by Using Personalized Polymeric Prototypes Obtained with Additive Manufacturing Techniques

Authors: Freddy Patricio Moncayo-Matute, Pablo Gerardo Peña-Tapia, Vázquez-Silva Efrén, Paúl Bolívar Torres-Jara, Diana Patricia Moya-Loaiza, Gabriela Abad-Farfán

Abstract:

This study describes a clinical case and the results of applying additive manufacturing to surgical planning for the removal of a cranial spheno-orbital meningioma. It is verified that the use of personalized anatomical models and cutting guides helps manage the approach to cranial anomalies. Fused Deposition Modeling (FDM), a low-cost additive manufacturing technology, enables the printing of a test anatomical model, which in turn reduces surgery time as well as the morbidity rate. The printing of a personalized cutting guide provides valuable aid to the surgeon, improving the precision of the intervention and reducing its invasiveness during the craniotomy. As part of the results, post-surgical follow-up is included as an instrument to verify the patient's recovery and the validity of the procedure.

Keywords: surgical planning, additive manufacturing, rapid prototyping, fused deposition modeling, custom anatomical model

Procedia PDF Downloads 100
10424 Assessing the Impact of Climate Change on Pulses Production in Khyber Pakhtunkhwa, Pakistan

Authors: Khuram Nawaz Sadozai, Rizwan Ahmad, Munawar Raza Kazmi, Awais Habib

Abstract:

Climate change and crop production are intrinsically associated with each other. This research study is therefore designed to assess the impact of climate change on pulse production in the southern districts of the Khyber Pakhtunkhwa (KP) province of Pakistan. Two pulses (chickpea and mung bean) were selected for this study. Climatic variables such as temperature, humidity, and precipitation, along with pulse production and the area under pulse cultivation, were the major variables of the study. Secondary data on climatic and crop variables for a period of thirty-four years (1986-2020) were obtained from the Pakistan Meteorological Department and the Agriculture Statistics of KP, respectively. Panel data sets for the chickpea and mung bean crops were estimated separately; the analysis validates that both are balanced panel data. The Hausman specification test was run separately on both panel data sets; its findings suggested that a fixed effects model is appropriate for the chickpea panel data, whereas a random effects model is appropriate for the mung bean panel data. The major findings confirm that maximum temperature is statistically significant for chickpea yield: if maximum temperature increases by 1 °C, chickpea yield increases by 0.0463 units. The impact of precipitation, however, was reported as insignificant. Furthermore, humidity was statistically significant and positively associated with chickpea yield. For mung bean, minimum temperature contributed significantly to yield. This study concludes that temperature and humidity can significantly enhance pulse yields. It is recommended that capacity building of pulse growers be undertaken so that they can adopt climate change adaptation strategies. Moreover, the government should ensure the availability of climate-resilient pulse varieties to encourage pulse cultivation.
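The fixed effects model chosen for the chickpea panel can be estimated with the standard within transformation: demean the dependent variable and regressors within each cross-sectional unit, then run OLS on the demeaned data. The one-regressor sketch below illustrates that mechanic with synthetic numbers; it is not the study's actual specification, which has several climate regressors.

```python
import numpy as np

def fixed_effects_beta(y, x, groups):
    """Within (fixed effects) estimator for the one-regressor panel model
    y_it = alpha_i + beta * x_it + e_it: demean y and x within each
    cross-sectional unit, then regress the demeaned y on the demeaned x."""
    y, x, groups = map(np.asarray, (y, x, groups))
    yd, xd = y.astype(float).copy(), x.astype(float).copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= yd[m].mean()    # sweep out the unit effect alpha_i
        xd[m] -= xd[m].mean()
    return float(xd @ yd / (xd @ xd))
```

The demeaning removes each unit's time-invariant effect, which is exactly the property the Hausman test weighs against the efficiency of the random effects estimator.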

Keywords: climate change, pulses productivity, agriculture, Pakistan

Procedia PDF Downloads 44
10423 Self Tuning Controller for Reducing Cycle to Cycle Variations in SI Engine

Authors: Alirıza Kaleli, M. Akif Ceviz, Erdoğan Güner, Köksal Erentürk

Abstract:

The cyclic variations occurring in spark ignition engines, especially under specific operating conditions, make the maximum pressure variable across successive in-cylinder pressure cycles. Minimizing cyclic variations is of great importance for operating effectively near the lean limit, or at low speed and load. Cyclic variations may reduce the power output of the engine, lead to operational instabilities, and result in undesirable engine vibrations and noise. In this study, spark timing is controlled in order to reduce cyclic variations in spark ignition engines. First, an ARMAX model was developed between spark timing and maximum pressure using system identification techniques. Using this model, the maximum pressure of the next cycle is predicted. A self-tuning minimum variance controller was then designed to adjust the spark timing for consecutive cycles of the first cylinder of the test engine so as to regulate the in-cylinder maximum pressure. The performance of the proposed controller is illustrated in real time, and experimental results show that the controller has a reliable effect on cycle-to-cycle variations of maximum cylinder pressure when the engine operates under low-speed conditions.
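The identification core of a self-tuning regulator is typically recursive least squares (RLS), which updates the model parameters cycle by cycle as new pressure data arrive. The sketch below identifies a simple ARX model (the noise-model terms of a full ARMAX are omitted for brevity); the lag orders and forgetting factor are illustrative assumptions, not the paper's spark-timing model.

```python
import numpy as np

def rls_arx(u, y, na=1, nb=1, lam=0.99):
    """Recursive least squares estimation of an ARX model
    y[k] = -a1*y[k-1] - ... + b1*u[k-1] + ...; `u` is the input (e.g. spark
    timing), `y` the output (e.g. peak pressure), `lam` the forgetting factor.
    Returns theta = [a1..a_na, b1..b_nb]."""
    n = na + nb
    theta = np.zeros(n)
    P = 1e4 * np.eye(n)                    # large initial covariance
    for k in range(max(na, nb), len(y)):
        # Regressor: past outputs (negated) then past inputs, newest first
        phi = np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]])
        K = P @ phi / (lam + phi @ P @ phi)            # gain vector
        theta = theta + K * (y[k] - phi @ theta)       # parameter update
        P = (P - np.outer(K, phi @ P)) / lam           # covariance update
    return theta
```

In a self-tuning minimum variance scheme, the controller law is recomputed from the latest theta at each cycle, so the regulator tracks slow drifts in the engine's behaviour.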

Keywords: cyclic variations, cylinder pressure, SI engines, self tuning controller

Procedia PDF Downloads 481
10422 An Integrated Assessment (IA) of Water Resources in the Speightstown Catchment, Barbados Using a GIS-Based Decision Support System

Authors: Anuradha Maharaj, Adrian Cashman

Abstract:

The cross-cutting nature of water as a resource translates into the need for a better understanding of its movement, storage, and loss at all points in the hydro-socioeconomic cycle. An integrated approach to the issue of sustainability means quantitatively understanding the linkages within this cycle, the role of water managers in resource allocation, and the critical factors influencing water scarcity. The Water Evaluation and Planning Tool (WEAP) is an integrative model that combines catchment-scale hydrologic processes with a water management model driven by environmental requirements and socioeconomic demands. The concept of demand priorities is included to represent the areas of greatest use within a given catchment. Located on Barbados’ West Coast, Speightstown and the surrounding areas encompass a well-developed tourist, residential, and agricultural area. The main water resource for this area, as for the rest of the island, is groundwater. The availability of groundwater in Barbados may be adversely affected by projected changes in climate, such as reduced wet season rainfall. Economic development and changing sector priorities, together with climate-related changes, have the potential to affect water resource abundance and, by extension, the allocation of resources, for example, in the Speightstown area. In order to investigate the potential impacts on the Speightstown area specifically, a WEAP model of the study area was developed to estimate the presently available water (baseline reference scenario, 2000-2010). From this baseline, an exploration of projected changes in availability in the near-term (2035-2045) and medium/long-term (2065-2075) time frames is envisioned. The generated estimates can assist water managers to better evaluate the status of, and identify trends in, water use and to formulate adaptation measures to offset future deficits.

Keywords: water evaluation and planning system (WEAP), water availability, demand and supply, water allocation

Procedia PDF Downloads 351
10421 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research will ascertain the major risk factors for diabetes and design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Using the Diabetes Health Indicators Dataset from Kaggle as the research data, the phase relation values of each attribute were used to analyze and choose the attributes that might influence diabetes risk. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as the foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and performance is assessed using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
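The five evaluation metrics named above can all be derived from the model's scores and the confusion-matrix counts. The sketch below computes them from scratch for a binary task (AUC-ROC as the probability that a positive outranks a negative, with ties counted as half); it illustrates the metric definitions, not the study's evaluation pipeline.

```python
import numpy as np

def classification_metrics(y_true, y_score, thresh=0.5):
    """Accuracy, precision, recall, F1 and AUC-ROC for binary labels
    `y_true` given predicted scores `y_score` thresholded at `thresh`."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    y_pred = (y_score >= thresh).astype(int)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    acc = (tp + tn) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    # AUC-ROC: fraction of (positive, negative) pairs ranked correctly
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    return {"accuracy": acc, "precision": prec, "recall": rec,
            "f1": f1, "auc": auc}
```

In a cross-validation loop these would be computed on each held-out fold and averaged, which is how the eight algorithms in the abstract would be ranked.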

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 75
10420 Investigating a Modern Accident Analysis Model for Textile Building Fires through Numerical Reconstruction

Authors: Mohsin Ali Shaikh, Weiguo Song, Rehmat Karim, Muhammad Kashan Surahio, Muhammad Usman Shahid

Abstract:

Fire investigations face challenges due to the complexity of fire development, and real-world accidents lack repeatability, making it difficult to apply standardized approaches. The unpredictable nature of fires and the unique conditions of each incident add to this complexity, requiring innovative methods and tools for effective analysis and reconstruction. This study proposes a modern accident analysis model based on numerical reconstruction for fire investigation in textile buildings. The method employs computer simulation to enhance the overall effectiveness of textile building investigations: the materials and evidence collected from past incidents are used to reconstruct fire occurrence, progression, and the catastrophic process. The approach is demonstrated through a case study of a tragic textile factory fire in Karachi, Pakistan, which claimed 257 lives. The reconstruction method proves invaluable for determining fire origins, assessing losses, establishing accountability, and, significantly, providing preventive insights for complex fire incidents.

Keywords: fire investigation, numerical simulation, fire safety, fire incident, textile building

Procedia PDF Downloads 65