Search results for: software defined storage

7313 Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

Authors: J. R. Wang, S. W. Chen, Y. Chiang, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih

Abstract:

In this research, the HABIT analysis methodology was established for the Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), related reports, and other plant data were used in this study. The HABIT methodology was applied to evaluate control room habitability under a CO2 storage burst. The HABIT results were below the R.G. 1.78 failure criteria, indicating that Maanshan NPP control room habitability can be maintained. Additionally, a sensitivity study of the parameters (wind speed, atmospheric stability classification, air temperature, and control room intake flow rate) was performed in this research.
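
As an illustration of the parameter sweep such a sensitivity study involves, the sketch below loops over the four parameters and flags each case against a concentration limit; HABIT is an external code, so run_habit_case and the limit value here are toy placeholders, not the study's actual model or the R.G. 1.78 numbers.

```python
from itertools import product

LIMIT_PPM = 30000.0   # placeholder limit; actual limits come from R.G. 1.78

def run_habit_case(ws, stab, temp, flow):
    """Stand-in for an actual HABIT run returning control-room CO2
    concentration (ppm): a toy dilution estimate so the sweep executes."""
    stability_factor = {"D": 1.0, "E": 1.5, "F": 2.2, "G": 3.0}[stab]
    return 2.0e7 * stability_factor / (ws * flow) * (1.0 + 0.002 * temp)

wind_speeds = [0.5, 1.0, 2.0]               # m/s
stability_classes = ["D", "E", "F", "G"]    # Pasquill classes
air_temps = [15.0, 25.0, 35.0]              # deg C
intake_flows = [800.0, 1200.0]              # CFM

for ws, stab, temp, flow in product(wind_speeds, stability_classes,
                                    air_temps, intake_flows):
    conc = run_habit_case(ws, stab, temp, flow)
    verdict = "PASS" if conc < LIMIT_PPM else "FAIL"
    print(f"u={ws} m/s, class {stab}, T={temp} C, Q={flow} CFM: "
          f"{conc:9.0f} ppm {verdict}")
```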

Keywords: PWR, HABIT, Habitability, Maanshan

Procedia PDF Downloads 445
7312 Using HABIT to Estimate the Concentration of CO2 and H2SO4 for Kuosheng Nuclear Power Plant

Authors: Y. Chiang, W. Y. Li, J. R. Wang, S. W. Chen, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih

Abstract:

In this research, the HABIT code was used to estimate the concentrations under CO2 and H2SO4 storage burst conditions for the Kuosheng nuclear power plant (NPP). The Final Safety Analysis Report (FSAR) and related reports were used in this research. In addition, to evaluate the control room habitability in these cases, the HABIT analysis results were compared with the R.G. 1.78 failure criteria; the comparison shows that the HABIT results are below the criteria. Additionally, sensitivity studies (stability classification, wind speed, and control room intake rate) were performed in this study.

Keywords: BWR, HABIT, habitability, Kuosheng

Procedia PDF Downloads 489
7311 Lies and Pretended Fairness of Police Officers in Sharing

Authors: Eitan Elaad

Abstract:

The current study examined lying and pretended fairness by police personnel in sharing situations. Forty Israeli police officers and 40 laypeople from the community, all males, self-assessed their lie-telling ability, rated the frequency of their lies, evaluated the acceptability of lying, and indicated their use of rational and intuitive thinking while lying. Next, following the ultimatum game procedure, participants were asked to share 100 points with an imagined target, either a male police officer or a male non-police officer. Participants allocated points to the target person bearing in mind that the other person could accept or reject their offer. Participants' goal was to retain as many points as possible, and to this end, they could tell the target person that fewer than 100 points were available for distribution. We defined concealment, or lying, as the difference between the available 100 points and the number of points designated as available for sharing. Results indicated that police officers lied less to fellow police targets than to non-police targets, whereas laypeople lied less to non-police targets than to imagined police targets. The ratio between the points offered to the imagined target person and the points the participant declared as available for sharing defined pretended fairness. Higher pretended fairness indicates a stronger motivation to display fair sharing, even if the fair sharing is fictitious. Police officers presented higher pretended fairness to police targets than laypeople did, whereas laypeople showed more fairness to non-police targets than police officers did. We discuss the results in terms of occupational solidarity and loyalty among police personnel. Specifically, police work involves uncertainty, danger and risk, coercive authority, and the use of force, which isolates the police from the community and dictates strong bonds of solidarity between police personnel. It is no wonder that police officers shared more points with (lied less to) fellow police targets than non-police targets. On the other hand, police legitimacy, the belief that the police act honestly in the best interest of the citizens, shapes citizens' attitudes toward the police. The relatively low number of points laypeople declared available for distribution to police targets indicates difficulties with the legitimacy of the Israeli police.
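
A minimal sketch of the two measures as defined above, assuming 100 available points per trial (variable names are ours, not the authors'):

```python
def lying_score(points_declared, total_points=100):
    """Concealment: points withheld from the declared pool."""
    return total_points - points_declared

def pretended_fairness(points_offered, points_declared):
    """Ratio of the offer to the pool the participant declared available."""
    return points_offered / points_declared

# Example: a participant declares 60 of the 100 points and offers 30 of them.
print(lying_score(60))              # 40 points concealed
print(pretended_fairness(30, 60))   # 0.5, a seemingly 'fair' 50/50 split
```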

Keywords: lying, fairness, police solidarity, police legitimacy, sharing, ultimatum game

Procedia PDF Downloads 114
7310 Non-Destructive Test of Bar for Determination of Critical Compression Force Directed towards the Pole

Authors: Boris Blostotsky, Elia Efraim

Abstract:

The phenomenon of buckling of structural elements under compression arises in many loading cases and must be considered in many structures and mechanisms. In the present work, the method and results of a dynamic buckling test of a bar loaded by a compression force directed towards the pole are considered. The critical force for such a system has not previously been determined experimentally. The tested object is a bar with a semi-rigid connection to the base at one end and a hinge moving along a circle at the other. The test consists of measuring the natural frequency of the bar at different values of the compression load. The lateral stiffness is calculated from the natural frequency and the reduced mass at the bar's movable end. The critical load is determined by extrapolating the values of lateral stiffness to zero. For the experimental investigation, a special test-bed was created that allows stability testing at positive and negative curvature of the movable end's trajectory, as well as varying the rotational stiffness of the connection at the other end. Reducing friction at the movable end makes it possible to extend the range of applied compression forces. The testing method includes:
- a methodology for planning the experiment that allows determining the required number of tests at various load values in the defined range, and the type of extrapolating function;
- a methodology for the experimental determination of the reduced mass at the bar's movable end, including its own mass;
- a methodology for the experimental determination of the lateral stiffness of the uncompressed bar with a rotational semi-rigid connection at the base.
For planning the experiment and for comparing the experimental results with the theoretical values of the critical load, the analytical dependencies of the bar's lateral stiffness on the compression load were derived for the defined end conditions. In the particular case of a perfectly rigid connection of the bar to the base, the critical load value corresponds to the solution by S. P. Timoshenko. Good agreement between the calculated and experimental values was obtained.
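
To make the extrapolation step concrete, here is a minimal sketch of the dynamic method under a single-degree-of-freedom idealisation, with illustrative numbers in place of the authors' measurements: the lateral stiffness follows from the measured natural frequency f and the reduced mass m as k = m(2*pi*f)^2, and fitting a linear extrapolating function to k versus the compression load P gives the load at which k vanishes, i.e., the critical load.

```python
import numpy as np

m = 2.5                                    # reduced mass at movable end, kg (assumed)
P = np.array([0.0, 200.0, 400.0, 600.0])   # applied compression loads, N
f = np.array([6.1, 5.3, 4.4, 3.1])         # measured natural frequencies, Hz

k = m * (2.0 * np.pi * f) ** 2             # lateral stiffness per test, N/m
slope, intercept = np.polyfit(P, k, 1)     # linear extrapolating function k(P)
P_cr = -intercept / slope                  # load at which k vanishes
print(f"estimated critical load: {P_cr:.0f} N")
```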

Keywords: non-destructive test, buckling, dynamic method, semi-rigid connections

Procedia PDF Downloads 355
7309 Assessment and Mitigation of Slope Stability Hazards Along Kombolcha-Desse Road, Northern Ethiopia

Authors: Biruk Wolde Eremacho

Abstract:

The Kombolcha-Desse road, which links Addis Ababa with towns in northern Ethiopia, traverses one of the most difficult mountainous ranges in Ethiopia. The presence of loose unconsolidated (colluvial) materials, highly weathered and fractured basalt rocks, high relief, steep natural slopes, the nature of the geologic formations exposed along the road section, poor drainage conditions, high seasonal rains, and the seismically active nature of the region create favorable conditions for slope instability in the area. With all these points in mind, the present study was conceived to examine the slope stability of the area in detail. Detailed slope stability studies along this road section are necessary to identify critical slopes and to provide the best remedial measures to minimize the slope instability problems that frequently disrupt and endanger traffic on this important road. Based on field manifestations of instability, the two most critical slope sections were identified for detailed analysis. A deterministic slope stability analysis approach was followed for the selected slope sections. The factor of safety for the selected slope sections was determined for the anticipated conditions (static and dynamic, with varied water saturations) using the Slope/W and Slide software. Both static and seismic slope stability analyses were carried out, and the factor of safety was deduced for each anticipated condition. In general, the detailed analysis of the two critical slope sections reveals that only under the static dry condition would both slope sections be stable; under the remaining anticipated static and dynamic conditions with varied water saturations, both critical slope sections would be unstable. Moreover, slope instability in the study area is governed by several factors, so an integrated approach to remedial measures is more appropriate for mitigating possible slope instability. Based on site conditions and the slope stability analysis results, four types of suitable preventive and remedial measures are recommended: proper drainage management, retaining structures, gabions, and management of steep cut slopes.
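
For reference, the factor of safety in such deterministic analyses is the ratio of resisting to driving forces along the slip surface; a common textbook pseudo-static form for a planar slip surface (a generic expression, not taken from this paper, whose computations are done in Slope/W and Slide) is:

```latex
FS = \frac{cA + \left(W\cos\psi - k_h W\sin\psi - U\right)\tan\phi}
          {W\sin\psi + k_h W\cos\psi}
```

where c and phi are the cohesion and friction angle, A the slip-surface area, W the weight of the sliding block, psi the dip of the slip plane, U the water pressure force on the plane (larger at higher saturation), and k_h the horizontal seismic coefficient (k_h = 0 in the static case). This form makes visible why the saturated and seismic conditions drive FS below unity while the static dry case remains stable.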

Keywords: factor of safety, remedial measures, slope stability analysis, static and dynamic condition

Procedia PDF Downloads 279
7308 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing at larger volumes. As components become more integrated, devices are tested for their full functionality using advanced software tools, and benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, they are custom-built for every product and remain unusable for other variants; a majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all of these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique strengths, but no single tool or framework satisfies all the testing needs of embedded systems; hence the need for an extensible framework that integrates a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional methods involve developing test libraries and support components for every new hardware platform, even within the same domain and with identical hardware architecture. This approach has drawbacks: non-reusability, since platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in test environment setup, scalability, and maintenance. A desirable strategy is one focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, across phases of testing, and across a family of products. To overcome the stated challenges of the conventional method and deliver the benefits of embedded testing, an Embedded Test Framework (ETF), a solution accelerator, was designed; it can be deployed in embedded-system products with minimal customization and maintenance to accelerate hardware testing. The ETF supports testing of different hardware, including microprocessor- and microcontroller-based platforms. It offers benefits such as: (1) Time-to-market: accelerates board bring-up with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers). (2) Reusability: framework components are isolated from platform-specific hardware initialization and configuration, making the adaptation of test cases across various platforms quick and simple. (3) An effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework. (4) Continuous integration: pre-integrated with Jenkins, enabling continuous testing and automated software updates. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time-to-market to a large extent.
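
As an illustration of the reusability principle described above, the sketch below separates platform-specific hardware bring-up from the test library behind an abstract board interface, so that only the thin platform layer is re-implemented per hardware family; class and method names are illustrative, not the actual ETF API.

```python
from abc import ABC, abstractmethod

class Board(ABC):
    """Platform layer: all HW-specific bring-up is isolated here."""
    @abstractmethod
    def init(self) -> None: ...
    @abstractmethod
    def read_register(self, addr: int) -> int: ...

class SimBoard(Board):
    """Example platform implementation (a simulated board for host runs)."""
    def init(self) -> None:
        pass
    def read_register(self, addr: int) -> int:
        return 0xA5A5A5A5

class TestCase(ABC):
    name = "unnamed"
    @abstractmethod
    def run(self, board: Board) -> bool: ...

class BootRomChecksumTest(TestCase):
    """Written once against the Board interface, reusable on any platform."""
    name = "boot-rom-checksum"
    def run(self, board: Board) -> bool:
        board.init()
        return board.read_register(0x0000) != 0xFFFFFFFF  # toy check

def run_suite(board: Board, tests: list) -> None:
    for test in tests:
        print(f"{test.name}: {'PASS' if test.run(board) else 'FAIL'}")

run_suite(SimBoard(), [BootRomChecksumTest()])
```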

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 145
7307 Characterization of Volatiles Botrytis cinerea in Blueberry Using Solid Phase Micro Extraction, Gas Chromatography Mass Spectrometry

Authors: Ahmed Auda, Manjree Agarwal, Giles Hardy, Yonglin Ren

Abstract:

Botrytis cinerea is a major pest of many plants and can attack a wide range of plant parts, including buds, flowers, leaves, stems, and fruit. However, B. cinerea can be confused with other diseases that cause similar damage. There are many species of Botrytis, and more than one distinct strain of each. Botrytis may infect the foliage of nursery stock stored through winter in damp conditions, and there are no known resistant plants. Botrytis must have a nutrient or food source before it infests the plant; nutrients leaking from wounded plant parts or dying tissue, such as old flower petals, supply what is required. From this food base, the fungus becomes more aggressive and invades healthy tissue, and dark to light brown rot forms in the diseased tissue. High humidity conditions support the growth of this fungus. We suppose that selection pressure can act on the morphological and neurophysiological filter properties of the receiver, and on both the biochemical and the physiological regulation of the signal; communication is implied when signal and receiver evolve toward increasingly specific matching. On the other hand, receivers respond to portions of a body odor bouquet that is released to the environment not as an (intentional) signal but as an unavoidable consequence of metabolic activity or tissue damage. Each year, Botrytis species cause considerable economic losses to plant crops; even with the application of strict quarantine and control measures, these fungi can still find their way into crops and lead to the imposition of onerous restrictions on exports. Blueberry fruit mould caused by fungal infection usually results in major losses during post-harvest storage; therefore, management of infection in the early stages of disease development is necessary to minimize losses. The overall purpose of this study is to develop sensitive, cheap, quick, and robust diagnostic techniques for the detection of B. cinerea in blueberry. The specific aim was to investigate the performance of volatile organic compounds (VOCs) in the detection and discrimination of blueberry fruits infected by fungal pathogens, with an emphasis on Botrytis, in the early post-harvest storage stage.

Keywords: botrytis cinerea, blueberry, GC/MS, VOCs

Procedia PDF Downloads 241
7306 Domain-Specific Languages Evaluation: A Literature Review and Experience Report

Authors: Sofia Meacham

Abstract:

In this paper, Domain-Specific Language (DSL) evaluation is presented based on existing literature and years of experience developing DSLs for several domains, ranging from AI, business applications, and finance/accounting to health. DSLs have been utilised in many domains to provide tailored and efficient solutions to specific problems. Although they are a reputable method in highly technical circles and have also been used successfully by non-technical experts, to our knowledge there is no commonly accepted method for evaluating them. Some methods define criteria that are adaptations of general software engineering quality criteria. Other literature focuses on the usability aspect of DSL evaluation and applies methods such as Human-Computer Interaction (HCI) and goal modeling. These approaches are either hard to introduce, such as goal modeling, or seem to ignore the domain-specific focus of DSLs. In our experience, DSLs have domain-specificity at their core, and consequently the methods used to evaluate them should also include domain-specific criteria at their core. Such criteria require synergy between the domain experts and the DSL developers, in the same way that DSLs cannot be developed without domain experts' involvement. Methods from agile and other software engineering practices, such as co-creation workshops, should be further emphasised and explored to facilitate this direction. Concluding, our latest experience and plans for DSL evaluation will be presented and opened for discussion.

Keywords: domain-specific languages, DSL evaluation, DSL usability, DSL quality metrics

Procedia PDF Downloads 103
7305 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction

Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili

Abstract:

Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, automatically extracting structured fields such as sender name, amount, and VAT rate from them is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that is able to reliably extract all required fields for up to 70% of all documents in the data set, more than any previously reported method. Approaches are described for 1) detecting, through visual features, which template a given document belongs to, 2) automatically generating extraction rules for a new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates from as few as one training sample and only requires the ground-truth field values instead of detailed annotations, such as bounding boxes, that are hard to obtain. The system is deployed and used inside commercial accounting software.
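
A minimal sketch of the compositional regular-expression idea, assuming a component library of value patterns and a single training sample with its ground-truth value (the component set and generation step are simplified illustrations, not the system's actual rules):

```python
import re

COMPONENTS = {
    "amount":  r"\d{1,3}(?:[.,]\d{3})*[.,]\d{2}",
    "percent": r"\d{1,2}(?:[.,]\d+)?\s?%",
    "date":    r"\d{2}[./-]\d{2}[./-]\d{4}",
}

def generate_rule(sample_line, truth_value):
    """Compose a rule from one training line: a literal prefix taken from
    the sample plus whichever value component matches the truth value."""
    idx = sample_line.find(truth_value)
    if idx < 0:
        return None
    prefix = re.escape(sample_line[:idx].strip())
    for pattern in COMPONENTS.values():
        if re.fullmatch(pattern, truth_value):
            return rf"{prefix}\s*(?P<value>{pattern})"
    return None

# One training sample is enough to generate a rule ...
rule = generate_rule("Total amount due: 1.234,56", "1.234,56")
# ... which then generalizes to unseen documents of the same template.
m = re.search(rule, "Total amount due: 9.876,54")
print(m.group("value"))   # -> 9.876,54
```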

Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software

Procedia PDF Downloads 130
7304 Short Association Bundle Atlas for Lateralization Studies from dMRI Data

Authors: C. Román, M. Guevara, P. Salas, D. Duclap, J. Houenou, C. Poupon, J. F. Mangin, P. Guevara

Abstract:

Diffusion Magnetic Resonance Imaging (dMRI) allows the non-invasive study of human brain white matter. From diffusion data, it is possible to reconstruct fiber trajectories using tractography algorithms. Our previous work consists of an automatic method for identifying short association bundles of the superficial white matter (SWM), based on whole-brain inter-subject hierarchical clustering applied to a HARDI database. The method finds representative clusters of similar fibers belonging to a group of subjects, according to a distance measure between fibers and using a non-linear registration (DTI-TK). The algorithm performs automatic labeling based on the anatomy, defined by a cortex mesh parcellated with the FreeSurfer software. The clustering was applied to two independent groups of 37 subjects. The clusters resulting from both groups were compared using a restrictive threshold on the mean distance between each pair of bundles from different groups, in order to keep reproducible connections. In the left hemisphere, 48 reproducible bundles were found, while 43 bundles were found in the right hemisphere. An inter-hemispheric bundle correspondence was then applied: the symmetric horizontal reflection of the right bundles was calculated to obtain their position in the left hemisphere, and the intersection between similar bundles was computed. Pairs of bundles with a fiber intersection percentage higher than 50% were considered similar. Similar bundles between the hemispheres were fused and symmetrized, yielding 30 bundles common to both hemispheres. An atlas was created with the resulting bundles and used to segment 78 new subjects from another HARDI database, using a distance threshold between 6 and 8 mm according to bundle length. Finally, a laterality index was calculated based on bundle volume. Seven bundles of the atlas presented right laterality (IP_SP_1i, LO_LO_1i, Op_Tr_0i, PoC_PoC_0i, PoC_PreC_2i, PreC_SM_0i, and RoMF_RoMF_0i) and one presented left laterality (IP_SP_2i); no tendency of lateralization according to brain region was observed. Many factors can affect the results, such as tractography artifacts, subject registration, and bundle segmentation. Further studies are necessary to establish the influence of these factors and to evaluate SWM laterality.
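
A minimal sketch of a volume-based laterality index of the kind described above (the exact normalisation used in the paper may differ):

```python
def laterality_index(vol_left: float, vol_right: float) -> float:
    """Volume-based laterality: LI > 0 left-lateralized, LI < 0 right."""
    return (vol_left - vol_right) / (vol_left + vol_right)

# Example: a bundle with larger right-hemisphere volume
print(laterality_index(1200.0, 1500.0))   # -> -0.11, right laterality
```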

Keywords: dMRI, hierarchical clustering, lateralization index, tractography

Procedia PDF Downloads 331
7303 Quantum Graph Approach for Energy and Information Transfer through Networks of Cables

Authors: Mubarack Ahmed, Gabriele Gradoni, Stephen C. Creagh, Gregor Tanner

Abstract:

High-frequency cables commonly connect modern devices and sensors. Interestingly, the proportion of electric components is rising fast in an attempt to achieve lighter and greener devices. Modelling the propagation of signals through these cable networks in the presence of parameter uncertainty is a daunting task. In this work, we study the response of high-frequency cable networks using both Transmission Line (TL) and Quantum Graph (QG) theories. We have successfully compared the two theories in terms of reflection spectra using measurements on real, lossy cables. We have derived a generalisation of the vertex scattering matrix to include non-uniform networks, i.e., networks of cables with different characteristic impedances and propagation constants. The QG model implicitly takes into account the pseudo-chaotic behavior of the propagating electric signal at the vertices. We have successfully compared the asymptotic growth of the eigenvalues of the Laplacian with the predictions of Weyl's law. We investigate the nearest-neighbour level-spacing distribution of the resonances and compare our results with the predictions of Random Matrix Theory (RMT); to achieve this, we compare our graphs with the generalisation of the Wigner distribution for open systems. The problem of scattering from networks of cables can also provide an analogue model for wireless communication in highly reverberant environments. In this context, we provide a preliminary analysis of the statistics of communication capacity for communication across cable networks, whose eventual aim is to enable detailed laboratory testing of information transfer rates using software-defined radio; we specialise this analysis in particular to the case of MIMO (Multiple-Input Multiple-Output) protocols. We have successfully validated our QG model with both the TL model and laboratory measurements. The growth of eigenvalues compares well with Weyl's law, and the level-spacing distribution agrees well with RMT predictions. The results achieved in the MIMO application compare favourably with the predictions of a parallel ongoing research effort (sponsored by NEMF21).
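
For reference, the standard forms usually used in such comparisons (generic expressions, not derived in this abstract): the Weyl law for a metric graph of total edge length L, giving the asymptotic counting function of eigenvalues up to wavenumber k, and the GOE Wigner surmise for the nearest-neighbour spacing distribution:

```latex
N(k) \approx \frac{L}{\pi}\, k
\qquad\qquad
P(s) = \frac{\pi}{2}\, s\, e^{-\pi s^{2}/4}
```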

Keywords: eigenvalues, multiple-input multiple-output, quantum graph, random matrix theory, transmission line

Procedia PDF Downloads 173
7302 Captives on the Frontier: An Exploration of National Identity in Argentine Literature and Art

Authors: Carlos Riobo

Abstract:

This paper analyzes literature and art in Argentina from the nineteenth to the twenty-first centuries as these media used the figure of the white female captive to define a developing national identity. This identity excluded the Indians whose lands the whites were taking and who appeared as the aggressors and captors in writing and paintings. The paper identifies the complicit relationship between art and history in crafting national memory. It also identifies a movement toward purity (as defined by separation of entities) and away from mestizaje (racial and cultural mixtures).

Keywords: Argentina, borders, captives, literature, painting

Procedia PDF Downloads 163
7301 Structural Design Optimization of Reinforced Thin-Walled Vessels under External Pressure Using Simulation and Machine Learning Classification Algorithm

Authors: Lydia Novozhilova, Vladimir Urazhdin

Abstract:

An optimization problem for reinforced thin-walled vessels under uniform external pressure is considered. Conventional approaches to optimization generally start with pre-defined geometric parameters of the vessel and then employ analytic or numeric calculations and/or experimental testing to verify functionality, such as stability under the projected conditions. The proposed approach consists of two steps. First, the feasibility domain is identified in the multidimensional parameter space; every point in the feasibility domain defines a design satisfying both geometric and functional constraints. Second, an objective function defined on this domain is formulated and optimized. The broader applicability of the suggested methodology is maximized by using the Support Vector Machine (SVM) classification algorithm of machine learning to identify the feasible design region. Training data for the SVM classifier are obtained using the Simulation package of SOLIDWORKS®. Based on these data, the SVM algorithm produces a curvilinear boundary separating admissible and inadmissible sets of design parameters with maximal margins. Optimization of the vessel parameters within the feasibility domain is then performed using standard algorithms for constrained optimization. As an example, optimization of a ring-stiffened closed cylindrical thin-walled vessel with semi-spherical caps under high external pressure is implemented. As the functional constraint, the von Mises stress criterion is used, but any other stability constraint admitting a mathematical formulation can be incorporated into the proposed approach. The suggested methodology has good potential for reducing the design time for finding optimal parameters of thin-walled vessels under uniform external pressure.
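
A minimal sketch of the two-step approach, with toy data standing in for the SOLIDWORKS Simulation results (the feasibility labels below come from a made-up stress surrogate, not the von Mises computation, and the grid search is only a stand-in for the constrained optimizer):

```python
import numpy as np
from sklearn.svm import SVC

# Step 1: learn the feasibility boundary from labeled designs.
rng = np.random.default_rng(0)
X = rng.uniform(low=[1.0, 5.0], high=[10.0, 50.0], size=(200, 2))
# Toy labels: admissible if a made-up stress surrogate holds
# (standing in for the von Mises criterion evaluated by simulation).
y = (X[:, 0] * 12.0 - X[:, 1] > 40.0).astype(int)
clf = SVC(kernel="rbf", C=10.0).fit(X, y)

# Step 2: optimize an objective (here weight ~ wall thickness) inside the
# learned feasibility domain; brute-force grid search for clarity.
grid = np.array([(t, s) for t in np.linspace(1.0, 10.0, 50)
                        for s in np.linspace(5.0, 50.0, 50)])
feasible = grid[clf.predict(grid) == 1]
best = feasible[np.argmin(feasible[:, 0])]
print("lightest feasible design (thickness, ring spacing):", best)
```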

Keywords: design parameters, feasibility domain, von Mises stress criterion, Support Vector Machine (SVM) classifier

Procedia PDF Downloads 327
7300 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model

Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula

Abstract:

In the verification of high-volume and complex packet-processing IPs, finer control of flow management aspects (for example, rate, bits/sec, etc.) per flow class (a virtual channel or a software thread) is needed. When software or Universal Verification Methodology (UVM) thread arbitration is left to the simulator (e.g., the Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), the bandwidth distribution resulting from simulator thread arbitration is hard to predict. In many cases, the patterns desired in a test scenario may not be achieved, as the simulator might produce a different distribution from the required one. This can lead to missing important traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with its UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs visible to the UVM sequence/test to configure the required rate distribution per thread/virtual channel/flow. This works seamlessly and produces rate stimuli that further harness the Design Under Test (DUT) with asymmetric inputs relative to the bandwidth/Quality of Service (QoS) distributions programmed in the DUT.
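
A minimal sketch of the core idea, in Python rather than SystemVerilog/UVM for brevity: instead of leaving arbitration to the simulator, the next flow is drawn explicitly according to programmed bandwidth weights, so the long-run share of each virtual channel converges to its knob setting (knob names are illustrative, not the VIP's):

```python
import random

def flow_arbiter(weights, n_txns, seed=1):
    """Yield flow ids so each flow's long-run share matches its weight."""
    rng = random.Random(seed)
    flows, w = zip(*weights.items())
    for _ in range(n_txns):
        yield rng.choices(flows, weights=w, k=1)[0]

# Knobs: 50/30/20 bandwidth split across three virtual channels.
counts = {}
for flow in flow_arbiter({"vc0": 0.5, "vc1": 0.3, "vc2": 0.2}, 10_000):
    counts[flow] = counts.get(flow, 0) + 1
print(counts)   # roughly {'vc0': 5000, 'vc1': 3000, 'vc2': 2000}
```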

Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service

Procedia PDF Downloads 99
7299 Level Set Based Extraction and Update of Lake Contours Using Multi-Temporal Satellite Images

Authors: Yindi Zhao, Yun Zhang, Silu Xia, Lixin Wu

Abstract:

The contours and areas of water surfaces, especially lakes, often change due to natural disasters and construction activities, and extracting and updating water contours from satellite images with image processing algorithms is an effective way to track these changes. However, producing water surface contours that are close to the true boundaries is still a challenging task. This paper compares the performance of three level set models for extracting lake contours: the Chan-Vese (CV) model, the signed pressure force (SPF) model, and the region-scalable fitting (RSF) energy model. Experiments indicate that the RSF model, in which a region-scalable fitting energy functional is defined and incorporated into a variational level set formulation, is superior to CV and SPF and yields the desired contour lines when there are "holes" in the water regions, such as islands in a lake. The RSF model is therefore applied to extracting lake contours from Landsat satellite images. Four temporal Landsat images, from 2000, 2005, 2010, and 2014, are used in our study; all were acquired in May, with the same path/row (121/036), covering Xuzhou City, Jiangsu Province, China. First, the near-infrared (NIR) band is selected for water extraction. Image registration is conducted on the NIR bands of the different temporal images for information updating, and linear stretching is applied to distinguish water from other land cover types. For the first temporal image, acquired in 2000, lake contours are extracted via the RSF model initialized with user-defined rectangles. Afterwards, using the lake contours extracted from the previous temporal image as initial values, lake contours are updated for the current temporal image by means of the RSF model, and the changed and unchanged lakes are detected. The results show that great changes have taken place in two lakes, Dalong Lake and Panan Lake, and that RSF can effectively extract and update lake contours using multi-temporal satellite images.
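
For reference, the classical Chan-Vese energy named above has the following standard form, where H is the Heaviside function, phi the level set function, and c1, c2 the mean intensities inside and outside the contour; the RSF model replaces the constant fitting values c1, c2 with spatially varying functions weighted by a Gaussian kernel, which is what allows it to handle "holes" such as islands:

```latex
E(c_1, c_2, \phi) = \mu \int_{\Omega} \lvert \nabla H(\phi) \rvert \, dx
 + \lambda_1 \int_{\Omega} \lvert I - c_1 \rvert^{2} H(\phi)\, dx
 + \lambda_2 \int_{\Omega} \lvert I - c_2 \rvert^{2} \bigl(1 - H(\phi)\bigr)\, dx
```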

Keywords: level set model, multi-temporal image, lake contour extraction, contour update

Procedia PDF Downloads 366
7298 Ochratoxin-A in Traditional Meat Products from Croatian Households

Authors: Jelka Pleadin, Nina Kudumija, Ana Vulic, Manuela Zadravec, Tina Lesic, Mario Skrivanko, Irena Perkovic, Nada Vahcic

Abstract:

Products of animal origin, such as meat and meat products, can contribute to human mycotoxin intake, either through indirect transfer from farm animals exposed to naturally contaminated grains and feed (carry-over effects) or through direct contamination with moulds or naturally contaminated spice mixtures used in meat production. Ochratoxin A (OTA) is the mycotoxin considered of utmost importance from the public health standpoint in connection with meat products. The aim of this study was to investigate the occurrence of OTA in different traditional meat products circulating on Croatian markets during 2018, produced by a large number of households situated in the eastern and northern Croatian regions using a variety of technologies. Concentrations of OTA were determined in traditional meat products (n = 70), including dry fermented sausages (Slavonian kulen, Slavonian sausage, Istrian sausage, and domestic sausage; n = 28), dry-cured meat products (pancetta, pork rack, and ham; n = 22), and cooked sausages (liver sausages, black pudding sausages, and pate; n = 20). OTA was analyzed using a quantitative screening immunoassay (ELISA) and confirmed for positive samples (above the limit of detection) by a liquid chromatography tandem mass spectrometry (LC-MS/MS) method. Whereas no OTA-contaminated bacon samples were found, OTA levels in dry fermented sausages ranged from 0.22 to 2.17 µg/kg and in dry-cured meat products from 0.47 to 5.35 µg/kg, with 9% positive samples in total. Besides possible primary contamination of these products arising from improper manufacturing or storage conditions, the observed OTA contamination could also be a consequence of secondary contamination resulting from contaminated feed given to the animals. OTA levels in cooked sausages ranged from 0.32 to 4.12 µg/kg (5% positives) and can probably be linked to contaminated raw materials (liver, kidney, and spices) used in sausage production. The results show occasional OTA contamination of traditional meat products, indicating that, to avoid such contamination, household production should take place under standardized and well-controlled conditions. Further investigations should be performed to identify mycotoxin-producing moulds on the surface of the products and to define preventive measures that can reduce the contamination of traditional meat products during household production and storage.

Keywords: Croatian households, ochratoxin-A, traditional cooked sausages, traditional dry-cured meat products

Procedia PDF Downloads 193
7297 Dynamic Test for Stability of Bar Loaded by a Compression Force Directed Towards the Pole

Authors: Elia Efraim, Boris Blostotsky

Abstract:

The phenomenon of buckling of structural elements under compression arises in many loading cases and must be considered in many structures and mechanisms. In the present work, the method and results of a dynamic buckling test of a bar loaded by a compression force directed towards the pole are considered. The critical force for such a system has not previously been determined experimentally. The tested object is a bar with a semi-rigid connection to the base at one end and a hinge moving along a circle at the other. The test consists of measuring the natural frequency of the bar at different values of the compression load. The lateral stiffness is calculated from the natural frequency and the reduced mass at the bar's movable end. The critical load is determined by extrapolating the values of lateral stiffness to zero. For the experimental investigation, a special test-bed was created that allows stability testing at positive and negative curvature of the movable end's trajectory, as well as varying the rotational stiffness of the connection at the other end. Reducing friction at the movable end makes it possible to extend the range of applied compression forces. The testing method includes:
- a methodology for planning the experiment that allows determining the required number of tests at various load values in the defined range, and the type of extrapolating function;
- a methodology for the experimental determination of the reduced mass at the bar's movable end, including its own mass;
- a methodology for the experimental determination of the lateral stiffness of the uncompressed bar with a rotational semi-rigid connection at the base.
For planning the experiment and for comparing the experimental results with the theoretical values of the critical load, the analytical dependencies of the bar's lateral stiffness on the compression load were derived for the defined end conditions. In the particular case of a perfectly rigid connection of the bar to the base, the critical load value corresponds to the solution by S. P. Timoshenko. Good agreement between the calculated and experimental values was obtained.

Keywords: buckling, dynamic method, end-fixity factor, force directed towards a pole

Procedia PDF Downloads 351
7296 Medical Image Compression by Region of Interest Based on DT-CWT Using Run-length Coding and Huffman Coding

Authors: Ali Seddiki, Mohamed Djebbouri, Driss Guerchi

Abstract:

Medical imaging produces pictures of the human body in digital form. Since these imaging techniques produce prohibitive amounts of data, compression is necessary for storage and communication purposes. In some areas of medicine, it may be sufficient to maintain high image quality only in the region of interest (ROI). This paper discusses a contribution to quality-preserving compression of the region of interest in scintigraphic images, based on the dual-tree complex wavelet transform (DT-CWT) using run-length encoding (RLE) and Huffman coding (HC).
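
For reference, a minimal run-length encoder of the kind referred to above; in the compression chain, the real input would be the quantized DT-CWT coefficient stream rather than the toy bytes used here:

```python
def rle_encode(data: bytes) -> list:
    """Encode a byte stream as (value, run_length) pairs."""
    out = []
    for b in data:
        if out and out[-1][0] == b:
            out[-1] = (b, out[-1][1] + 1)
        else:
            out.append((b, 1))
    return out

def rle_decode(pairs: list) -> bytes:
    return bytes(b for b, n in pairs for _ in range(n))

coded = rle_encode(b"\x00\x00\x00\x07\x07\x01")
assert rle_decode(coded) == b"\x00\x00\x00\x07\x07\x01"
print(coded)   # [(0, 3), (7, 2), (1, 1)]
```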

Keywords: DT-CWT, region of interest, run length coding, Scintigraphic images

Procedia PDF Downloads 282
7295 Natural Fibers Design Attributes

Authors: Brayan S. Pabón, R. Ricardo Moreno, Edith Gonzalez

Abstract:

Among the wide set of Colombian natural fibers is the banana stem leaf, known as Calceta de Plátano, a material present in several regions of the country; the fiber is extracted from the pseudostem of the banana plant (Musa paradisiaca) as part of regular plantation maintenance. Colombia produced 2.8 million tons in 2007 and 2008, corresponding to 8.2% of international production, a figure that is growing. This material was selected for study because farmers do not use it: it is perceived as waste from the banana harvest and as a pest-propagation agent inside the planting. In addition, the Calceta has no industrial applications in Colombia, since there is not enough concrete knowledge about the properties of the material and its possible applications. Given this situation, industrial design is used as a link between the properties of the material and the need to transform it into industrial products for the market. The project therefore identifies potential design attributes that the banana stem leaf can have for product development. The methodology was divided into two main chapters. Methodology for material recognition:
- data collection, drawing on craftsmen's experience and the literature;
- knowledge in practice, with controlled experiments and validation tests;
- creation of design attributes and a material profile according to the knowledge developed.
Design methodology:
- selection of application fields, exploring the use of the attributes and their relation to product functions;
- evaluation of the possible fields and selection of the optimal application;
- design process, with sketching, ideation, and product development.
Different protocols were elaborated to qualitatively determine some material properties of the Calceta and whether they could be designated as design attributes. Once the validation protocols were defined, performed, and analyzed, 25 design attributes were identified and classified into four attribute categories (environmental, functional, aesthetic, and technical), forming the material profile. Then, 15 application fields were defined based on the relation between product functions and the use of the Calceta attributes. These fields were evaluated to measure how much the functional attributes were being used. After the field evaluation, a final field was selected, influenced by the traditional use of the fiber for packing food. As the final result, two products were designed for this application field. The first is the Multiple Container, designed to contain small or large thin pieces of food, such as potato chips or small sausages; it allows the consumption of food with sauces or dressings. The second is the Chorizo Container, designed specifically for this food due to its long shape and mode of consumption. Natural fiber research generates more solid and complete knowledge about natural fibers. In addition, this research is a way to strengthen identity through the investigation of what is proper and autochthonous, allowing the use of national resources in a sustainable and creative way. Using divergent thinking and design as a tool, this investigation can advance natural fiber handling.

Keywords: banana stem leaf, Calceta de Plátano, design attributes, natural fibers, product design

Procedia PDF Downloads 259
7294 Surgical Hip Dislocation of Femoroacetabular Impingement: Survivorship and Functional Outcomes at 10 Years

Authors: L. Hoade, O. O. Onafowokan, K. Anderson, G. E. Bartlett, E. D. Fern, M. R. Norton, R. G. Middleton

Abstract:

Aims: Femoroacetabular impingement (FAI) was first recognised as a potential driver of hip pain at the turn of the last millennium. While there is an increasing trend towards surgical management of FAI by arthroscopic means, open surgical hip dislocation and debridement (SHD) remains the gold standard of care in terms of reported outcome measures. (1) Long-term functional and survivorship outcomes of SHD as a treatment for FAI are yet to be sufficiently reported in the literature; this study sets out to help address this imbalance. Methods: We undertook a retrospective review of our institutional database for all patients who underwent SHD for FAI between January 2003 and December 2008. A total of 223 patients (241 hips) were identified and underwent a ten-year review with a standardised radiograph and a patient-reported outcome measures questionnaire. The primary outcome measure of interest was survivorship, with progression to total hip arthroplasty (THA) as the endpoint. Negative predictive factors were analysed. Secondary outcome measures of interest were survivorship to further (non-arthroplasty) surgery, functional outcomes as reflected by patient-reported outcome measure (PROM) scores, and whether a learning curve could be identified. Results: The final cohort consisted of 131 females and 110 males, with a mean age of 34 years. There was an overall native hip joint survival rate of 85.4% at ten years. Those who underwent a THA were significantly older at initial surgery and had radiographic evidence of preoperative osteoarthritis and pre- and post-operative acetabular undercoverage. In those who had not progressed to THA, the average Non-Arthritic Hip Score and Oxford Hip Score at ten-year follow-up were 72.3% and 36/48, respectively, and 84% still deemed their surgery worthwhile. A learning curve was found to exist that was predicated on case selection rather than surgical technique. Conclusion: This is only the second study to evaluate the long-term outcomes (beyond ten years) of SHD for FAI and the first outside the originating centre. Our results suggest that, with correct patient selection, this remains an operation with worthwhile outcomes at ten years. How the results of open surgery compare to those of arthroscopy remains to be answered. While these results precede the advent of collision software modelling tools, these data help set a benchmark for future comparison of the effectiveness of other techniques at the ten-year mark.

Keywords: femoroacetabular impingement, hip pain, surgical hip dislocation, hip debridement

Procedia PDF Downloads 84
7293 Evolution of Propiconazole and Tebuconazole Residues through the Post-Harvest Application in 'Angeleno' Plum

Authors: M. J. Rodríguez, F. M. Sánchez, B. Velardo, P. Calvo, M. J. Serradilla, J. Delgado, J. M. López

Abstract:

The main problem in the storage and subsequent transport of fruit is the decay that develops and reduces quality in destination markets. Nowadays, there is increasing interest in the use of compounds to avoid post-harvest decay. Triazole fungicides are agrochemicals widely used in the agricultural industry due to their wide spectrum of action, and in some cases they are used in citrus post-harvest treatment. Their use is not authorized in plum post-harvest; however, with a view to possible future authorization, the evolution of propiconazole and tebuconazole residues after post-harvest application in 'Angeleno' plum is studied.

Keywords: maximum residue limit (MRL), triazole fungicides, decay, Prunus salicina

Procedia PDF Downloads 316
7292 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology

Authors: Yonggu Jang, Jisong Ryu, Woosik Lee

Abstract:

The study addresses the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative depth measurement, data processing time, and human-centered interpretation of ground penetrating radar (GPR) images. To overcome them, a precision underground facility exploration system based on 3D absolute positioning technology was developed, capable of surveying accurately to a depth of 5 m and measuring the 3D absolute location of underground facilities. The software technologies developed include absolute positioning, synchronization of the GPR equipment's ground-surface location coordinates, AI-based interpretation of GPR exploration images, and composite data processing based on integrated underground space maps: a precise 3D DEM is built, the GPR sensor's ground-surface 3D coordinates are synchronized, underground facility information in GPR images is automatically analyzed and detected, and accuracy is improved through comparative analysis of the three-dimensional location information. The hardware comprises a vehicle-type and a cart-type exploration system. Data were collected with the developed system, the GPR exploration images were analyzed using AI technology, and the resulting three-dimensional locations of the detected underground facilities were compared against the integrated underground space map. The proposed system contributes significantly to underground safety management in Korea by establishing precise location information for underground facilities and improving the accuracy and efficiency of exploration.

Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities

Procedia PDF Downloads 62
7291 Method for Auto-Calibrate Projector and Color-Depth Systems for Spatial Augmented Reality Applications

Authors: R. Estrada, A. Henriquez, R. Becerra, C. Laguna

Abstract:

Spatial Augmented Reality is a variation of Augmented Reality in which a Head-Mounted Display is not required. This variation is useful in cases where the need for a Head-Mounted Display is itself a limitation. To achieve this, Spatial Augmented Reality techniques substitute the technological elements of Augmented Reality: the virtual world is projected onto a physical surface. To create an interactive spatial augmented experience, the application must be aware of the spatial relations that exist between its core elements, here referred to as the projection system and the input system; the process of achieving this spatial awareness is called system calibration. The Spatial Augmented Reality system is considered calibrated if the projected virtual-world scale matches the real-world scale, meaning that a virtual object maintains its perceived dimensions when projected onto the real world. The input system is calibrated if the application knows the relative position of a point in the projection plane with respect to the RGB-depth sensor's origin. Any kind of projection technology can be used (light-based projectors, close-range projectors, screens) as long as it complies with the defined constraints; the method was tested on different configurations. The proposed procedure does not rely on a physical marker, minimizing human intervention in the process. The tests were made using a Kinect V2 as the input sensor and several projection devices. In order to test the method, the defined constraints were applied to a variety of physical configurations; once the method was executed, several variables were measured to assess its performance. It was demonstrated that the method can handle different arrangements, giving the user a wide range of setup possibilities.
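
One common building block for this kind of calibration is a homography mapping sensor-plane points to projector-image points, estimated from correspondences; the sketch below uses fabricated correspondences and OpenCV, whereas the paper's full method also exploits depth data and works without a physical marker:

```python
import numpy as np
import cv2

# Fabricated correspondences: four points seen on the sensor plane and their
# desired locations in the projector image (e.g., corners of a projected quad).
sensor_pts = np.array([[100, 120], [520, 110], [540, 400], [90, 410]],
                      dtype=np.float32)
projector_pts = np.array([[0, 0], [1280, 0], [1280, 720], [0, 720]],
                         dtype=np.float32)

H, _ = cv2.findHomography(sensor_pts, projector_pts)

# Map a new sensor-plane point into projector coordinates.
p = np.array([[[300.0, 250.0]]], dtype=np.float32)
print(cv2.perspectiveTransform(p, H))
```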

Keywords: color depth sensor, human computer interface, interactive surface, spatial augmented reality

Procedia PDF Downloads 124
7290 Aligning Informatics Study Programs with Occupational and Qualifications Standards

Authors: Patrizia Poscic, Sanja Candrlic, Danijela Jaksic

Abstract:

The University of Rijeka, Department of Informatics participated in the Stand4Info project, co-financed by the European Union, whose main idea is the alignment of study programs with occupational and qualifications standards in the field of informatics. A brief overview of our research methodology, goals, and deliverables is given. Our main research and project objectives were: a) development of occupational standards, qualification standards, and study programs based on the Croatian Qualifications Framework (CROQF); b) improvement of higher education quality in the field of information and communication sciences; c) increasing the employability of students of information and communication technology (ICT) and science; and d) continuous improvement of teachers' competencies in accordance with the principles of CROQF. CROQF is a reform instrument in the Republic of Croatia for regulating the system of qualifications at all levels through qualification standards based on learning outcomes, following the needs of the labor market, individuals, and society. The central elements of CROQF are learning outcomes: competences acquired by the individual through the learning process and demonstrated afterwards. The place of each acquired qualification is set by the level of the learning outcomes belonging to that qualification. The placement of qualifications at their respective levels allows different qualifications to be compared and linked, and allows Croatian qualification levels to be linked to the levels of the European Qualifications Framework and of the Qualifications Framework of the European Higher Education Area. This research produced three proposals for occupational standards at the undergraduate level (System Analyst, Developer, ICT Operations Manager) and two at the graduate (master) level (System Architect, Business Architect). For each occupational standard, employers provided a list of key tasks and the competencies needed to perform them. A set of competencies required for each particular job in the workplace was defined, and each set was described in more detail through its individual competencies. Based on the sets of competencies from the occupational standards, sets of learning outcomes were defined, and the competencies from the occupational standards were linked with learning outcomes. For each learning outcome, as well as for each set of learning outcomes, it was necessary to specify the verification method and the material and human resources. One task of the project was to suggest revisions and improvements to the existing study programs; it was necessary to analyze the existing programs and determine how they meet the defined learning outcomes. This way, one could see: a) which learning outcomes from the qualification standards are covered by existing courses; b) which learning outcomes have yet to be covered; c) whether they are covered by mandatory or elective courses; and d) whether some courses are unnecessary or redundant. Overall, the main research results are: a) completed proposals of qualification and occupational standards in the field of ICT; b) revised curricula of undergraduate and master study programs in ICT; c) a sustainable partnership and stakeholder association network; d) a knowledge network informing the public and stakeholders (teachers, students, and employers) about the importance of establishing CROQF; and e) teachers educated in innovative teaching methods.

Keywords: study program, qualification standard, occupational standard, higher education, informatics and computer science

Procedia PDF Downloads 143
7289 Development and Verification of the Idom Shielding Optimization Tool

Authors: Omar Bouhassoun, Cristian Garrido, César Hueso

Abstract:

Radiation shielding design is an optimization problem with multiple constrained objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's previous experience; the result is therefore an empirical, not optimal, solution, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In a first stage, the software was set up to adjust the input files of two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Its modular implementation nevertheless easily allows the inclusion of more radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.
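
A hand-rolled sketch of the optimization loop that such a tool automates; the real ISOT drives MCNP/Serpent transport runs and uses multi-objective genetic algorithms, whereas here a toy exponential-attenuation dose model and a weighted-sum fitness keep the example short:

```python
import math
import random

random.seed(42)
MATERIALS = {"steel": (7.85, 0.5), "lead": (11.35, 1.2)}  # (density, attenuation)

def evaluate(layers):
    """Toy objectives: transmitted dose fraction and total areal weight."""
    dose, weight = 1.0, 0.0
    for mat, thickness in layers:
        rho, mu = MATERIALS[mat]
        dose *= math.exp(-mu * thickness)
        weight += rho * thickness
    return dose, weight

def fitness(layers, w_dose=10.0, w_weight=0.05):
    dose, weight = evaluate(layers)
    return w_dose * dose + w_weight * weight   # lower is better

def random_design():
    return [(random.choice(list(MATERIALS)), random.uniform(0.5, 10.0))
            for _ in range(3)]                 # three shielding layers

def mutate(layers):
    child = list(layers)
    i = random.randrange(len(child))
    mat, t = child[i]
    child[i] = (mat, max(0.5, t + random.gauss(0.0, 1.0)))
    return child

pop = [random_design() for _ in range(30)]
for _ in range(100):                           # generations
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
pop.sort(key=fitness)
print("best design:", pop[0], "-> (dose, weight):", evaluate(pop[0]))
```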

Keywords: optimization, shielding, nuclear, genetic algorithm

Procedia PDF Downloads 110
7288 Contribution to the Analytical Study of the Stability of a DC-DC Converter (Boost) Used for MPPT Control

Authors: Mohamed Amarouayache, Badia Amrouche, Gharbi Akila, Boukadoume Mohamed

Abstract:

This work is devoted to the modeling of the DC-DC (boost) converter used in MPPT applications, in order to establish its stability conditions. To this end, we establish a linear mathematical model of the DC-DC converter using an averaged small-signal model; this model allows conventional linear control methods to be applied. A mathematical relationship between the duty cycle and the panel voltage has been set up. With this relationship, we specify the closed-loop stability conditions as a function of the system parameters (the storage elements, capacitance and inductance, and the PWM control).
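
For reference, the ideal continuous-conduction-mode averaged relations of a boost converter, which underlie the duty-cycle-to-panel-voltage relationship mentioned above (a textbook form; the paper's full small-signal model also involves the storage elements):

```latex
\frac{V_o}{V_{pv}} = \frac{1}{1 - D}
\qquad\Longrightarrow\qquad
V_{pv} = (1 - D)\, V_o
```

Adjusting the duty cycle D through the PWM modulator therefore steers the panel voltage V_pv toward the maximum power point for a given output voltage V_o.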

Keywords: MPPT, PWM, stability, Routh criterion, average small signal model

Procedia PDF Downloads 444
7287 Solar Power Generation in a Mining Town: A Case Study for Australia

Authors: Ryan Chalk, G. M. Shafiullah

Abstract:

Climate change is a pertinent issue facing governments and societies around the world. The industrial revolution has resulted in a steady increase in the average global temperature. The mining and energy production industries have been significant contributors to this change, prompting governments to intervene by promoting low-emission technology within these sectors. This paper first reviews the energy problem in Australia and in the mining sector, with a focus on the energy requirements and production methods utilised in Western Australia (WA). Renewable energy in the form of utility-scale solar photovoltaics (PV) provides a solution to these problems by providing emission-free energy that can supplement the existing natural gas turbines in operation at the proposed site. This research presents a custom renewable solution for the mining site, considering the specific township network, local weather conditions, and seasonal load profiles. A summary of the required PV output is presented: it supplies slightly over 50% of the town's power requirements during the peak (summer) period, resulting in close to full coverage in the trough (winter) period. DIgSILENT PowerFactory software has been used to simulate the characteristics of the existing infrastructure and to produce results of integrating PV. Large-scale PV penetration introduces technical challenges in the network, including voltage deviation, increased harmonic distortion, increased available fault current, and reduced power factor. Results also show that cloud cover has a dramatic and unpredictable effect on the output of a PV system. The preliminary analyses conclude that mitigation strategies are needed to overcome voltage deviations, unacceptable levels of harmonics, excessive fault current, and low power factor. Mitigation strategies are proposed to control these issues, predominantly through the use of high-quality, purpose-built inverters. Results show that the use of inverters with harmonic filtering reduces harmonic injections to an acceptable level according to Australian standards. Furthermore, configuring the inverters to supply both active and reactive power helps mitigate the low power factor. The use of FACTS devices (SVC and STATCOM) also reduces the harmonics and improves the power factor of the network, and, finally, energy storage helps to smooth the power supply.
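
As a rough illustration of the harmonic-filtering result, the short Python sketch below computes voltage total harmonic distortion (THD) before and after an assumed inverter-side attenuation. The harmonic magnitudes and filter gains are invented for illustration and are not measurements from the case study.

import math

# Hypothetical per-unit harmonic voltage magnitudes at the point of
# connection (keys are harmonic orders), and assumed filter attenuation.
fundamental = 1.0
harmonics_raw = {5: 0.045, 7: 0.032, 11: 0.018, 13: 0.012}
filter_gain = {5: 0.3, 7: 0.3, 11: 0.5, 13: 0.5}

def thd(harmonics):
    """Total harmonic distortion relative to the fundamental component."""
    return math.sqrt(sum(v * v for v in harmonics.values())) / fundamental

filtered = {h: v * filter_gain[h] for h, v in harmonics_raw.items()}
print(f"THD before filtering: {thd(harmonics_raw):.1%}")
print(f"THD after filtering:  {thd(filtered):.1%}")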

Keywords: climate change, mitigation strategies, photovoltaic (PV), power quality

Procedia PDF Downloads 166
7286 Developing Indicators in the System Mapping Process through Science-Based Visual Tools

Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski

Abstract:

The system mapping process can be defined as a knowledge service in which a team of facilitators, experts, and practitioners guides a conversation, enables the exchange of information, and supports an iterative curation process. System mapping processes rely on science-based tools that introduce and simplify the components and concepts of socio-technical systems through metaphors while facilitating an interactive dialogue that enables the design of co-created maps. System maps then work as "artifacts" that provide information and focus the conversation on specific areas around the defined challenge and the related decision-making process. Knowledge management facilitates the curation of the data gathered during system mapping sessions through documentation practices and subsequent knowledge co-production, for which common practices from data science are applied to identify new patterns, hidden insights, recurrent loops, and unexpected elements. This study presents empirical evidence on the application of these techniques to explore the mechanisms by which visual tools provide guiding principles for portraying system components, key variables, and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements for analyzing layers of information through affinity and clustering analysis and, therefore, the development of simple indicators to support the decision-making process. This paper addresses methodological and empirical elements of the horizontal learning process that integrates system mapping through visual tools, interpretation, cognitive transformation, and analysis. The process is designed to introduce practitioners to simple, iterative, and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.
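
As one way to make the affinity/clustering step concrete, the Python sketch below groups coded map elements by their tag profiles and derives simple per-cluster indicators (cluster size and mean tag intensity). The data, tag scheme, and choice of k-means are assumptions made for illustration, not the study's actual pipeline.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical coding of system-map elements: each row is one element
# tagged during a mapping session, each column a binary attribute
# (e.g. actor, technology, policy, appears-in-feedback-loop, climate-related).
rng = np.random.default_rng(0)
elements = rng.integers(0, 2, size=(60, 5)).astype(float)

# Affinity/clustering step: group elements with similar tag profiles.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(elements)

# Simple indicators per cluster: size and mean tag profile.
for k in range(3):
    members = elements[model.labels_ == k]
    print(f"cluster {k}: n = {len(members)}, "
          f"mean tag profile = {members.mean(axis=0).round(2)}")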

Keywords: indicators, knowledge management, system mapping, visual tools

Procedia PDF Downloads 195
7285 Lockit: A Logic Locking Automation Software

Authors: Nemanja Kajtez, Yue Zhan, Basel Halak

Abstract:

The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. This multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these threats, researchers have proposed logic locking techniques to protect the intellectual property of a design and increase the difficulty of malicious modification of its functionality. However, the adoption of logic locking approaches has been rather slow due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process through software, developed in Python, that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. An analysis of the latest logic locking algorithms demonstrated that the SFLL-HD algorithm is one of the most secure and versatile in trading off levels of protection against different types of attacks, and it was thus selected for implementation. The presented tool can also be extended to incorporate the latest locking mechanisms to keep up with the fast-paced development in this field. The paper also presents a case study to demonstrate the functionality of the tool and how it can be used to explore the design space and compare different locking solutions. The source code of this tool is freely available from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).
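
To make the general workflow concrete, the Python sketch below inserts XOR/XNOR key gates into a toy gate-level netlist. This is plain XOR-based locking shown only for illustration; SFLL-HD, the algorithm actually implemented in the tool, is a considerably stronger and more involved scheme, and the netlist representation here is invented.

import random

# A toy gate-level netlist: (output_net, gate_type, input_nets).
netlist = [
    ("n1", "AND",  ["a", "b"]),
    ("n2", "OR",   ["n1", "c"]),
    ("y",  "NAND", ["n2", "d"]),
]

def lock_netlist(netlist, key_bits, seed=0):
    """Insert one XOR/XNOR key gate after each randomly chosen gate.
    XOR(x, 0) = x and XNOR(x, 1) = x, so applying the correct key leaves
    the circuit functionally unchanged, while a wrong key corrupts it."""
    random.seed(seed)
    locked = list(netlist)
    targets = random.sample(range(len(netlist)), k=len(key_bits))
    # Process highest indices first so insertions do not shift targets.
    for key_idx, gate_idx in enumerate(sorted(targets, reverse=True)):
        out, gtype, ins = locked[gate_idx]
        hidden = f"{out}_pre"            # original gate now drives 'hidden'
        locked[gate_idx] = (hidden, gtype, ins)
        kgate = "XNOR" if key_bits[key_idx] else "XOR"
        locked.insert(gate_idx + 1, (out, kgate, [hidden, f"key{key_idx}"]))
    return locked

for gate in lock_netlist(netlist, key_bits=[1, 0]):
    print(gate)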

Keywords: design automation, hardware security, IP piracy, logic locking

Procedia PDF Downloads 183
7284 A Strategy for Oil Production Placement Zones Based on Maximum Closeness

Authors: Waldir Roque, Gustavo Oliveira, Moises Santos, Tatiana Simoes

Abstract:

Increasing the oil recovery factor of an oil reservoir has been a concern of the oil industry. Usually, the production placement zones are defined after an analysis of geological and petrophysical parameters, with rock porosity, permeability, and oil saturation being of fundamental importance. In this context, the determination of hydraulic flow units (HFUs) is an important step in the process of reservoir characterization, since it may identify specific regions of the reservoir with similar petrophysical and fluid flow properties and, in particular, support techniques for placing production zones that favour the tracing of directional wells. An HFU is defined as a representative volume of the total reservoir rock in which petrophysical and fluid flow properties are internally consistent and predictably distinct from those of other reservoir rocks. Technically, an HFU is characterized as a rock region whose flow zone indicator (FZI) points lie on a straight line of unit slope. The goal of this paper is to provide a trustworthy indication of oil production placement zones for the best-fit HFUs. The FZI cloud of points can be obtained from the reservoir quality index (RQI), a function of effective porosity and permeability. Considering log and core data, the HFUs are identified and, using the discrete rock type (DRT) classification, a set of connected cell clusters can be found; by means of a graph centrality metric, the maximum closeness (MaxC) cell is obtained for each cluster. Considering the MaxC cells as production zones, an extensive analysis based on several oil recovery factor and cumulative oil production simulations was performed for the SPE Model 2 and UNISIM-I-D synthetic fields, where the latter was built from public data available from the actual Namorado Field, Campos Basin, Brazil. The results have shown that the MaxC strategy is technically feasible and very reliable for identifying high-performance production placement zones.
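
For concreteness, the Python sketch below computes the commonly used RQI/FZI/DRT quantities (using the standard literature constants, which are not necessarily the exact values adopted in the paper) and then selects the maximum-closeness cell of a toy cell-connectivity graph with networkx; the grid connectivity is invented for illustration.

import math
import networkx as nx

def fzi(phi, k_md):
    """Flow zone indicator from effective porosity (fraction) and
    permeability (mD), via the standard reservoir quality index."""
    rqi = 0.0314 * math.sqrt(k_md / phi)   # reservoir quality index
    phi_z = phi / (1.0 - phi)              # normalized porosity
    return rqi / phi_z

def drt(phi, k_md):
    """Discrete rock type: a commonly used integer binning of FZI."""
    return round(2.0 * math.log(fzi(phi, k_md)) + 10.6)

print("DRT(phi=0.20, k=150 mD):", drt(0.20, 150.0))

# Toy cluster of grid cells sharing one DRT: nodes are cell ids and
# edges connect face-adjacent cells (hypothetical connectivity).
G = nx.Graph([(0, 1), (1, 2), (2, 3), (1, 4), (4, 5)])

closeness = nx.closeness_centrality(G)
maxc_cell = max(closeness, key=closeness.get)
print(f"MaxC cell: {maxc_cell} (closeness = {closeness[maxc_cell]:.2f})")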

Keywords: hydraulic flow unit, maximum closeness centrality, oil production simulation, production placement zone

Procedia PDF Downloads 329