
Minimum Information about a Cardiac Electrophysiology Experiment (MICEE): standardised reporting for model reproducibility, interoperability, and data sharing.

T A Quinn1, S Granite, M A Allessie, C Antzelevitch, C Bollensdorff, G Bub, R A B Burton, E Cerbai, P S Chen, M Delmar, D Difrancesco, Y E Earm, I R Efimov, M Egger, E Entcheva, M Fink, R Fischmeister, M R Franz, A Garny, W R Giles, T Hannes, S E Harding, P J Hunter, G Iribe, J Jalife, C R Johnson, R S Kass, I Kodama, G Koren, P Lord, V S Markhasin, S Matsuoka, A D McCulloch, G R Mirams, G E Morley, S Nattel, D Noble, S P Olesen, A V Panfilov, N A Trayanova, U Ravens, S Richard, D S Rosenbaum, Y Rudy, F Sachs, F B Sachse, D A Saint, U Schotten, O Solovyova, P Taggart, L Tung, A Varró, P G Volders, K Wang, J N Weiss, E Wettwer, E White, R Wilders, R L Winslow, P Kohl.   

Abstract

Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step towards establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists which facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort, and, if deemed productive, implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work.
Copyright © 2011 Elsevier Ltd. All rights reserved.


Year: 2011        PMID: 21745496        PMCID: PMC3190048        DOI: 10.1016/j.pbiomolbio.2011.07.001

Source DB: PubMed        Journal: Prog Biophys Mol Biol        ISSN: 0079-6107        Impact factor: 3.667


INTRODUCTION

Here, we present a draft Minimum Information Standard for recording, annotating, and reporting experimental cardiac electrophysiology data, which we are calling the Minimum Information about a Cardiac Electrophysiology Experiment (MICEE) standard. The concept is that for relevant studies, this information will be made available in an online repository and referenced in any related publications. Our hope is that this reporting standard will develop into a tool used by the experimental cardiac electrophysiology community to facilitate and improve recording and dissemination of the minimum information necessary for reproduction of cardiac electrophysiology experimental research, via contextualisation to allow for easier comparison and usage of findings by others, and to enhance the integration of results into other experimental, computational, and conceptual models. Throughout the scientific community, there is growing recognition that open-access data-sharing promotes research transparency, assessment and validation of experimental data, and design of new experiments, furthering discovery from past work and the development of broader computational and/or conceptual models that are based firmly on experimental insight (Smith and Noble, 2008). This is reflected by the current requirements of some funding agencies and journals for data sharing, as well as the concerted efforts of various institutions in its promotion and implementation (Cragin et al., 2010; Nelson, 2009). While there are examples of very useful data sharing resources, such as the database of Genotypes and Phenotypes (dbGAP; http://www.ncbi.nlm.nih.gov/gap/) for storing genome-wide association study data, or the Gene Expression Omnibus (GEO; http://www.ncbi.nlm.nih.gov/geo/) for mRNA data, many real and perceived barriers need to be overcome before such resources can achieve their full potential. 
These include reluctance to contribute data that have taken years to collect, concerns about data misuse and/or misattribution, worries about intellectual property rights associated with data, and the additional time, effort, and resources required to make data and their contextualisation via meta-data accessible to others (Cragin et al., 2010; Nelson, 2009). An additional fundamental problem is a lack of clear and useful reporting standards and associated infrastructure. Minimum Information Standards and reporting guidelines are now recognised as an important step towards establishing effective data use and re-use, thus optimising data utilisation and enabling experimental reproducibility – something that is already an explicit requirement of the scientific research and communication process. Any useful set of reporting standards is necessarily discipline-specific, describing what raw- and meta-data should be made available, and how this should be formatted for general use, so that necessary and sufficient information is provided to allow reproduction of experimental interventions and study procedures. While this is critical for well-informed evaluation of results and conclusions, the associated overhead should remain minimal, to encourage compliance (Taylor et al., 2007). The identification of a minimally necessary and sufficient set of parameters is a difficult task, confounded by the overwhelming diversity of scientific practices and information in any given field. In recent years, there has been growing interest in identifying formalised reporting requirements for experimental and computational research. 
Current efforts are being brought together under the Minimum Information about a Biomedical or Biological Investigation (MIBBI) umbrella (http://www.mibbi.org/), aimed at uniting the various communities developing Minimum Information Standards for the description of data sets and the workflows by which they were generated (Kettner et al., 2010; Taylor et al., 2008). Currently, however, no set of reporting standards exists for cardiac electrophysiology experimentation, contributing to a lack of consistency in the information reported upon publication. This has resulted from neither negligence nor ill intent. Constraints on time and resources, as well as outlet-specific content and formatting demands, make the task of reporting in a standardised fashion appear burdensome and (possibly) not worth the extra effort. One might regard it as ironic that the current mode may in fact be a larger drain on time and resources for the community overall than the alternative. Reproducing experiments from published methods sections in the literature is, by and large, not possible without in-depth knowledge of all materials, procedures, and interventions (which will be rare in fields with a low proportion of ‘routine’ research activities). This situation has been made worse by the progressive reduction in space allocated to the description of methods in many journals (in some cases this has been partly remedied by online supplemental information, although standardisation of such sections might still aid experimental reproducibility). 
Lack of reporting standards also makes it particularly difficult to enable data utilisation across fields, such as by computational modellers who may be less familiar with determinants of experimental studies that are ‘at the fringes’ of experimental design (while pH or ambient temperature may be obvious parameters to watch out for, osmotic pressure of solutions or the supplier of a transgenic strain may feature less prominently on the list of possible confounding aspects). Furthermore, ‘negative’ results, i.e., the finding that a particular intervention does not give rise to a hypothesised response, are published far too rarely (even though the only thing ‘negative’ about these data is that they do not reach the public domain), such that positive results, even when scarce, may dominate perception. This results in an abundance of inadvertently repeated experiments and a profound publication bias that hampers scientific understanding (Schooler, 2011), although there are current efforts to correct this (such as with the Journal of Negative Results in Biomedicine; http://www.jnrbm.com/). Thus, standardised reporting guidelines may help to ensure availability of the information needed to reproduce a study, or to decide not to attempt it, avoiding wasted time and resources and thereby increasing overall productivity. Additionally, increased emphasis on the integration of insight from different levels of structural complexity (Kohl et al., 2010), and a renewed focus on the translation of information learned through basic science to the clinic, require more stringent control and documentation of experimental conditions and protocols (especially important in the post-genomic era, with the increasingly common use of small animal models to mimic human conditions and to explore treatment possibilities). 
Careful consideration should be paid to seemingly inevitable experimental restrictions, such as those caused by sub-optimal experimental design, systematic experimental error, and parameter variations outside the control of the experimentalist. This will also benefit efforts to conduct quantitative analysis and computational modelling, by facilitating inclusion of important parameters that potentially influence results, such as factors accounting for subject-specific differences (e.g., age and sex). While one cannot predict all of the information that might be necessary for post hoc computational and/or conceptual ‘modelling’ – especially with the rapid evolution of this field – having reported what is currently understood to constitute the most important factors contributing to an experimental outcome will be of significant utility for the identification and validation of novel hypotheses (Greenstein and Winslow, 2011; Rudy, 2000).

PROPOSED DRAFT OF A MINIMUM INFORMATION STANDARD FOR CARDIAC ELECTROPHYSIOLOGY EXPERIMENTATION

The goal of this paper is to present a draft of a Minimum Information Standard for cardiac electrophysiology experimentation. This has been modelled after the Minimum Information about a Neuroscience Investigation (MINI; http://www.carmen.org.uk/standards) standard (Gibson et al., 2009), but tailored for the specific needs of cardiac electrophysiology. It contains a draft of what is believed to be an explicit minimum set of information that is necessary for reproduction of experimental cardiac electrophysiology research and its integration into other experimental or computational models, while hopefully remaining general enough to cover a majority of cases in the field. A significant proportion of this information would normally already appear in the Methods sections of publications. Nonetheless, it has been included here, as having all information in one place will improve efficiency of access. The MICEE standard has been organised into the following five sections, which are believed to encapsulate the most important aspects of the majority of cardiac electrophysiology experiments:

1. Material
2. Environment
3. Protocols
4. Recordings
5. Analysis

Below we describe the rationale for these sections, and the general information essential to each of them, in order to clarify the content of the proposed draft reporting standard, and to aid broader discussion and further development of the proposal. The complete MICEE draft standard can be found in Appendix A. The described reporting standard is ‘a draft sequence’, and very much open to further development in the light of community needs and preferences. We do not specifically discuss each individual element, but hope that all elements follow from the principles discussed above. Finally, to illustrate the utility of the MICEE standard, an example (using a study recently published by some of the authors (Iribe et al., 2009)) is given in Appendix B, which highlights the need for information not contained in ‘the usual’ Methods section.
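To make the five-part organisation concrete, the sketch below shows one way a MICEE-style entry could be held as a simple machine-readable record, together with a completeness check against the five sections. This is purely illustrative: the section names follow the draft standard, but every field name inside them is a hypothetical placeholder, not part of the proposal in Appendix A.

```python
# A minimal, hypothetical sketch of a MICEE-style record as a plain
# dictionary. Section names follow the five-part draft standard; all
# field names within the sections are illustrative placeholders only.

REQUIRED_SECTIONS = ("material", "environment", "protocols", "recordings", "analysis")

def missing_sections(record):
    """Return the required MICEE sections that are absent or empty."""
    return [s for s in REQUIRED_SECTIONS if not record.get(s)]

example_entry = {
    "material": {"type": "isolated cell", "species": "guinea pig", "sex": "male"},
    "environment": {"temperature_C": 37.0, "solution": "Tyrode", "pH": 7.4},
    "protocols": {"description": "current clamp, 1 Hz field stimulation"},
    "recordings": {"sampling_rate_Hz": 10000, "filter": "low-pass, 2 kHz"},
    "analysis": {"software": "custom scripts", "statistics": "paired t-test", "n": 8},
}

print(missing_sections(example_entry))  # -> []
print(missing_sections({"material": example_entry["material"]}))
# -> ['environment', 'protocols', 'recordings', 'analysis']
```

A check of this kind is the sort of low-overhead support that could make at-the-time-of-study collection of meta-data, as advocated below, less burdensome than retrospective reconstruction.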

1. Material

This section gives details of the subject(s) under investigation. Depending on the nature of the study, the type(s) may be human, whole animal, isolated heart, isolated or engineered tissue, isolated, cultured, or stem cells, or cell fragments (e.g., membrane patches), and subheadings are provided for each. Each of these subheadings has its own specific characteristics, relating to features that are increasingly recognized as important to cardiac electrophysiology (e.g., sex, developmental stage, genetic variation, disease background, and husbandry, including diet, environmental enrichment, and light cycle). Additionally, it includes information about sample preparation and maintenance, focusing on aspects such as method of animal dispatch, anatomical origin of the sample, isolation procedure, cell selection process, and growth, culture, and differentiating conditions. This information is essential to the outcome of cardiac electrophysiology studies, as it is arguably one of the most important acute determinants of the quality, viability, and reproducibility of experimental model systems.

2. Environment

Information contained in this section, relating to the environmental conditions in which an experiment is conducted, is also vital to the interpretation and comparison of cardiac electrophysiology results, but is often not well-controlled or monitored (e.g., ‘room temperature’), with specific details underreported in publications (and perhaps increasingly so, which would be a worrying trend). Included factors range from sample temperature (i.e., temperature at the site of experimentation, not, for example, in a fluid reservoir) and solution characteristics, to flow rates, bath volume, and details about the presence of chemicals, dyes, gases, or drugs. This not only makes information available for later study verification, but also highlights the importance of a range of parameters for experimental control, potentially encouraging closer monitoring of relevant conditions, where possible.

3. Protocols

This heading provides a description of the experimental protocols of a study. Including detailed descriptions of experimental procedures is becoming progressively more important, as an increasing number of journals are either reducing the space provided for publishing this information (often due to economic and citation-impact-related pressures), or relegating it to electronic add-on resources. This section is by necessity less specific than the others, requiring only a sufficiently detailed account of procedures and interventions, as cardiac electrophysiology draws on an extremely wide array of experimental techniques and model systems, often with laboratories following their own individually-tailored protocols. Also, this is the area where scientific originality is, perhaps, the most important driver of progress. As such, the prescription of a firm reporting standard for information of this type is neither possible nor desirable.

4. Recordings

This section addresses the specifics of equipment and software used to record and pre-process signals in an experiment, including relevant parameters of operation. The importance of this information may not be as self-evident as that of the other aspects described above, which may result in severe under-reporting in publications. It includes features such as detailed description of timing control, data sampling rates, filtering and smoothing, bit depth, gain, and dynamic range, all of which can greatly affect the nature and information content of data. For example, with patch-clamp recordings, such technical aspects are essential for appropriate application of the technique: errors in factors such as series resistance and voltage-clamp control can lead to errors in the measured basic properties of currents, resulting in misinterpretation of results and misleading conclusions.

5. Analysis

This part of the reporting standard provides information on the software and methods used in data processing to extract information, including details of post hoc filtering, normalisation, interpolation, inclusion/exclusion criteria, n number(s), and statistical methods. Its importance is fairly clear, as outcomes can be significantly altered by data manipulation, but still, detail provided in publications tends to be insufficient for adequate reproduction. An additional feature of this section is the inclusion of example(s) of raw and processed data (from the same recording), which will allow others to assess whether they are able to replicate described approaches (and which is also often omitted from publications).

IMPLEMENTING AND DEVELOPING THE MICEE STANDARD

It is important to repeat that this reporting standard is meant, in its present form, as a place to start. The set of minimum information must develop from experience and input from the greater community, which may include both growth and reduction of currently envisaged categories and parameters. The hope is that, with time, adherence to minimum reporting standards will become second nature, as is the current expectation that the composition of solutions and their pH form part of any methods section in this field. This would help to address some of the challenges associated with data sharing, experimental reproducibility, model interrelation, and correlation of experimental and computational studies in cardiac electrophysiology research. The concept is also that the MICEE repository, discussed below, will allow for dissemination of unpublished (and thus less publicly available) results, such as those described in PhD theses and unreported ‘negative’ findings. This may avoid repetition of experiments and improve scientific understanding, and when pertinent, can be cited in future publications. Progress could be facilitated by a research programme to catalogue past work (similar to what has been done for a single recent study in Appendix B). Such shared access to ‘retrospective’ communications has been developed, with significant success, for computational cardiac electrophysiology models, a field benefiting from the increasing use of a standardised format for communication and modelling (Nickerson and Buist, 2009), called Cell Markup Language (CellML) (Cuellar et al., 2003). The CellML model repository now contains over 250 cardiac electrophysiology cell models (see http://models.cellml.org/electrophysiology/), curated and tested to different levels, making models and associated meta-data (like original publications) easily accessible. 
Once the reporting standard begins to converge, it will be important to incorporate it into the MIBBI framework (see http://www.mibbi.org/index.php/Projects/MICEE) and to work with other communities to explore standardised nomenclatures and combined workflow elements, to avoid duplicated work and incompatible outputs. For instance, the Virtual Physiological Human (VPH) (Fenner et al., 2008; Hunter et al., 2010; Hunter and Viceconti, 2009; Kohl and Noble, 2009) and Physiome (Bassingthwaighte et al., 2009; Bassingthwaighte, 1997; Hunter et al., 2002; Smith et al., 2009) projects are promoting the development of model and data encoding standards for the computational modelling community, along with their associated minimum information requirements. Efforts are also underway to establish uniform data standards for clinical cardiovascular electrophysiology studies and procedures, to serve as a basis for research and practice databases (Buxton et al., 2006; Weintraub et al., 2011). It will be essential to promote compatibility with these activities, especially for use of experimental data in computational model building and validation. Additionally, it could prove helpful if the formal reporting standard – once endorsed more broadly by the community – were adopted by one or more professional societies. Equally crucial will be the question of whether leading journals in the field can be convinced to identify ‘MICEE-compatible data reporting’ as a desirable approach. Most importantly, beyond the desire to increase awareness of the need for Minimum Information Standards in cardiac electrophysiology experimentation, we intend to initiate action. Thus, the authors of this communication are making a commitment to adhere to the proposed reporting standard for a twelve-month period, starting at the beginning of 2012, by recording the then identified MICEE information for all of their relevant studies. 
Upon study completion, this information will be made available in a repository maintained by the Johns Hopkins University CardioVascular Research Grid (accessible at http://www.micee.org/). When relevant, MICEE entries will link-out to the digital object identifiers (DOI) of publications, and be referenced in the related papers with a citable identification. This test of utility will help in assessing and shaping the MICEE approach, and we invite others in the community to join us in this effort. We also request feedback on how the reporting standard might be improved, which will be possible via a public notice board on the MICEE.org website, to facilitate community discussion. Finally, once the standard begins to gain broader acceptance by cardiac electrophysiologists, an oversight committee will be established to manage the process of standard refinement and future extensions of MICEE.

PRESENT DIFFICULTIES AND CHALLENGES AHEAD

Even amongst those who believe Minimum Information Standards are necessary and important, a common argument against their development is that “it is a nearly impossible task”. Other valid criticisms include the concern that their implementation is associated with too much work, or – conversely – that they do not go far enough. However, if one regards the status quo as not ideal, it is hard to argue that useful progress could not be made. It is obvious that emergence of a complete consensus by a research community on any reporting standard is highly unlikely. This applies to the proposed MICEE standard, including amongst the authors of this paper. There is, however, agreement amongst the authors that there is a need to agree on, and define (standardise), the minimum information needs for cardiac electrophysiology experimentation. We realise that a complete description of any experiment is unachievable, but believe that the proposed standard encompasses key features necessary for the effective use of information by other researchers. Besides, ‘exact’ repetition of an experiment with identical conditions, even by the original experimentalist, is in itself improbable (and not usually warranted or desired). Proper documentation of the factors that may be most important to experimental outcomes, however, is an attainable and relevant goal. It is clear that convergence to an agreement on a ‘final’ MICEE standard will need time, but once a standard has been accepted, the question remains as to the best ways of encouraging ‘compliance’. As with most change, a combination of ‘stick and carrot’ tends to be most productive. Wielding the stick, one could imagine an approach where those who have the authority demand compliance. 
Examples would include funding agencies (which can make it a condition of support), scientific societies (which can establish it as a precedent), and journals (which can make it part of publication policies, or simply formalise their methods sections and online supplements to provide information congruent with the proposed standard). By and large, scientists do not respond well to (new) dogmas and demands, as even widely accepted (and exceedingly valuable) precedents, for instance the système international d’unités (SI), have had (and still have) a hard time penetrating certain traditional barriers. Ultimately, the key question is: “what is in it for me?”. If and when a new tool (e.g., a reporting standard) proves to be productive and has clear value, for example saving time, effort, and resources, it turns itself into the ‘carrot’. A useful example of this is the now widely-accepted standardisation approach in the Systems Biology field, the Systems Biology Markup Language (SBML) (Hucka et al., 2003). The trick, then, will be to develop MICEE to a level where it becomes a tool of utility. Therefore, the MICEE standard is a form of self-regulation, shaped by the greater community, such that the final product will be formed by end-users, with the aim of making it a useful time-saving measure, rather than a hindrance. In this context, the goal is also for it to be useful to researchers in creating ‘internal’ meta-data collections for continued work, sharing among collaborators, and eventual publication. This will be additionally important for its effectiveness as a time-saving device, as collection of data at-the-time-of-study will facilitate its later dissemination. For this, a scientist-controlled embargo system will be essential (Cragin et al., 2010), and emulating the functionality of existing ‘staging repository’ tools, such as the Data Staging Repository (DataStar; http://datastar.mannlib.cornell.edu/), may be a constructive approach. 
Attitudes towards reporting standards and their implementation are changing in many other areas of bioscience research, spearheaded by an active and organised minimum information community: the MIBBI portal currently lists 32 Minimum Information Standards (see http://www.mibbi.org/index.php/MIBBI_portal). Common to those reporting standards that have been successful is the availability of technical support, in the form of software for formatting experimental data and recording associated meta-data and repositories for deposition, storage, and retrieval of this information, including software and user-interfaces for efficient database searches and data exportation (with links to publications and cross-links to other experiments and sources of information). In general, there are three necessary elements for reporting standard utilisation: (i) definition of the Minimum Information Standard, (ii) a syntax for expression of data, and (iii) a meta-data standard for semantics (via ontologies to ensure the use of accepted terminology). Our aim, at this point, is to propose and develop (i). In the near future, this will need to be followed by (ii) and (iii), to ensure efficient automated search processes. For this, an XML-based standard for time varying data will be useful, such as is being developed through the BioSignal Markup Language (BioSignalML) (Brooks, 2009). Ultimately, further development will require a commitment from national, regional, and/or private funding agencies, and while resources are always in short supply, cost-benefit considerations suggest that this would be in the best interest of all involved. As always, it is helpful to try to learn from the experience of previous minimum information efforts. The pioneering, and maybe most successful, example of a reporting standard was published 10 years ago, the Minimum Information About a Microarray Experiment (MIAME) standard (Brazma et al., 2001). 
The assertion at the time was that, to make data usable for analysis, everything relevant had to be recorded systematically (Brazma, 2009). Perhaps most important to its success was the fact that a majority of scientific journals made submission of MIAME-compliant data to public repositories mandatory. Also essential was its intuitive interface, where users could place queries to search databases. The relevant databases (for instance dbGAP) curate, analyse, and transform microarray data, making it widely accessible. However, even with the general adoption of MIAME principles, it can be difficult to obtain desired microarray data (Ioannidis et al., 2009), which has been attributed mainly to the fact that the initial lack of standard computer-readable formats for representing information limited its utility (Brazma, 2009). This has been improved by specification of formats by the Functional Genomics Data (FGED) Society (http://www.mged.org/, which was founded in 1999 as the Microarray Gene Expression Data (MGED) Society). Another lesson has been that it is important to allow ‘inheritance’ of database information, and to ease linking with previously published resources (e.g., via PubMed). Protocol description should be facilitated, wherever possible, by use of standard templates, or reuse of existing protocols (with optional modifications). However, care must be taken not to lose information regarding the rationale behind a researcher’s experimental choices, such as study design, conditions, and protocols, as this is critically important for understanding. Such meta-data may not come across in checklists and tables, but rather only through original narrative, so appropriate use of freeform text fields is essential, especially for protocol description. Furthermore, it is conceivable that codification of reporting might promote adoption of preset patterns that could constrain imagination and creativity. 
So, a workable compromise must be sought, as loosely prescribed sections may encourage substitution of jargon, abbreviation, shorthand, and ambiguously terse description for a full explanation. Related to this is the worry that, as a secondary source implemented in an online database, MICEE data will be subject to errors, omissions, and misrepresentations that would not occur with peer-reviewed publication. Peer-reviewed publications are not free of inaccuracies themselves, of course, and the only truly reliable source is the ‘original’ – the investigator who performed the studies. Discrepancies between peer-reviewed and MICEE reporting would be minimised by explicitly linking the publication of papers and database entries. Curation of the MICEE database will remain a critical issue (experience with other repositories, for instance the CellML model repository, has shown that only verified entries tend to be reliable sources), especially for studies without an associated publication, and a mechanism for report checking will need to be developed. These are all areas where it will be useful to adopt technologies already under development or in use by the MIBBI community.
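As a toy illustration of the ‘syntax’ element mentioned above (item (ii) in the list of requirements for reporting standard utilisation), the sketch below serialises a hypothetical MICEE-style record to XML using only the Python standard library. The element names are invented for this sketch; they do not reproduce BioSignalML or any actual MICEE schema.

```python
# Illustrative only: serialise a hypothetical MICEE-style metadata
# record to XML. Element names are invented for this sketch and are
# not a proposed MICEE (or BioSignalML) syntax.
import xml.etree.ElementTree as ET

def to_xml(record):
    """Turn a nested {section: {field: value}} dict into an XML string."""
    root = ET.Element("micee_entry")
    for section, fields in record.items():
        sec = ET.SubElement(root, section)
        for key, value in fields.items():
            ET.SubElement(sec, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_text = to_xml({"environment": {"temperature_C": 37.0, "pH": 7.4}})
print(xml_text)
# <micee_entry><environment><temperature_C>37.0</temperature_C><pH>7.4</pH></environment></micee_entry>
```

A machine-readable serialisation of this kind is what would ultimately allow automated search, cross-linking, and export from a MICEE repository, once elements (ii) and (iii) are agreed upon.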

CONCLUSION

The time is ripe for open-access sharing of published data in the cardiac electrophysiology community. The field would benefit from Minimum Information Standards and reporting guidelines. Successful efforts in other research areas have hinged on general acceptance of, and compliance with, such reporting standards. Cardiac experimental electrophysiology does not currently have a well-defined Minimum Information Standard, and as a step toward establishing this, we propose the Minimum Information about a Cardiac Electrophysiology Experiment (MICEE; see the draft presented in Appendix A, for consideration and development by the greater community). It is hoped that a well-considered user interface will make compliance as pain-free as possible, and that with time this approach will prove an improvement over current practice. As an initial test of its utility, during 2012, the authors of this communication will adhere to the then identified standard, and we invite the reader to join this effort, by evaluating and implementing the Minimum Information about a Cardiac Electrophysiology Experiment standard.
REFERENCES (28 in total)

1.  A vision and strategy for the virtual physiological human in 2010 and beyond.

Authors:  Peter Hunter; Peter V Coveney; Bernard de Bono; Vanessa Diaz; John Fenner; Alejandro F Frangi; Peter Harris; Rod Hose; Peter Kohl; Pat Lawford; Keith McCormack; Miriam Mendes; Stig Omholt; Alfio Quarteroni; John Skår; Jesper Tegner; S Randall Thomas; Ioannis Tollis; Ioannis Tsamardinos; Johannes H G M van Beek; Marco Viceconti
Journal:  Philos Trans A Math Phys Eng Sci       Date:  2010-06-13       Impact factor: 4.226

2.  Systems biology: an approach. (Review)

Authors:  P Kohl; E J Crampin; T A Quinn; D Noble
Journal:  Clin Pharmacol Ther       Date:  2010-06-09       Impact factor: 6.875

3.  ACC/AHA/HRS 2006 key data elements and definitions for electrophysiological studies and procedures: a report of the American College of Cardiology/American Heart Association Task Force on Clinical Data Standards (ACC/AHA/HRS Writing Committee to Develop Data Standards on Electrophysiology).

Authors:  Alfred E Buxton; Hugh Calkins; David J Callans; John P DiMarco; John D Fisher; H Leon Greene; David E Haines; David L Hayes; Paul A Heidenreich; John M Miller; Athena Poppas; Eric N Prystowsky; Mark H Schoenfeld; Peter J Zimetbaum; David C Goff; Frederick L Grover; David J Malenka; Eric D Peterson; Martha J Radford; Rita F Redberg
Journal:  Circulation       Date:  2006-11-27       Impact factor: 29.690

4.  Knowledge capture and the responsibility: past and present.

Authors:  Nic Smith; Denis Noble
Journal:  Prog Biophys Mol Biol       Date:  2007-07-22       Impact factor: 3.667

5.  Data sharing: Empty archives.

Authors:  Bryn Nelson
Journal:  Nature       Date:  2009-09-10       Impact factor: 49.962

6.  ACCF/AHA 2011 key data elements and definitions of a base cardiovascular vocabulary for electronic health records: a report of the American College of Cardiology Foundation/American Heart Association Task Force on Clinical Data Standards.

Authors:  William S Weintraub; Ronald P Karlsberg; James E Tcheng; Jeffrey R Boris; Alfred E Buxton; James T Dove; Gregg C Fonarow; Lee R Goldberg; Paul Heidenreich; Robert C Hendel; Alice K Jacobs; William Lewis; Michael J Mirro; David M Shahian; Robert C Hendel; Biykem Bozkurt; Jeffrey P Jacobs; Pamela N Peterson; Véronique L Roger; Eric E Smith; James E Tcheng; Tracy Wang
Journal:  Circulation       Date:  2011-06-06       Impact factor: 29.690

7.  Unpublished results hide the decline effect.

Authors:  Jonathan Schooler
Journal:  Nature       Date:  2011-02-24       Impact factor: 49.962

Review 8.  From genome to physiome: integrative models of cardiac excitation.

Authors:  Y Rudy
Journal:  Ann Biomed Eng       Date:  2000-08       Impact factor: 3.934

9.  Axial stretch of rat single ventricular cardiomyocytes causes an acute and transient increase in Ca2+ spark rate.

Authors:  Gentaro Iribe; Christopher W Ward; Patrizia Camelliti; Christian Bollensdorff; Fleur Mason; Rebecca A B Burton; Alan Garny; Mary K Morphew; Andreas Hoenger; W Jonathan Lederer; Peter Kohl
Journal:  Circ Res       Date:  2009-02-05       Impact factor: 17.367

10.  Systems biology and the virtual physiological human.

Authors:  Peter Kohl; Denis Noble
Journal:  Mol Syst Biol       Date:  2009-07-28       Impact factor: 11.429

Cited by:  28 in total

Review 1.  Human cardiac systems electrophysiology and arrhythmogenesis: iteration of experiment and computation.

Authors:  Katherine M Holzem; Eli J Madden; Igor R Efimov
Journal:  Europace       Date:  2014-11       Impact factor: 5.214

2.  In silico assessment of drug safety in human heart applied to late sodium current blockers.

Authors:  Beatriz Trenor; Julio Gomis-Tena; Karen Cardona; Lucia Romero; Sridharan Rajamani; Luiz Belardinelli; Wayne R Giles; Javier Saiz
Journal:  Channels (Austin)       Date:  2013 Jul-Aug       Impact factor: 2.581

3.  Minimum information required for a DMET experiment reporting.

Authors:  Judit Kumuthini; Mamana Mbiyavanga; Emile R Chimusa; Jyotishman Pathak; Panu Somervuo; Ron Hn Van Schaik; Vita Dolzan; Clint Mizzi; Kusha Kalideen; Raj S Ramesar; Milan Macek; George P Patrinos; Alessio Squassina
Journal:  Pharmacogenomics       Date:  2016-08-22       Impact factor: 2.533

4.  Data Management in Computational Systems Biology: Exploring Standards, Tools, Databases, and Packaging Best Practices.

Authors:  Natalie J Stanford; Martin Scharm; Paul D Dobson; Martin Golebiewski; Michael Hucka; Varun B Kothamachu; David Nickerson; Stuart Owen; Jürgen Pahle; Ulrike Wittig; Dagmar Waltemath; Carole Goble; Pedro Mendes; Jacky Snoep
Journal:  Methods Mol Biol       Date:  2019

Review 5.  Deranged sodium to sudden death.

Authors:  Colleen E Clancy; Ye Chen-Izu; Donald M Bers; Luiz Belardinelli; Penelope A Boyden; Laszlo Csernoch; Sanda Despa; Bernard Fermini; Livia C Hool; Leighton Izu; Robert S Kass; W Jonathan Lederer; William E Louch; Christoph Maack; Alicia Matiazzi; Zhilin Qu; Sridharan Rajamani; Crystal M Rippinger; Ole M Sejersted; Brian O'Rourke; James N Weiss; András Varró; Antonio Zaza
Journal:  J Physiol       Date:  2015-03-15       Impact factor: 5.182

Review 6.  Computational approaches to understand cardiac electrophysiology and arrhythmias.

Authors:  Byron N Roberts; Pei-Chi Yang; Steven B Behrens; Jonathan D Moreno; Colleen E Clancy
Journal:  Am J Physiol Heart Circ Physiol       Date:  2012-08-10       Impact factor: 4.733

7.  Electrotonic coupling of excitable and nonexcitable cells in the heart revealed by optogenetics.

Authors:  T Alexander Quinn; Patrizia Camelliti; Eva A Rog-Zielinska; Urszula Siedlecka; Tommaso Poggioli; Eileen T O'Toole; Thomas Knöpfel; Peter Kohl
Journal:  Proc Natl Acad Sci U S A       Date:  2016-12-07       Impact factor: 11.205

8.  mRNA expression levels in failing human hearts predict cellular electrophysiological remodeling: a population-based simulation study.

Authors:  John Walmsley; Jose F Rodriguez; Gary R Mirams; Kevin Burrage; Igor R Efimov; Blanca Rodriguez
Journal:  PLoS One       Date:  2013-02-20       Impact factor: 3.240

Review 9.  Application of cardiac electrophysiology simulations to pro-arrhythmic safety testing.

Authors:  Gary R Mirams; Mark R Davies; Yi Cui; Peter Kohl; Denis Noble
Journal:  Br J Pharmacol       Date:  2012-11       Impact factor: 8.739

Review 10.  Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies.

Authors:  T Alexander Quinn; Peter Kohl
Journal:  Cardiovasc Res       Date:  2013-01-17       Impact factor: 10.787


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.