Tracing hook execution

This plugin is useful mainly for plugin authors who want to debug their own plugins.

It prints each hook call to stderr, along with details of the event passed to the hook.

To do this, the plugin overrides nose2.events.Plugin.register() and, after registering itself, replaces all existing nose2.events.Hook instances in session.hooks with instances of a Hook subclass that prints information about each call.
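The idea can be sketched in a few lines. This is a simplified illustration, not the actual nose2 implementation: a stand-in Hook class dispatches an event to each registered plugin, and a noisy subclass prints the hook name and event to stderr before delegating.

```python
import sys


class Hook:
    """Stand-in for nose2.events.Hook: calls the named method on each plugin."""

    def __init__(self, method):
        self.method = method
        self.plugins = []

    def __call__(self, event):
        for plugin in self.plugins:
            getattr(plugin, self.method)(event)


class NoisyHook(Hook):
    """Like Hook, but reports every call on stderr before dispatching."""

    def __call__(self, event):
        print(f"{self.method}: {event!r}", file=sys.stderr)
        return super().__call__(event)


def make_noisy(hooks):
    """Replace each Hook in a name -> Hook mapping with a NoisyHook,
    preserving the registered plugins (what PrintHooks does after register())."""
    for name, hook in hooks.items():
        noisy = NoisyHook(hook.method)
        noisy.plugins = hook.plugins
        hooks[name] = noisy
```

Because the replacement keeps each hook's plugin list, dispatch behaves exactly as before; only the stderr trace is added.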

Enabling this plugin

This plugin is built in, but it is not loaded by default.

Even if you specify always-on = True in configuration, it will not run unless you also enable it. You can do so by including the following in a unittest.cfg or nose2.cfg file:

[unittest]
plugins = nose2.plugins.printhooks

The plugins parameter may contain a list of plugin module names, including nose2.plugins.printhooks.
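For example, loading PrintHooks alongside nose2's built-in debugger plugin would look like this (one module name per line):

```
[unittest]
plugins = nose2.plugins.printhooks
          nose2.plugins.debugger
```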

Configuration [print-hooks]

always-on

Default: False

Type: boolean

Sample configuration

The default configuration is equivalent to including the following in a unittest.cfg file:

[print-hooks]
always-on = False

Command-line options

-P DEFAULT, --print-hooks DEFAULT

Print hook names in order of execution.

Plugin class reference: printhooks

class nose2.plugins.printhooks.PrintHooks(*args, **kwargs)[source]

Print hooks.

register()[source]

Overridden to inject noisy hook instances.

Replaces the Hook instances in self.session.hooks.hooks with noisy objects.

Sample output

PrintHooks output for a test run that discovers one standard TestCase test in a Python module.

Hooks that appear indented were called from within other hooks.

handleArgs: CommandLineArgsEvent(handled=False, args=Namespace(collect_only=None, config=['unittest.cfg', 'nose2.cfg'], debugger=None, fail_fast=None, load_plugins=True, log_level=30, print_hooks=None, profile=None, start_dir='.', testNames=[], top_level_directory=None, user_config=True, verbose=0, with_id=None))

createTests: CreateTestsEvent(loader=<PluggableTestLoader>, testNames=[], module=<module '__main__' from 'bin/nose2'>)

loadTestsFromNames: LoadFromNames(names=[], module=None)

  handleFile: HandleFileEvent(handled=False, loader=<PluggableTestLoader>, name='tests.py', path='nose2/tests/functional/support/scenario/one_test/tests.py', pattern='test*.py', topLevelDirectory='nose2/tests/functional/support/scenario/one_test')

  matchPath: MatchPathEvent(handled=False, name='tests.py', path='nose2/tests/functional/support/scenario/one_test/tests.py', pattern='test*.py')

  loadTestsFromModule: LoadFromModuleEvent(handled=False, loader=<PluggableTestLoader>, module=<module 'tests' from 'nose2/tests/functional/support/scenario/one_test/tests.py'>, extraTests=[])

    loadTestsFromTestCase: LoadFromTestCaseEvent(handled=False, loader=<PluggableTestLoader>, testCase=<class 'tests.Test'>, extraTests=[])

    getTestCaseNames: GetTestCaseNamesEvent(handled=False, loader=<PluggableTestLoader>, testCase=<class 'tests.Test'>, testMethodPrefix=None, extraNames=[], excludedNames=[], isTestMethod=<function isTestMethod at 0x1fccc80>)

  handleFile: HandleFileEvent(handled=False, loader=<PluggableTestLoader>, name='tests.pyc', path='nose2/tests/functional/support/scenario/one_test/tests.pyc', pattern='test*.py', topLevelDirectory='nose2/tests/functional/support/scenario/one_test')

runnerCreated: RunnerCreatedEvent(handled=False, runner=<PluggableTestRunner>)

resultCreated: ResultCreatedEvent(handled=False, result=<PluggableTestResult>)

startTestRun: StartTestRunEvent(handled=False, runner=<PluggableTestRunner>, suite=<unittest2.suite.TestSuite tests=[<unittest2.suite.TestSuite tests=[<unittest2.suite.TestSuite tests=[<tests.Test testMethod=test>]>]>]>, result=<PluggableTestResult>, startTime=1327346684.77457, executeTests=<function <lambda> at 0x1fccf50>)

startTest: StartTestEvent(handled=False, test=<tests.Test testMethod=test>, result=<PluggableTestResult>, startTime=1327346684.774765)

  reportStartTest: ReportTestEvent(handled=False, testEvent=<nose2.events.StartTestEvent object at 0x1fcd650>, stream=<nose2.util._WritelnDecorator object at 0x1f97a10>)

setTestOutcome: TestOutcomeEvent(handled=False, test=<tests.Test testMethod=test>, result=<PluggableTestResult>, outcome='passed', exc_info=None, reason=None, expected=True, shortLabel=None, longLabel=None)

testOutcome: TestOutcomeEvent(handled=False, test=<tests.Test testMethod=test>, result=<PluggableTestResult>, outcome='passed', exc_info=None, reason=None, expected=True, shortLabel=None, longLabel=None)

  reportSuccess: ReportTestEvent(handled=False, testEvent=<nose2.events.TestOutcomeEvent object at 0x1fcd650>, stream=<nose2.util._WritelnDecorator object at 0x1f97a10>)
.
stopTest: StopTestEvent(handled=False, test=<tests.Test testMethod=test>, result=<PluggableTestResult>, stopTime=1327346684.775064)

stopTestRun: StopTestRunEvent(handled=False, runner=<PluggableTestRunner>, result=<PluggableTestResult>, stopTime=1327346684.77513, timeTaken=0.00056004524230957031)

afterTestRun: StopTestRunEvent(handled=False, runner=<PluggableTestRunner>, result=<PluggableTestResult>, stopTime=1327346684.77513, timeTaken=0.00056004524230957031)


  beforeErrorList: ReportSummaryEvent(handled=False, stopTestEvent=<nose2.events.StopTestRunEvent object at 0x1eb0d90>, stream=<nose2.util._WritelnDecorator object at 0x1f97a10>, reportCategories={'failures': [], 'skipped': [], 'errors': [], 'unexpectedSuccesses': [], 'expectedFailures': []})
----------------------------------------------------------------------

  beforeSummaryReport: ReportSummaryEvent(handled=False, stopTestEvent=<nose2.events.StopTestRunEvent object at 0x1eb0d90>, stream=<nose2.util._WritelnDecorator object at 0x1f97a10>, reportCategories={'failures': [], 'skipped': [], 'errors': [], 'unexpectedSuccesses': [], 'expectedFailures': []})
Ran 1 test in 0.001s


  wasSuccessful: ResultSuccessEvent(handled=False, result=<PluggableTestResult>, success=False)
OK

  afterSummaryReport: ReportSummaryEvent(handled=False, stopTestEvent=<nose2.events.StopTestRunEvent object at 0x1eb0d90>, stream=<nose2.util._WritelnDecorator object at 0x1f97a10>, reportCategories={'failures': [], 'skipped': [], 'errors': [], 'unexpectedSuccesses': [], 'expectedFailures': []})