How to find a printer's IP address from a Mac

dns-sd is handy for finding the IP addresses of devices on the same network as your Mac.

Steps

1. List the service types advertised on the local network

$ dns-sd  -B  _services._dns-sd._udp  local.

  Browsing for _services._dns-sd._udp.local.
  DATE: ---Fri 20 May 2016---
  23:54:37.797  ...STARTING...
  Timestamp     A/R Flags  if Domain  Service Type         Instance Name
  23:54:37.798  Add     3   4 .       _tcp.local.          _ssh
  23:54:37.798  Add     3   4 .       _tcp.local.          _sftp-ssh
  23:54:37.798  Add     3   4 .       _tcp.local.          _airplay
  23:54:37.798  Add     3   4 .       _tcp.local.          _raop
  23:54:37.798  Add     3   4 .       _tcp.local.          _googlecast
  23:54:37.798  Add     3   4 .       _tcp.local.          _nomachine
  23:54:37.798  Add     3   4 local.  _apple-mobdev2._tcp. _sub
  23:54:37.798  Add     3   4 .       _tcp.local.          _apple-mobdev2
  23:54:37.798  Add     3   4 .       _tcp.local.          _ipps
  23:54:37.798  Add     3   4 .       _tcp.local.          _ipp
  23:54:37.798  Add     3   4 .       _tcp.local.          _workstation
  23:54:37.798  Add     3   4 .       _tcp.local.          _http
  23:54:37.798  Add     3   4 .       _tcp.local.          _amzn-wplay
  23:54:37.798  Add     3   4 .       _tcp.local.          _rfb
  23:54:37.798  Add     3   4 .       _tcp.local.          _afpovertcp
  23:54:37.798  Add     3   4 .       _tcp.local.          _smb
  23:54:37.798  Add     3   4 .       _udp.local.          _net-assistant
  23:54:37.798  Add     2   4 .       _tcp.local.          _eppc
  ^C

2. Find the device instance name

$ dns-sd  -B  _ipp._tcp  local.

  Browsing for _ipp._tcp.local.
  DATE: ---Fri 20 May 2016---
  23:54:47.782  ...STARTING...
  Timestamp     A/R  Flags  if Domain  Service Type   Instance Name
  23:54:47.783  Add      3   4 local.  _ipp._tcp.     Officejet6500donna @ mbp
  23:54:47.783  Add      3   4 local.  _ipp._tcp.     raw2dir @ mbp
  23:54:47.783  Add      2   4 local.  _ipp._tcp.     Officejet 6600 @ mbp
  ^C

3. Get detailed host information from the device instance name

$ dns-sd  -L  "Officejet 6600 @ mbp"  _ipp._tcp  local.

  Lookup Officejet 6600 @ mbp._ipp._tcp.local.
  DATE: ---Fri 20 May 2016---
  23:54:55.252  ...STARTING...
  23:54:55.253  Officejet\0326600\032@\032mbp._ipp._tcp.local. can be reached at mbp2-2.local.:631 (interface 4)
   txtvers=1 qtotal=1 rp=printers/Officejet_6600 ty=Unknown \
    adminurl=https://mbp2-2.local.:631/printers/Officejet_6600 \
     note=Büro\ im\ Keller priority=0 product=\(Officejet\ 6600\ e-All-in-One\) \
      pdl=application/octet-stream,application/pdf,application/postscript,image/jpeg,image/png,image/pwg-raster \
       UUID=e7d11337-a440-3f2d-7168-b53de4325791 TLS=1.2 Color=T Scan=T \
        printer-state=3 printer-type=0x480900E
  ^C
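The TXT record returned in step 3 is a list of key=value pairs. As a minimal sketch (plain string handling, not a DNS library; note that it does not handle the backslash-escaped spaces seen in the note= field), the pairs can be parsed into a dict:

```python
def parse_txt_record(txt):
    """Split a dns-sd TXT record string into a dict of key=value pairs."""
    result = {}
    for pair in txt.split():
        key, sep, value = pair.partition('=')
        # keys without '=' become flags with no value
        result[key] = value if sep else None
    return result

txt = ('txtvers=1 qtotal=1 rp=printers/Officejet_6600 '
       'adminurl=https://mbp2-2.local.:631/printers/Officejet_6600 '
       'TLS=1.2 Color=T printer-state=3')
record = parse_txt_record(txt)
print(record['adminurl'])  # the printer's CUPS admin page
print(record['TLS'])       # 1.2
```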

4. Get the IP address (both IPv4 and IPv6 work)

$ dns-sd  -G  v4v6  mbp2-2.local

  DATE: ---Sat 21 May 2016---
  0:12:41.025  ...STARTING...
  Timestamp     A/R Flags if Hostname    Address                                      TTL
  0:12:41.025  Add  3  4 mbp2-2.local.   FE80:0000:0000:0000:AEBC:32FF:FEAE:CAEB%en0  120
  0:12:41.025  Add  3  4 mbp2-2.local.   FD00:0000:0000:0000:AEBC:32FF:FEAE:CAEB%  120
  0:12:41.025  Add  2  4 mbp2-2.local.   192.168.177.20                               120
  ^C

References

  • apple.stackexchange.com
  • DNS Service Discovery (DNS-SD) man page (netbsd.gw.com)

Using django-nose and coverage with Django tests

When testing with Django, django-nose and coverage come in handy.

$ pip install nose
$ pip install django-nose
$ pip install coverage
  • settings.py
INSTALLED_APPS += ['django_nose', ]
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'
NOSE_ARGS = [
    '--with-coverage',  # collect coverage
    '--cover-html',
    '--cover-package=app.search',
    '--nocapture',
    '--nologcapture',
]

  • tests.py

import unittest
from django.test import Client

class SimpleTest(unittest.TestCase):
    def setUp(self):
        # Every test needs a client.
        self.client = Client()

    def test_details(self):
        # Issue a GET request.
        response = self.client.get('/customer/details/')

        # Check that the response is 200 OK.
        self.assertEqual(response.status_code, 200)

        # Check that the rendered context contains 5 customers.
        self.assertEqual(len(response.context['customers']), 5)

After the configuration above, run the tests:

$ python apps/manage.py test app --settings=product.settings
$ tree cover/
cover/
├── coverage_html.js
├── index.html
├── app_views_py.html
├── jquery.hotkeys.js
├── jquery.isonscreen.js
├── jquery.min.js
├── jquery.tablesorter.min.js
├── keybd_closed.png
├── keybd_open.png
├── status.json
└── style.css

$ cd cover/
$ python -m http.server 9090

References

  • remotestance.com
  • http.server — HTTP servers — Python 3.8.0 documentation

Tips

  • Unlike unittest.TestCase, django.test.TestCase wraps each test in a transaction and rolls back any database changes the test made.
  • Adding the --failfast option stops the test run as soon as the first error or failure occurs.

How to read a file with 100 million rows

Writing the file out

  • manyrowstsv.py
#!/usr/bin/env python
# coding:utf-8

import signal
import sys
import os
import glob
import logging
import logging.handlers
import shutil
import tempfile
import random
import datetime
import string
import click
import itertools


def write_str_into_file(
    iterable,
    output_filename,
):
    # write to a temp file on the same filesystem, then rename into
    # place so readers never see a half-written file
    with tempfile.NamedTemporaryFile(delete=False, dir='/var/tmp',) as f:
        for row in iterable:
            f.write(row)
    shutil.move(f.name, output_filename)
    if os.path.exists(f.name):
        os.remove(f.name)


class SignalException(Exception):
    def __init__(self, message):
        super(SignalException, self).__init__(message)


def do_exit(sig, stack):
    raise SignalException("Exiting")


class TsvRowGenerator(object):

    def __init__(
        self, dt_iso_max, dt_iso_min, date_iso_max, date_iso_min,
            ):
        self.dt_iso_max = datetime.datetime.strptime(
            dt_iso_max, '%Y/%m/%d %H:%M:%S')
        self.dt_iso_min = datetime.datetime.strptime(
            dt_iso_min, '%Y/%m/%d %H:%M:%S')
        self.date_iso_max = datetime.datetime.strptime(
            date_iso_max, '%Y/%m/%d')
        self.date_iso_min = datetime.datetime.strptime(
            date_iso_min, '%Y/%m/%d')
        delta = self.dt_iso_max - self.dt_iso_min
        date_delta = self.date_iso_max - self.date_iso_min
        self.int_delta = (delta.days * 24 * 60 * 60) + delta.seconds
        self.int_date_delta = (date_delta.days * 24 * 60 * 60) + \
            date_delta.seconds

    def iterows(self):
        yield (
            "\t".join(
                ["int", "short", "long", "double", "bool",
                    "char", "utf8", "dt_iso8601", "date_iso8601"]
                ) + "\n")
        while 1:
            rdp = random.randint(0, (1 << 32) - 1)
            random_second = rdp % self.int_delta
            randomtime = self.dt_iso_min + datetime.timedelta(
                seconds=random_second)
            random_date_second = rdp % self.int_date_delta
            randomdatetime = self.date_iso_min + datetime.timedelta(
                seconds=random_date_second)
            yield ("\t".join(
                [
                    str(rdp - (1 << 31)),
                    str((rdp >> 16) - (1 << 15)),
                    str(rdp - (1 << 31)),
                    str(random.uniform(0.1, 2.7)),
                    str(rdp % 2),
                    random.choice(
                        string.ascii_letters) + random.choice(
                        string.ascii_letters) + random.choice(
                        string.ascii_letters) + random.choice(
                        string.ascii_letters),
                    u"ごんた".encode('utf-8'),
                    randomtime.strftime("%Y-%m-%d %H:%M:%S"),
                    randomdatetime.strftime("%Y-%m-%d"),
                ]) + "\n")


@click.command()
@click.argument('rows', type=int, default=1000000)
@click.option(
    '-f', '--filename',
    default="~/kadai_1.tsv",
    )
@click.option('-D', '--dt-iso-max', default="2016/12/31 00:00:00")
@click.option('-d', '--dt-iso-min', default="2016/12/1 00:00:00")
@click.option('-T', '--date-iso-max', default="2016/12/31")
@click.option('-t', '--date-iso-min', default="2016/12/1")
def cmd(rows, filename, dt_iso_max, dt_iso_min,
        date_iso_max, date_iso_min):
    LOG_MANYROWSTSV = 'logging_warning.out'
    my_logger = logging.getLogger('MyLogger')
    my_logger.setLevel(logging.WARNING)
    handler = logging.handlers.RotatingFileHandler(
        LOG_MANYROWSTSV, maxBytes=2000, backupCount=5,)
    my_logger.addHandler(handler)
    s = datetime.datetime.now()
    print s + datetime.timedelta(hours=9)  # start time, shifted +9 hours (JST)
    signal.signal(signal.SIGINT, do_exit)
    signal.signal(signal.SIGHUP, do_exit)
    signal.signal(signal.SIGTERM, do_exit)
    try:
        write_str_into_file(
            iterable=itertools.islice(
                TsvRowGenerator(
                    dt_iso_max, dt_iso_min, date_iso_max, date_iso_min,
                ).iterows(), rows + 1),
            output_filename=os.path.abspath(os.path.expanduser(filename)),)
        print os.path.abspath(os.path.expanduser(filename))
    except SignalException as e1:
        my_logger.warning('%s: %s' % (e1, datetime.datetime.now()))
        logfiles = glob.glob('%s*' % LOG_MANYROWSTSV)
        print logfiles
        sys.exit(1)
    finally:
        e = datetime.datetime.now()
        print str(e-s)


def main():
    cmd()


if __name__ == '__main__':
    main()
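The core pattern in manyrowstsv.py — cap an infinite generator with itertools.islice and write it atomically through a temp file — can be reduced to a small self-contained sketch (Python 3 here; the file name and row contents are illustrative):

```python
import itertools
import os
import shutil
import tempfile

def rows():
    # infinite generator: header first, then numbered data rows
    yield "int\tchar\n"
    for i in itertools.count():
        yield "%d\tx\n" % i

def write_rows(n, output_filename):
    # islice caps the infinite generator at n data rows (+1 header);
    # writing to a temp file first makes the final rename atomic
    with tempfile.NamedTemporaryFile(
            'w', delete=False, dir=os.path.dirname(output_filename)) as f:
        for row in itertools.islice(rows(), n + 1):
            f.write(row)
    shutil.move(f.name, output_filename)

out = os.path.join(tempfile.gettempdir(), 'sample.tsv')
write_rows(3, out)
with open(out) as f:
    lines = f.readlines()
print(len(lines))  # 4 lines: header + 3 rows
os.remove(out)
```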
  • parselargetsv.py
#!/usr/bin/env python
# coding:utf-8

import signal
import sys
import os
import glob
import logging
import logging.handlers
import csv
import datetime
import click
import pickle
import struct
from manyrowstsv import write_str_into_file


class ParseRowsTsv(object):

    def __init__(
        self, file, inputf, outputf
            ):
        self.inputf = os.path.abspath(os.path.expanduser(inputf))
        self.outputf = os.path.abspath(os.path.expanduser(outputf))
        self.file = file

    def write_into_file(self):
        if self.file == 'pickle':
            write_str_into_file(self.pickle_tsv(), self.outputf)
        elif self.file == 'struct':
            write_str_into_file(self.struct_tsv(), self.outputf)

    def read_tsv(self):
        with open(self.inputf, "rb") as f:
            reader = csv.reader(f, delimiter="\t", lineterminator='\n')
            yield reader.next()
            for row in reader:
                row = (
                    int(row[0]),
                    int(row[1]),
                    int(row[2]),
                    float(row[3]),
                    int(row[4]),
                    row[5],
                    row[6],
                    row[7],
                    row[8],
                )
                yield row

    def pickle_tsv(self):
        for record in self.read_tsv():
            yield pickle.dumps(record)

    def struct_tsv(self):
        lines = self.read_tsv()
        line = lines.next()
        inits = struct.Struct(
            's '.join(
                [str(len(line[i])) for i in range(9)]) + 's')
        yield inits.pack(*line)
        for record in lines:
            s = struct.Struct(
                'i h l d ? %ds %ds %ds %ds' % (
                    len(record[5]), len(record[6]),
                    len(record[7]), len(record[8]),
                    )
                )
            yield s.pack(*record)


class SignalException(Exception):
    def __init__(self, message):
        super(SignalException, self).__init__(message)


def do_exit(sig, stack):
    raise SignalException("Exiting")


@click.command()
@click.option(
    '--file', type=click.Choice(['pickle', 'struct']),
    default='pickle')
@click.option('-i', '--inputf', default='~/kadai_1.tsv')
@click.option('-o', '--outputf', default='~/kadai_2.p')
def cmd(file, inputf, outputf):
    s = datetime.datetime.now()
    print s + datetime.timedelta(hours=9)  # start time, shifted +9 hours (JST)
    # signal handlers
    signal.signal(signal.SIGINT, do_exit)
    signal.signal(signal.SIGHUP, do_exit)
    signal.signal(signal.SIGTERM, do_exit)
    # set up the log handler
    LOG_MANYROWSTSV = 'logging_warning.out'
    my_logger = logging.getLogger('MyLogger')
    my_logger.setLevel(logging.WARNING)
    handler = logging.handlers.RotatingFileHandler(
        LOG_MANYROWSTSV, maxBytes=2000, backupCount=5,)
    my_logger.addHandler(handler)

    parser = ParseRowsTsv(file, inputf, outputf)

    try:
        parser.write_into_file()

    except SignalException as e1:
        my_logger.warning('%s: %s' % (e1, datetime.datetime.now()))
        logfiles = glob.glob('%s*' % LOG_MANYROWSTSV)
        print logfiles
        sys.exit(1)
    finally:
        e = datetime.datetime.now()
        print str(e-s)


def main():
    cmd()


if __name__ == '__main__':
    main()
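The struct format used in struct_tsv ('i h l d ? %ds %ds %ds %ds') can be checked with a small pack/unpack round trip. A sketch in Python 3, where the "s" fields must be bytes (the field values below are made up to match one generated row):

```python
import struct

record = (
    -2147483648,                    # int
    -32768,                         # short
    123456789,                      # long
    1.5,                            # double
    True,                           # bool
    b'AbCd',                        # 4-char ASCII field
    'ごんた'.encode('utf-8'),        # UTF-8 text, 9 bytes
    b'2016-12-15 12:34:56',
    b'2016-12-15',
)
# size the variable-length "s" fields to the actual byte lengths
fmt = 'i h l d ? %ds %ds %ds %ds' % (
    len(record[5]), len(record[6]), len(record[7]), len(record[8]))
s = struct.Struct(fmt)
packed = s.pack(*record)
unpacked = s.unpack(packed)
print(unpacked[0], unpacked[5])  # -2147483648 b'AbCd'
```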

ファイルを読み込む

  • parsetsv_multitask_p3.py
#!/usr/bin/env python3
# coding:utf-8

import signal
import sys
import os
import glob
import logging
import logging.handlers
import datetime
import click
import pickle
import struct
import tempfile
import shutil
import math
import concurrent.futures
import errno


class SignalException(Exception):
    def __init__(self, message):
        super(SignalException, self).__init__(message)


def do_exit(sig, stack):
    raise SignalException("Exiting")


# Split the file into chunks; yield (index, start offset, end offset)
# for each chunk, always ending on a line boundary.
def tsv_separate_generator(inputf):
    CHUNK_SIZE = 1024 * 1024 * 100
    with open(inputf, 'rb') as f:
        f_size = os.stat(f.fileno()).st_size
        split_count = math.ceil(f_size / CHUNK_SIZE)
        start_offset = len(f.readline())
        for split_idx in range(split_count):
            offset = CHUNK_SIZE * (split_idx + 1) - 1
            f.seek(offset)
            last_line_len = len(f.readline())
            if offset < f_size:
                end_offset = offset + last_line_len
            else:
                end_offset = f_size
            yield (
                split_idx,
                start_offset,
                end_offset,
            )
            if end_offset >= f_size or last_line_len == 0:
                break
            start_offset = end_offset


def sum_file(files):
    # concatenate the given files into one temp file with zero-copy
    # sendfile; sendfile(out_fd, in_fd, offset, count) reads `count`
    # bytes from in_fd starting at `offset`
    with tempfile.NamedTemporaryFile(delete=False, dir='/var/tmp/',) as f:
        for file in files:
            with open(file, 'rb') as f1:
                os.sendfile(f.fileno(), f1.fileno(), 0,
                            os.stat(file).st_size)
        return f.name


class ReadTsvGenerator(object):

    def __init__(self, inputf, iterable):
        self.inputf = inputf
        self.iterable = iterable

    def read_tsv(self):
        with open(self.inputf, "rb") as f:
            start_offset = self.iterable[1]
            end_offset = self.iterable[2]
            f.seek(start_offset)
            start = start_offset
            while start < end_offset:
                row = f.readline()
                start += len(row)
                row = [
                    i.decode(
                        'utf-8'
                    ) for i in row.strip(b'\n').split(b'\t')
                    ]
                row = (
                    int(row[0]),
                    int(row[1]),
                    int(row[2]),
                    float(row[3]),
                    int(row[4]),
                    row[5],
                    row[6],
                    row[7],
                    row[8],
                )
                yield row


class ParseTsvGenerator(object):
    def __init__(self, iterable):
        self.iterable = iterable

    def pickle_tsv(self):
        # the header line is already excluded by the chunk offsets,
        # so every incoming row is a data row
        for record in self.iterable:
            yield pickle.dumps(record)

    def struct_tsv(self):
        for record in self.iterable:
            # struct "s" fields take bytes in Python 3, so re-encode
            # the decoded string columns and size the format to match
            enc = [x.encode('utf-8') for x in record[5:9]]
            s = struct.Struct(
                'i h l d ? %ds %ds %ds %ds' % tuple(
                    len(b) for b in enc))
            yield s.pack(*record[:5], *enc)


class ParseRowsTsv(object):

    def __init__(self, file, inputf, outputf):
        self.file = file
        self.inputf = os.path.abspath(os.path.expanduser(inputf))
        self.outputf = os.path.abspath(os.path.expanduser(outputf))

    # process a single chunk
    def dotask(self, rule):
        parsetsv = ParseTsvGenerator(
            ReadTsvGenerator(self.inputf, rule).read_tsv())
        if self.file == 'pickle':
            w = parsetsv.pickle_tsv()
        elif self.file == 'struct':
            w = parsetsv.struct_tsv()
        with tempfile.NamedTemporaryFile(
            delete=False, dir='/var/tmp', suffix='_dotask', prefix='tmp_',
                ) as f:
            for row in w:
                f.write(row)
            return f.name

    # run the chunks across processes
    def multi_do_task(self):
        with concurrent.futures.ProcessPoolExecutor() as executor:
            future_to_tsv = {
                executor.submit(
                    self.dotask, rule
                ): rule for rule in tsv_separate_generator(self.inputf)}
            with tempfile.TemporaryDirectory(
                    suffix='_tsv', prefix='tmp_', dir='/var/tmp') as temp_dir:
                with tempfile.NamedTemporaryFile(
                        suffix='_tsv', prefix='tmp_',
                        delete=False, dir=temp_dir,) as f:
                    # wait for every chunk, then concatenate the results
                    # in their original order (as_completed yields
                    # futures in completion order, which is arbitrary)
                    done = {
                        future_to_tsv[future][0]: future.result()
                        for future in concurrent.futures.as_completed(
                            future_to_tsv)}
                    for idx in sorted(done):
                        with open(done[idx], 'rb') as separatefile:
                            size = os.stat(
                                separatefile.fileno()).st_size
                            # sendfile(out, in, offset_in_input, count)
                            os.sendfile(
                                f.fileno(), separatefile.fileno(),
                                0, size)
                        try:
                            os.remove(done[idx])
                        except OSError as exc:
                            if exc.errno != errno.ENOENT:
                                raise
                    shutil.move(f.name, self.outputf)


@click.command()
@click.option(
    '--file', type=click.Choice(['pickle', 'struct']),
    default='pickle')
@click.option('-i', '--inputf', default='~/kadai_1.tsv')
@click.option('-o', '--outputf', default='~/zone/kadai_2v3.p')
def cmd(file, inputf, outputf):
    s = datetime.datetime.now()
    print(s + datetime.timedelta(hours=9))  # start time, shifted +9 hours (JST)
    # signal handlers
    signal.signal(signal.SIGINT, do_exit)
    signal.signal(signal.SIGHUP, do_exit)
    signal.signal(signal.SIGTERM, do_exit)
    # set up the log handler
    LOG_MANYROWSTSV = 'logging_warning.out'
    my_logger = logging.getLogger('MyLogger')
    my_logger.setLevel(logging.WARNING)
    handler = logging.handlers.RotatingFileHandler(
        LOG_MANYROWSTSV, maxBytes=2000, backupCount=5,)
    my_logger.addHandler(handler)

    parser = ParseRowsTsv(file, inputf, outputf)

    try:
        parser.multi_do_task()

    except SignalException as e1:
        my_logger.warning('%s: %s' % (e1, datetime.datetime.now()))
        logfiles = glob.glob('%s*' % LOG_MANYROWSTSV)
        print(logfiles)
        sys.exit(1)
    finally:
        e = datetime.datetime.now()
        print(str(e-s))


def main():
    cmd()


if __name__ == '__main__':
    main()
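The splitting logic in tsv_separate_generator (seek just before a chunk boundary, then extend the chunk to the end of the current line) can be exercised on a small file by shrinking the chunk size. A self-contained sketch of the same idea, with illustrative file contents:

```python
import math
import os
import tempfile

def split_offsets(path, chunk_size):
    """Yield (index, start, end) byte ranges that cover the file
    and always end on a newline boundary."""
    with open(path, 'rb') as f:
        f_size = os.stat(f.fileno()).st_size
        split_count = math.ceil(f_size / chunk_size)
        start = len(f.readline())  # skip the header line
        for idx in range(split_count):
            offset = chunk_size * (idx + 1) - 1
            f.seek(offset)
            # extend the chunk to the end of the line it landed in
            tail = len(f.readline())
            end = offset + tail if offset < f_size else f_size
            yield idx, start, end
            if end >= f_size or tail == 0:
                break
            start = end

# a header plus 100 fixed-width rows of 8 bytes each
with tempfile.NamedTemporaryFile('wb', delete=False, suffix='.tsv') as f:
    f.write(b'header\n' + b''.join(b'row%04d\n' % i for i in range(100)))

ranges = list(split_offsets(f.name, chunk_size=64))
with open(f.name, 'rb') as fh:
    data = fh.read()
# every range starts right after a newline, so no row is ever split
ok = all(data[s - 1:s] == b'\n' for _, s, _ in ranges)
print(ok, ranges[0])
os.remove(f.name)
```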

Maximum flow

Problem

If we send as much data as possible from s to t, how much data can be sent in total?

class MaxTraffic(object):
    def __init__(self, N=5):
        self.N = N
        self.edge = [[] for i in range(N)]
        self.used = [0 for i in range(N)]
        self.inf = 10 ** 9

    def append(self, _from, _to, cost):
        self.edge[_from].append(
            {
                'to': _to,
                'cap': cost,
                'rev': len(self.edge[_to]),
            }
        )
        self.edge[_to].append(
            {
                'to': _from,
                'cap': 0,
                'rev': len(self.edge[_from]) - 1,
            }
        )

    def dfs(self, v, t, f):
        """
        増加パスをDFSで探す
        """
        if v == t:
            return f
        self.used[v] = 1
        for i in range(len(self.edge[v])):
            e = self.edge[v][i]
            if not self.used[e['to']] and e['cap'] > 0:
                d = self.dfs(e['to'], t, min(f, e['cap']))
                if d > 0:
                    e['cap'] -= d
                    self.edge[e['to']][e['rev']]['cap'] += d
                    return d
        return 0

    def max_flow(self, s, t):
        flow = 0
        while 1:
            self.used = [0 for i in range(self.N)]
            f = self.dfs(s, t, self.inf)
            if f == 0:
                return flow
            flow += f

    def sample_append(self):
        self.append(0, 1, 10)
        self.append(0, 2, 2)
        self.append(1, 2, 6)
        self.append(1, 3, 6)
        self.append(2, 4, 5)
        self.append(3, 2, 3)
        self.append(3, 4, 8)
In [166]: m = MaxTraffic()

In [167]: m.sample_append()

In [168]: m.max_flow(0, 4)
Out[168]: 11
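The DFS-based code above is Ford–Fulkerson; the same sample graph can be cross-checked with a breadth-first (Edmonds–Karp) variant, sketched independently here:

```python
from collections import deque

def edmonds_karp(n, edges, s, t):
    # capacity matrix (parallel edges accumulate)
    cap = [[0] * n for _ in range(n)]
    for u, v, c in edges:
        cap[u][v] += c
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow
        # find the bottleneck capacity along the path
        v, bottleneck = t, float('inf')
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v])
            v = u
        # push flow and update residual capacities
        v = t
        while v != s:
            u = parent[v]
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
            v = u
        flow += bottleneck

edges = [(0, 1, 10), (0, 2, 2), (1, 2, 6), (1, 3, 6),
         (2, 4, 5), (3, 2, 3), (3, 4, 8)]
print(edmonds_karp(5, edges, 0, 4))  # 11, same as max_flow(0, 4) above
```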

Maximizing a minimum (binary search), and minimizing the number of flip operations

Here are a couple of interesting problems, written up in Python.

Problem 1

There are N stalls, whose positions are given by x, and M cows. How should the cows be placed in the stalls so that the smallest distance between any two cows is maximized?

Solution

For maximize-the-minimum problems, binary search on the answer is the standard tool: pick a candidate distance d, greedily check whether M cows can be placed at least d apart, and narrow the range until it converges.

class IntervalOptimize(object):
    def __init__(self, N, M, x_):
        self.N = N
        self.M = M
        self.x_ = sorted(x_)
        self._max = self.x_[-1] - self.x_[0] + 1

    def feasible(self, d):
        # greedily place M cows at least d apart, starting from the
        # leftmost stall; feasible iff all M fit
        last = 0
        for i in range(self.M - 1):
            crt = last + 1
            while (crt < self.N) and (self.x_[crt] - self.x_[last] < d):
                crt += 1
            if crt >= self.N:
                return False
            last = crt
        return True

    def solve(self):
        # invariant: lb is always feasible, ub is always infeasible
        lb, ub = 0, self._max

        while ub - lb > 1:
            mid = (ub + lb) // 2
            if self.feasible(mid):
                lb = mid
            else:
                ub = mid

        return lb
In [114]: i = IntervalOptimize(5, 3, [1,2,8,4,9])

In [115]: i.solve()
Out[115]: 3

Problem 2

N cows stand in a row, each facing forward or backward (0: forward, 1: backward). In one operation you may reverse K consecutive cows. Find the minimum number of operations M, and the smallest K that achieves it.

Solution

Since the order of the flips does not matter, sweep from the left and flip greedily whenever the current cow still faces backward; a sliding sum of recent flips avoids actually reversing each interval.

class CowReverse(object):
    def __init__(self, N, direction_):
        self.N = N
        self.direction_ = direction_

    def calc(self, k):
        f = [0 for i in range(self.N)]
        res = 0
        _sum = 0

        for i in range(self.N - k + 1):
            if (self.direction_[i] + _sum) % 2 != 0:
                res += 1
                f[i] = 1
            _sum += f[i]
            if i - k + 1 >= 0:
                _sum -= f[i - k + 1]

        for i in range(self.N - k + 1, self.N):
            if (self.direction_[i] + _sum) % 2 != 0:
                return -1
            _sum += f[i]
            if i - k + 1 >= 0:
                _sum -= f[i - k + 1]

        return res

    def solve(self):
        K = 1
        M = self.N
        for k in range(1, self.N + 1):
            m = self.calc(k)
            if m >= 0 and M > m:
                K = k
                M = m

        return '{:d} {:d}'.format(K, M)
In [126]: direction_list = '1101011'

In [134]: c = CowReverse(len(direction_list), [int(i) for i in list(direction_list)])

In [135]: c.solve()
Out[135]: '3 3'